Managing Google crawling: robots.txt and sitemap.xml

What’s the best way to manage the files needed for Google crawling: robots.txt and sitemap.xml?

These are managed from your codebase. It’s common for robots.txt to be a static file and for sitemap.xml to be auto-generated by your framework or by a plugin for your framework.
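For example, here’s a minimal sketch assuming a Next.js codebase (one common framework paired with Plasmic — adapt to whatever framework you’re actually using). The domain and page paths below are placeholders, not real values.

A static robots.txt served from the public directory:

```
# public/robots.txt (hypothetical contents)
User-agent: *
Allow: /
Sitemap: https://example.com/sitemap.xml
```

And an auto-generated sitemap using the Next.js App Router sitemap metadata route:

```typescript
// app/sitemap.ts — sketch only; domain and page list are placeholders.
import type { MetadataRoute } from "next";

const SITE_URL = "https://example.com"; // hypothetical domain

export default function sitemap(): MetadataRoute.Sitemap {
  // In a real project you'd likely enumerate pages from your router,
  // CMS, or Plasmic page metadata instead of hard-coding them.
  const paths = ["/", "/about", "/blog"];

  return paths.map((path) => ({
    url: `${SITE_URL}${path}`,
    lastModified: new Date(),
  }));
}
```

If you’d rather not write this yourself, plugins such as next-sitemap can generate sitemap.xml (and optionally robots.txt) at build time.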

Got it - thank you! I’ll have a look at how to generate these. Confirming this would be outside of Plasmic?

Yes!