Crawler Directives

robots.txt

robots.txt contains crawl directives for search engine and AI crawlers. Ensure you are explicitly allowing AI bots such as GPTBot (OpenAI), Google-Extended (Google's AI training token), and CCBot (Common Crawl) to crawl your important content.

User-agent: *
Allow: /

# Allow AI Bots
User-agent: GPTBot
Allow: /

User-agent: Google-Extended
Allow: /

User-agent: CCBot
Allow: /

Sitemap: https://www.beautybrand.com/sitemap.xml
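
To confirm that these directives behave as intended, you can evaluate them with Python's standard urllib.robotparser. This is a minimal sketch: the domain is the example used above, so substitute your own, and the check only verifies rule matching, not whether each bot actually visits.

from urllib import robotparser

ROBOTS_URL = "https://www.beautybrand.com/robots.txt"
AI_AGENTS = ["GPTBot", "Google-Extended", "CCBot"]

parser = robotparser.RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()  # fetch and parse the live robots.txt

for agent in AI_AGENTS:
    # can_fetch() evaluates the rules for this user agent,
    # falling back to the "*" group if no specific group matches
    allowed = parser.can_fetch(agent, "https://www.beautybrand.com/")
    print(f"{agent}: {'allowed' if allowed else 'blocked'}")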

AI Optimization Tip

Serve robots.txt at the root of your domain (e.g., https://www.beautybrand.com/robots.txt); crawlers only look for it there. Note that GPTBot and CCBot are crawlers that fetch your pages directly, while Google-Extended is a control token honored by Google's existing crawlers that governs whether your content is used for AI training.
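
As a quick accessibility check, assuming the example domain above, you can confirm the file is served from the root with an HTTP 200 response so crawlers can find it:

from urllib import request

# Crawlers only request robots.txt from the domain root.
with request.urlopen("https://www.beautybrand.com/robots.txt") as resp:
    print(resp.status, resp.headers.get("Content-Type"))  # expect: 200 text/plain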