Create a custom robots.txt file to control which bots can crawl your website. Choose from presets or select individual bots from our database of 160+ crawlers.
Start with a preset and customize from there. Click a preset to apply it.
User-agent: *
Allow: /
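As a sketch of customizing a preset, the fragment below keeps the default open policy but adds groups blocking two well-known AI crawlers (GPTBot and CCBot); which bots you block is up to you, and these two are only examples:

```
User-agent: *
Allow: /

User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /
```

Crawlers obey the most specific User-agent group that matches them, so GPTBot and CCBot follow their own Disallow rules while every other bot falls back to the wildcard group.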
A robots.txt file is a plain text file served from your site's root (https://yoursite.com/robots.txt) that tells web crawlers which pages they can and cannot access. It follows the Robots Exclusion Protocol (RFC 9309). While it's voluntary (bots can choose to ignore it), all major search engines and legitimate AI crawlers respect it. Having a properly configured robots.txt is essential for controlling how bots interact with your site.

Save the generated file as robots.txt and upload it to the root directory of your website. It must be accessible at https://yoursite.com/robots.txt. For WordPress, you can use a plugin like Yoast SEO or upload via FTP. For static sites, place it in your public folder. For Cloudflare Pages, add it to your public/ directory.
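Before uploading, you can sanity-check your rules locally with Python's standard-library robots.txt parser. The rules and URLs below are illustrative; paste in your own generated file instead:

```python
from urllib.robotparser import RobotFileParser

# Example rules similar to what this generator produces;
# replace with the contents of your own robots.txt.
rules = """\
User-agent: *
Allow: /

User-agent: GPTBot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot has no group of its own, so it falls under the wildcard
# group and may crawl everything.
print(parser.can_fetch("Googlebot", "https://yoursite.com/blog/"))  # True

# GPTBot matches its own group and is blocked site-wide.
print(parser.can_fetch("GPTBot", "https://yoursite.com/blog/"))  # False
```

The same parser can also fetch a live file via `parser.set_url("https://yoursite.com/robots.txt")` followed by `parser.read()`, which is a quick way to confirm the uploaded file is actually reachable at the root.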