Operated by Google
The specific User-Agent token used to control Google's PageSpeed Insights tool via robots.txt.
Google Page Speed Insights is a free page-performance analysis tool operated by Google. When someone submits a URL, the tool fetches the page on demand and reports loading metrics such as Core Web Vitals, using Lighthouse under the hood. The user-agent is well known and widely trusted. Blocking it does not affect your presence in Google Search, which is crawled separately by Googlebot, but it does prevent anyone, including you, from running PageSpeed reports against your pages. Weigh this trade-off carefully.
<code>User-agent: Google Page Speed Insights</code> — Matching is case-insensitive. Robots.txt is fetched from the root of each subdomain separately.
Google Page Speed Insights is verifiable via reverse-DNS lookup on the crawling IP addresses. You can safely allow it unless you have a specific reason to block it (for example, you do not want third parties running performance reports against your site). Understanding the bot's purpose helps you decide whether to allow or block it.
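As a minimal sketch of that verification step (assuming Google's general crawler-verification pattern of a reverse lookup followed by a confirming forward lookup), the Python snippet below checks a source IP. The accepted parent domains and the sample IP are illustrative assumptions, not values published specifically for this tool.

<code>import socket

def verify_google_host(ip: str) -> bool:
    """Reverse-resolve an IP, check the hostname's parent domain,
    then forward-resolve to confirm it maps back to the same IP."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)  # reverse (PTR) lookup
    except socket.herror:
        return False  # no PTR record: treat as unverified
    # Assumed Google parent domains, per Google's crawler-verification guidance.
    if not hostname.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        # Forward lookup: the claimed hostname must resolve back to the IP.
        forward_ips = {info[4][0] for info in socket.getaddrinfo(hostname, None)}
    except socket.gaierror:
        return False
    return ip in forward_ips

# Sample address for illustration; replace with an IP from your access log.
print(verify_google_host("66.249.66.1"))</code>

The double lookup matters: a PTR record is controlled by whoever owns the IP block, so only the forward confirmation proves the hostname genuinely belongs to the address.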
<code>Google Page Speed Insights</code> is the exact string to use in robots.txt, Nginx, Apache, or Cloudflare firewall rules to target this bot. User-agent matching in robots.txt is case-insensitive, but the string must be spelled correctly. To verify that a request genuinely comes from this bot, perform a reverse-DNS lookup on the source IP as described above; legitimate bots resolve back to their operator's domain. To block the bot entirely, add the following to your /robots.txt file:
<code>User-agent: Google Page Speed Insights
Disallow: /</code>

This instructs Google Page Speed Insights not to crawl any path on your site. The Disallow: / directive covers the entire domain, including subfolders. To block only specific sections, replace / with the path (e.g., <code>Disallow: /blog/</code>). Note: robots.txt is publicly readable; any bot or human can inspect it at yourdomain.com/robots.txt.

To see whether the bot is actually visiting, search your server logs for Google Page Speed Insights (case-insensitive grep: <code>grep -i "Google Page Speed Insights" /var/log/nginx/access.log</code>). You can also check the Crawl Stats report in Google Search Console for Googlebot variants. For Google Page Speed Insights specifically, filter by user-agent in your log analysis tool (GoAccess, AWStats, etc.).

To throttle the bot rather than block it, set a crawl delay (a 10-second pause between requests):

<code>User-agent: Google Page Speed Insights
Crawl-delay: 10</code>

Note that Google states its crawlers ignore the Crawl-delay directive, so do not rely on it for Google-operated bots.
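If you want more detail than a raw grep, a short Python sketch like the one below tallies which paths the bot requests in a combined-format Nginx access log. The log path and token mirror the grep example above; both are assumptions to adjust for your environment.

<code>from collections import Counter

LOG_PATH = "/var/log/nginx/access.log"   # same file as the grep example
TOKEN = "google page speed insights"     # compared case-insensitively

hits = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        if TOKEN in line.lower():
            parts = line.split()
            # In the combined log format the request path is the 7th field.
            hits[parts[6] if len(parts) > 6 else "?"] += 1

for path, count in hits.most_common(10):
    print(f"{count:6d}  {path}")</code>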
Instead of blocking everything with <code>Disallow: /</code>, you can restrict Google Page Speed Insights to specific paths:
<code>User-agent: Google Page Speed Insights
Disallow: /private/
Disallow: /staging/
Allow: /</code>

This allows Google Page Speed Insights everywhere except the listed paths. Path matching in robots.txt uses prefix matching:
<code>Disallow: /private/</code> blocks /private/page.html but NOT /public/private/.
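To sanity-check such rules before deploying them, you can evaluate the same directives locally with Python's standard urllib.robotparser; the sketch below uses the rules and paths from the example above, with no live fetch required.

<code>from urllib import robotparser

# The same rules as the example above, parsed from a local string.
rules = """\
User-agent: Google Page Speed Insights
Disallow: /private/
Disallow: /staging/
Allow: /
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

agent = "Google Page Speed Insights"
for path in ("/", "/blog/post", "/private/page.html", "/public/private/"):
    verdict = "allowed" if rp.can_fetch(agent, path) else "blocked"
    print(f"{path} -> {verdict}")
# /private/page.html is blocked; /public/private/ stays allowed, because
# matching is a prefix test against the start of the path.</code>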