
Google Page Speed Insights

Operated by Google

Quick Facts

User-Agent: Google Page Speed Insights
Category: SEO Tools
Operator: Google
Safety: Safe
Blocking Impact: Varies; evaluate before blocking
SEO Impact Score: 0/10

What is Google Page Speed Insights?

The specific User-Agent token used to control Google's PageSpeed Insights tool via robots.txt.

Google Page Speed Insights is the fetcher behind Google's PageSpeed Insights tool. When someone submits a URL at pagespeed.web.dev (or via the PageSpeed Insights API), this agent fetches the page, runs a Lighthouse audit against it, and reports lab performance metrics alongside Chrome UX Report field data. It is a user-triggered fetcher rather than a scheduled crawler: it only requests pages that a user has asked it to analyse. Blocking it does not remove your domain from Google's search index (indexing is handled by Googlebot, a separate crawler), but it does prevent anyone, including your own team, from running PageSpeed Insights audits against your pages. Weigh this trade-off carefully.

What happens if you block Google Page Speed Insights?

Blocking Google Page Speed Insights has no direct effect on indexing or rankings: it is a user-triggered fetcher, not Googlebot, and the Core Web Vitals ranking signal comes from Chrome UX Report field data rather than from this agent's fetches. The practical cost is that PageSpeed Insights audits of your pages will simply fail. Before blocking, review your server logs for how often it actually visits (a quick sketch follows) and test in a staging environment if possible.
Generally safe to allow; it provides legitimate diagnostic value.
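A quick way to gauge actual visit frequency from an Nginx combined-format access log (the log path is an assumption; adjust for your server):

grep -i "Google Page Speed Insights" /var/log/nginx/access.log | awk '{print substr($4, 2, 11)}' | sort | uniq -c

Each output line is a day (e.g. 12/Mar/2025) with the number of requests from this user-agent on that day.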

How to block Google Page Speed Insights with robots.txt

<code>User-agent: Google Page Speed Insights</code>. Matching is case-insensitive, and robots.txt is fetched from the root of each subdomain separately, so every host needs its own copy of your rules.
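Because each subdomain serves its own robots.txt, it is worth confirming the file actually exists on every host you care about (the hostnames below are placeholders):

curl -s https://www.example.com/robots.txt | head
curl -s https://blog.example.com/robots.txt | head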

Block completely (robots.txt)
User-agent: Google Page Speed Insights
Disallow: /

Allow all (robots.txt)
User-agent: Google Page Speed Insights
Allow: /

Block private only (robots.txt)
User-agent: Google Page Speed Insights
Disallow: /private/
Disallow: /api/
Disallow: /admin/
Allow: /
Nginx server block
# Nginx: Hard-block Google Page Speed Insights
if ($http_user_agent ~* "Google\ Page\ Speed\ Insights") {
    return 403 "Bot blocked";
}
Apache .htaccess
# Apache: Hard-block Google Page Speed Insights
SetEnvIfNoCase User-Agent "Google\ Page\ Speed\ Insights" bad_bot
Order Allow,Deny
Allow from all
Deny from env=bad_bot
# Note: Order/Allow/Deny is Apache 2.2 syntax; on Apache 2.4 without
# mod_access_compat, use <RequireAll> with "Require all granted" and
# "Require not env bad_bot" instead.
Meta robots tag
<meta name="robots" content="noindex, nofollow">
X-Robots-Tag header
X-Robots-Tag: noindex, nofollow
Note: both of these control indexing for all compliant crawlers rather than targeting Google Page Speed Insights specifically, and since this fetcher does not index pages, they are rarely relevant to it.
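To confirm a server-level block is active, send a request with the bot's user-agent and check the status code (example.com is a placeholder):

curl -I -A "Google Page Speed Insights" https://example.com/
# A working Nginx or Apache rule from above returns 403 Forbidden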

Is Google Page Speed Insights safe to allow?

Yes, Google Page Speed Insights is a **safe and legitimate** fetcher. It is operated by Google, which documents its crawlers and fetchers in the Google Search Central documentation, and it follows the Robots Exclusion Protocol (RFC 9309). Requests can be verified via reverse-DNS lookup on the crawling IP addresses. You can safely allow it unless you have a specific reason to block it.
Verify by reverse-DNS lookup: legitimate Google Page Speed Insights requests resolve to a google.com or googlebot.com hostname.
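A forward-confirmed reverse DNS check, using an illustrative address from Google's crawler range:

host 66.249.66.1
# should resolve to something like crawl-66-249-66-1.googlebot.com
host crawl-66-249-66-1.googlebot.com
# forward-confirm: the hostname should resolve back to the same IP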

What does Google Page Speed Insights do?

Google Page Speed Insights fetches a page on demand, runs a Lighthouse audit against it in a controlled environment, and reports lab metrics (Largest Contentful Paint, Total Blocking Time, Cumulative Layout Shift, and others) together with real-user field data from the Chrome UX Report where available. Understanding this purpose helps you decide whether to allow or block it.
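PageSpeed Insights is also exposed as a public HTTP API; calling it triggers the same on-demand fetch described above (example.com is a placeholder, and passing an API key via &key= is recommended for regular use):

curl "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=https://example.com/&strategy=mobile"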

Frequently Asked Questions

What is the official user-agent string for Google Page Speed Insights?
The official user-agent string for Google Page Speed Insights is: Google Page Speed Insights. This is the exact string you must use in robots.txt, Nginx, Apache, or Cloudflare firewall rules to target this bot. User-agent matching in robots.txt is case-insensitive, but the string must be spelled correctly. You can verify that a request genuinely comes from Google Page Speed Insights by performing a reverse-DNS lookup on the source IP — legitimate bots resolve back to their operator's domain.
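For Cloudflare, a sketch of a custom WAF rule with the action set to "Block" (the expression uses Cloudflare's Rules language; the rule editor is typically found under Security → WAF):

(http.user_agent contains "Google Page Speed Insights")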
Is Google Page Speed Insights safe?
Yes, Google Page Speed Insights is a **safe and legitimate** fetcher operated by Google. It is documented in Google Search Central's crawler documentation, follows the Robots Exclusion Protocol (RFC 9309), and its requests are verifiable via reverse-DNS lookup on the crawling IP addresses. You can safely allow it unless you have a specific reason to block it.
Will blocking Google Page Speed Insights hurt my SEO?
No direct harm is expected: Google Page Speed Insights is not Googlebot, so blocking it does not remove pages from Google's index, and the Core Web Vitals ranking signal comes from Chrome UX Report field data rather than from this agent. The trade-off is that PageSpeed Insights audits of your pages will fail. Review your server logs for crawl frequency before deciding.
How do I block Google Page Speed Insights in robots.txt?
Add the following lines to your /robots.txt file:
User-agent: Google Page Speed Insights
Disallow: /
This instructs Google Page Speed Insights not to crawl any path on your site. The Disallow: / directive covers the entire domain including subfolders. To only block specific sections, replace / with the path (e.g., Disallow: /blog/). Note: robots.txt is publicly readable — any bot or human can inspect it at yourdomain.com/robots.txt.
Does Google Page Speed Insights respect robots.txt?
Yes — Google Page Speed Insights is a well-behaved bot operated by Google. It fetches and parses /robots.txt before crawling any page, following RFC 9309.
How do I verify if Google Page Speed Insights is crawling my site?
Search your web server access logs for the string Google Page Speed Insights (case-insensitive grep: grep -i "Google Page Speed Insights" /var/log/nginx/access.log). You can also check Google Search Console under Settings → Crawl stats for Google's crawler activity generally. For Google Page Speed Insights specifically, filter by user-agent in your log analysis tool (GoAccess, AWStats, etc.).
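To list the unique client IPs presenting this user-agent, ready for the reverse-DNS verification described above (the log path is an assumption):

grep -i "Google Page Speed Insights" /var/log/nginx/access.log | awk '{print $1}' | sort -u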
What is the crawl frequency of Google Page Speed Insights?
Google Page Speed Insights fetches pages on demand, when someone requests an analysis, so its traffic is normally light and sporadic rather than a steady crawl. Note that Google's crawlers do not support the Crawl-delay directive, so adding one to robots.txt has no effect on them. If you do see excessive traffic from this user-agent, throttle it at the server level instead, as in the sketch below.
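A minimal Nginx rate-limiting sketch using the standard limit_req module (zone name, rate, and burst values are illustrative):

# http context; requests whose key is empty are not rate-limited
map $http_user_agent $psi_key {
    "~*Google Page Speed Insights" $binary_remote_addr;
    default                        "";
}
limit_req_zone $psi_key zone=psi_zone:10m rate=6r/m;

# inside the relevant server or location block:
limit_req zone=psi_zone burst=5 nodelay;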
Can I block Google Page Speed Insights from specific pages only?
Yes. Instead of a global Disallow: / you can restrict Google Page Speed Insights to specific paths:
User-agent: Google Page Speed Insights
Disallow: /private/
Disallow: /staging/
Allow: /
This allows Google Page Speed Insights everywhere except the listed paths. Path matching in robots.txt uses prefix matching — Disallow: /private/ blocks /private/page.html but NOT /public/private/.
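Beyond plain prefix matching, Google's robots.txt parser (and RFC 9309) also supports the * wildcard and the $ end-of-URL anchor. For example, to block only PDF files:

User-agent: Google Page Speed Insights
Disallow: /*.pdf$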
Why would I want to block Google Page Speed Insights?
There are two main reasons to block Google Page Speed Insights: 1. **Keeping non-public environments private**: anyone can point the public PageSpeed Insights tool at any reachable URL, so blocking this agent stops third parties from auditing staging or internal pages. 2. **Server load**: each analysis triggers a full page load (HTML plus all subresources) against your server, and automated or repeated tests can add unwanted load on large sites. The trade-off: blocking it also prevents you and your own team from running PageSpeed Insights audits on the site, so most sites leave it allowed.
