Google Crawl-Limit
Googlebot stops parsing HTML after 2MB. Check if your page exceeds this limit and identify what's consuming the most space.
About This Tool
Googlebot has a technical limit when crawling web pages: it stops parsing HTML after 2MB (2,097,152 bytes). This means if your page's HTML exceeds this limit, parts of your content won't be indexed by Google, potentially affecting your SEO performance.
How It Works
- Fetches your page's HTML as Googlebot sees it (before JavaScript execution)
- Calculates the exact byte size using UTF-8 encoding
- Analyzes what's consuming space: inline scripts, styles, base64 images, SVG graphics, etc.
- Identifies the 10 largest HTML elements with specific CSS selectors
- Generates actionable recommendations ordered by priority
- Shows external resources (scripts, CSS, images) that DON'T count toward the limit
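The core byte-size check described above can be sketched in a few lines. This is a minimal illustration, not the tool's actual implementation: it assumes the HTML has already been fetched, and the function name `html_size_report` is hypothetical.

```python
# Minimal sketch of the 2MB check, assuming the page's HTML is already fetched.
GOOGLEBOT_HTML_LIMIT = 2 * 1024 * 1024  # 2,097,152 bytes

def html_size_report(html: str) -> dict:
    """Measure a page's HTML size in UTF-8 bytes against Googlebot's parse limit."""
    size = len(html.encode("utf-8"))  # byte size, not character count
    return {
        "bytes": size,
        "limit": GOOGLEBOT_HTML_LIMIT,
        "over_limit": size > GOOGLEBOT_HTML_LIMIT,
        "percent_of_limit": round(100 * size / GOOGLEBOT_HTML_LIMIT, 2),
    }

# Note: "ü" is 2 bytes in UTF-8, so byte size can exceed character count.
report = html_size_report("<html><body>ü</body></html>")
```

Measuring encoded bytes rather than characters matters because non-ASCII text (umlauts, emoji, CJK) takes 2–4 bytes per character in UTF-8.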
What Counts Toward the 2MB Limit?
Counts Toward the Limit
- Inline JavaScript (script tags without src)
- Inline CSS (style tags and style attributes)
- Base64-encoded images in HTML/CSS
- Inline SVG graphics
- All text content and HTML structure
- HTML comments and whitespace
Does NOT Count
- External JavaScript files (script src="...")
- External CSS files (link rel="stylesheet")
- External images (img src="...")
- External fonts, videos, and other media
- Content loaded via AJAX/fetch
- Content rendered by JavaScript
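The breakdown by category can be approximated with a few regular expressions. This is a simplified sketch of the idea, not the tool's real analyzer — a production tool would use a proper HTML parser rather than regex, and the function name `inline_breakdown` is an assumption.

```python
import re

def inline_breakdown(html: str) -> dict:
    """Rough sketch: UTF-8 bytes contributed by inline scripts, styles,
    and base64 data URIs — the main things that count toward the limit."""
    patterns = {
        # <script> blocks WITHOUT a src attribute count toward the limit
        "inline_scripts": r"<script(?![^>]*\bsrc=)[^>]*>.*?</script>",
        "inline_styles": r"<style[^>]*>.*?</style>",
        "base64_images": r"data:image/[^;]+;base64,[A-Za-z0-9+/=]+",
    }
    return {
        name: sum(len(m.encode("utf-8")) for m in re.findall(pat, html, re.S | re.I))
        for name, pat in patterns.items()
    }

breakdown = inline_breakdown(
    '<script src="a.js"></script><script>var x=1;</script><style>body{}</style>'
)
```

Note how the external `<script src="a.js">` is excluded by the negative lookahead: only its tag bytes would count toward the limit, not the file it references.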
Pro tip: The best way to stay under the limit is to externalize large inline scripts and styles into separate .js and .css files. This also improves caching and page-load performance!
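The savings from externalizing can be seen directly: once a script moves to its own file, only the replacement tag's bytes remain in the HTML. A small illustration (the file path `/static/app.js` is made up):

```python
# A large inline script vs. the same script referenced externally.
inline = "<script>" + "console.log(1);" * 10_000 + "</script>"
external = '<script src="/static/app.js"></script>'  # hypothetical path

inline_bytes = len(inline.encode("utf-8"))      # whole script body counts
external_bytes = len(external.encode("utf-8"))  # only the tag counts
# The script body now lives in a separately fetched file that does NOT
# count toward the 2MB HTML limit.
savings = inline_bytes - external_bytes
```

Here roughly 150 KB of inline JavaScript collapses to a 38-byte tag in the HTML.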
Frequently Asked Questions
Why does this limit exist?
Google introduced this limit to prevent extremely large HTML files from consuming excessive resources during crawling. Pages larger than 2MB are rare and usually indicate optimization opportunities. Googlebot will still index the first 2MB, but content after that point may not be indexed.
What happens if my page exceeds 2MB?
Googlebot will stop parsing your HTML after 2MB, which means important content, links, or structured data after that point won't be seen or indexed. This can significantly impact your SEO. The tool identifies what's consuming space so you can optimize accordingly.
Do external files count toward the limit?
No! External JavaScript files, CSS stylesheets, and images loaded via URLs do NOT count toward the 2MB HTML limit. Only the raw HTML content itself counts. This is why externalizing inline scripts and styles is an effective optimization strategy.
How accurate is the measurement?
Very accurate! The tool fetches your page exactly as Googlebot sees it and calculates the raw HTML size using UTF-8 encoding (the same method Google uses). It analyzes the HTML before JavaScript execution, matching Google's initial crawl behavior.
How can I reduce my HTML size?
Follow the recommendations provided by the tool: externalize inline JavaScript and CSS, convert base64 images to files, optimize or remove large SVGs, remove HTML comments, and minify your HTML. These optimizations also improve page load performance!
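Two of those recommendations — removing comments and minifying whitespace — can be sketched with a naive cleanup pass. This is only an illustration: real minifiers are far more careful (they preserve whitespace inside `pre`, `textarea`, and script blocks, for example), and the function name is hypothetical.

```python
import re

def strip_comments_and_collapse_whitespace(html: str) -> str:
    """Naive minification sketch: both comments and whitespace
    count toward the 2MB limit, so removing them saves bytes."""
    # Remove HTML comments (re.S lets the comment span multiple lines)
    html = re.sub(r"<!--.*?-->", "", html, flags=re.S)
    # Collapse runs of whitespace between adjacent tags
    html = re.sub(r">\s+<", "><", html)
    return html

minified = strip_comments_and_collapse_whitespace(
    "<p>a</p>  <!-- note -->  <p>b</p>"
)
```

On comment-heavy, deeply indented markup this kind of cleanup alone can recover a meaningful fraction of the budget.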
Can I export the results?
Yes! After analyzing your page, you can export all results to CSV format including HTML size, breakdown by category, top space consumers, and recommendations. This is useful for tracking progress over time or sharing with your development team.
Why is reCAPTCHA required?
reCAPTCHA helps prevent automated abuse and ensures the tool is used by real users. Fetching and analyzing web pages requires server resources, so this protection maintains service quality for everyone.