Google offers a lot of different tools, and while they handle massive amounts of data, even Google has its limits. Here are some of the limits you may eventually run into.
Per Google’s Search Console Help documentation, “You can add up to 1,000 properties (websites or mobile apps) to your Search Console account.”
Many of the data reports within Google Search Console are limited to 1,000 rows in the interface, though you can usually download more. Not every report follows this pattern, however; the HTML Improvements section, for example, doesn't seem to have the 1,000-row limit at all.
The limit on the number you can submit is higher, but you will only be shown 200. Each of those can be a sitemap index file, which appears to have a display limit of 400 sitemaps each. You could technically put each page of a website in its own sitemap file, bundle those into sitemap index files, and see the individual indexation of 80,000 pages (200 × 400) in each property… not that I recommend this.
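The 80,000 figure above is just multiplication of the display limits described. A quick sketch of the arithmetic, assuming those observed limits (200 index files shown, 400 sitemaps displayed per index, one page per sitemap):

```python
# Back-of-the-envelope math for the display limits described above.
# These values are the observed limits from the article, not documented API limits.
INDEX_FILES_SHOWN = 200    # sitemap index files displayed per property
SITEMAPS_PER_INDEX = 400   # sitemaps displayed inside each index file
PAGES_PER_SITEMAP = 1      # one page per sitemap, per the (not recommended) trick

visible_pages = INDEX_FILES_SHOWN * SITEMAPS_PER_INDEX * PAGES_PER_SITEMAP
print(visible_pages)  # 80000
```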
According to Search Engine Roundtable, this is one of the errors that you can receive when submitting a disavow file.
Google Webmaster Trends Analyst John Mueller had mentioned that there was a cutoff for the “Fetch as Google” feature, and it looks like that cutoff is 10,000 pixels, based on testing.
Once you’ve reached this limit, you’ll either be sampled or have to upgrade.
As stated on Google’s Robots.txt Specifications page, “A maximum file size may be enforced per crawler. Content which is after the maximum file size may be ignored. Google currently enforces a size limit of 500 kilobytes (KB).”
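Since anything past the cutoff may simply be ignored, it's worth checking your robots.txt size before Google silently truncates it. A minimal sketch, assuming the 500 KB limit is measured in kibibytes (the exact byte interpretation is mine, not from the quoted documentation):

```python
# Minimal sketch: flag a robots.txt that exceeds Google's stated 500 KB limit.
# Assumption: "500 kilobytes" here is treated as 500 * 1024 bytes.
ROBOTS_TXT_LIMIT_BYTES = 500 * 1024

def robots_txt_oversized(content: bytes) -> bool:
    """Return True if content exceeds the limit, meaning trailing rules
    may be ignored by Google's crawler."""
    return len(content) > ROBOTS_TXT_LIMIT_BYTES

print(robots_txt_oversized(b"User-agent: *\nDisallow:\n"))  # False
```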
All formats limit a single sitemap to 50MB (uncompressed) and 50,000 URLs. If you have a larger file or more URLs, you will have to break it into multiple sitemaps. You can optionally create a sitemap index file (a file that points to a list of sitemaps) and submit that single index file to Google. You can submit multiple sitemaps and/or sitemap index files to Google.
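Splitting a large URL list to respect the 50,000-URL cap is straightforward. A sketch (the 50 MB uncompressed size limit would still need a separate check when each chunk is serialized to XML):

```python
# Sketch: split a list of URLs into sitemap-sized chunks of at most 50,000 URLs.
MAX_URLS_PER_SITEMAP = 50_000

def split_into_sitemaps(urls):
    """Return a list of URL chunks, each small enough for one sitemap file."""
    return [urls[i:i + MAX_URLS_PER_SITEMAP]
            for i in range(0, len(urls), MAX_URLS_PER_SITEMAP)]

chunks = split_into_sitemaps([f"https://example.com/page/{n}" for n in range(120_000)])
print(len(chunks))     # 3
print(len(chunks[0]))  # 50000
```

Each chunk would then become its own sitemap file, with a sitemap index file pointing at all of them.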
While Google doesn’t have a limit, you probably shouldn’t go over Internet Explorer’s limit of 2,083 characters in the URL.
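A trivial length check catches URLs that would break in Internet Explorer:

```python
# Sketch: flag URLs longer than Internet Explorer's 2,083-character limit.
IE_MAX_URL_LENGTH = 2083

def exceeds_ie_limit(url: str) -> bool:
    return len(url) > IE_MAX_URL_LENGTH

print(exceeds_ie_limit("https://example.com/" + "a" * 3000))  # True
print(exceeds_ie_limit("https://example.com/"))               # False
```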
That is according to Google’s John Mueller and represents a significant jump from the 10MB limit in 2015.
While Google doesn’t have a hard limit on the number of links per page, they do recommend keeping it to “a reasonable number,” clarifying that this number is “a few thousand at most.”
Google’s John Mueller has said that Googlebot will follow up to five redirects at a time. I don’t know whether anyone has looked into the total number Google will eventually follow. I did a little digging in Google Search Console and found one page still showing links as “via intermediate links” with a 10-hop chain. Yes, the original still showed in that case, but I also found others that were cut off at six hops even though the chain continued. I would keep redirect chains as short as you can, just in case.
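To illustrate the five-hop cutoff, here is a toy simulation. This is not how Googlebot is implemented; the `redirects` dict is a hypothetical stand-in for real HTTP responses, used only to show what stopping after five hops looks like:

```python
# Illustrative sketch only: follow a redirect chain, stopping at the
# five-hop cutoff described above. `redirects` maps URL -> redirect target
# and stands in for real HTTP 3xx responses.
MAX_HOPS = 5

def follow_redirects(url, redirects, max_hops=MAX_HOPS):
    """Return (final_url, hops_followed); gives up after max_hops."""
    hops = 0
    while url in redirects and hops < max_hops:
        url = redirects[url]
        hops += 1
    return url, hops

chain = {f"/r{i}": f"/r{i + 1}" for i in range(10)}  # a 10-hop chain
final, hops = follow_redirects("/r0", chain)
print(final, hops)  # /r5 5 — the crawler stops partway down the chain
```

With a chain longer than five hops, the crawl stops before reaching the final destination, which is why long chains risk being only partially followed.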
It’s often recommended to keep it to 250 words, but there’s really no limit.
Fun fact: Each word is also limited to 128 characters.
While there’s not really a limit per se, this test is still live, and only the first 16 seem to count.
That’s right, one domain can take the entire page if it’s relevant enough. Just check out the example below:
Per the YouTube Help documentation:
The maximum file size that you can upload is 128 GB or 12 hours, whichever is less. We’ve changed the limits on uploads in the past, so you may see older videos that are longer than 12 hours.
You are limited to 700 keywords in Keyword Ideas. This is also the limit when uploading a file to get search volume and trends, but you can upload 3,000 keywords at a time to the forecaster.
YouTube’s view counter used to be a 32-bit integer, capping the views it could display at a little over 2 billion (2,147,483,647). YouTube now uses a 64-bit integer, which can show roughly 9.22 quintillion views (9,223,372,036,854,775,807).
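Those ceilings fall straight out of the integer widths, since a signed n-bit integer tops out at 2^(n−1) − 1:

```python
# The 32-bit and 64-bit signed-integer ceilings behind the view counter.
signed_32bit_max = 2 ** 31 - 1
signed_64bit_max = 2 ** 63 - 1

print(signed_32bit_max)  # 2147483647
print(signed_64bit_max)  # 9223372036854775807
```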
Anyone else run into other limits they’d like to share? Message me on Twitter and let me know about them!