Googlebot will soon crawl over HTTP/2
While HTTP/2 should result in efficiency improvements, there is no ranking benefit to this change.
Google said there is no ranking benefit to this change or to your site supporting crawling over HTTP/2.
What is HTTP/2? HTTP/2 is a major revision of the HTTP network protocol used by the World Wide Web. It was derived from the earlier experimental SPDY protocol, originally developed by Google. HTTP/2 was developed by the HTTP Working Group of the Internet Engineering Task Force.
Ilya Grigorik from Google wrote: “HTTP/2 will make our applications faster, simpler, and more robust — a rare combination — by allowing us to undo many of the HTTP/1.1 workarounds previously done within our applications and address these concerns within the transport layer itself. Even better, it also opens up a number of entirely new opportunities to optimize our applications and improve performance.”
It is more efficient. HTTP/2 (or h2 for short) is simply more efficient and that is why Google is taking these steps. Google said “we expect this change to make crawling more efficient in terms of server resource usage. With h2, Googlebot is able to open a single TCP connection to the server and efficiently transfer multiple files over it in parallel, instead of requiring multiple connections. The fewer connections open, the fewer resources the server and Googlebot have to spend on crawling.”
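To illustrate what request multiplexing looks like in practice, here is a minimal sketch (not Googlebot's actual code) using Python's httpx library; the example.com URLs are placeholders, and the requests share one TCP connection only when the server negotiates h2.

```python
# minimal sketch: fetch several URLs concurrently over a single HTTP/2 connection
# requires: pip install "httpx[http2]"
import asyncio
import httpx

async def main():
    urls = [f"https://example.com/page/{i}" for i in range(5)]  # placeholder URLs
    async with httpx.AsyncClient(http2=True) as client:
        # when the server negotiates h2, these requests are multiplexed over one
        # TCP connection instead of opening a separate connection per request
        responses = await asyncio.gather(*(client.get(url) for url in urls))
    for response in responses:
        print(response.url, response.http_version, response.status_code)

asyncio.run(main())
```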
Starting in November 2020. Google said this process will begin with “a small number of sites” in November 2020 and then slowly ramp up support for more and more sites. This will only be done initially for “sites that may benefit from the initially supported features, like request multiplexing,” Google said.
What if my site doesn’t support HTTP/2? That is fine, Google said. “If your server still only talks HTTP/1.1, that’s also fine,” Google wrote. Google said there is “no explicit drawback for crawling over this protocol; crawling will remain the same, quality and quantity-wise.”
No ranking benefit. No, no, no. There is no ranking benefit to HTTP/2. Whether Google crawls your site over HTTP/1.1 or HTTP/2, there is no direct ranking benefit, Google said.
What are the benefits? Crawl efficiency, as we mentioned above, is the main benefit. Google said these are the three primary benefits:
- Multiplexing and concurrency: Fewer TCP connections open means fewer resources spent.
- Header compression: Drastically reduced HTTP header sizes will save resources.
- Server push: This feature is not yet enabled; it’s still in the evaluation phase. It may be beneficial for rendering, but we don’t have anything specific to say about it at this point.
Does my site support HTTP/2? It may; Cloudflare has a blog post that explains how you can check whether your site supports it. Or you can ask your host and/or developer to check for you.
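As a quick check, a short script like the sketch below (using Python's httpx library with its HTTP/2 extra installed) can report which protocol a server negotiates; the example.com URL is a placeholder.

```python
# quick HTTP/2 support check; requires: pip install "httpx[http2]"
import httpx

with httpx.Client(http2=True) as client:
    response = client.get("https://example.com/")  # placeholder URL
    # prints "HTTP/2" if the server negotiated h2, otherwise "HTTP/1.1"
    print(response.http_version)
```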
Opt in or out. There is no way to opt in; this is automatic, and you cannot force Google to crawl your site over HTTP/2. But you can opt out for now, Google said. To opt out, have your server respond with a 421 HTTP status code when Googlebot attempts to crawl your site over h2. If that’s not feasible at the moment, you can send a message to the Googlebot team, Google said.
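For illustration only, here is a minimal sketch of the 421 opt-out in an ASGI application, assuming your site runs behind an HTTP/2-capable ASGI server such as Hypercorn; in most deployments this check would live in the web server or load balancer configuration instead, and the user-agent match shown here is a simplification.

```python
# hedged sketch: answer 421 (Misdirected Request) to Googlebot requests over h2
# so crawling falls back to HTTP/1.1; serve with an HTTP/2-capable ASGI server
async def app(scope, receive, send):
    if scope["type"] != "http":
        return
    headers = dict(scope["headers"])  # lowercased byte names per the ASGI spec
    user_agent = headers.get(b"user-agent", b"").decode()
    if scope.get("http_version") == "2" and "Googlebot" in user_agent:
        await send({"type": "http.response.start", "status": 421, "headers": []})
        await send({"type": "http.response.body", "body": b""})
        return
    await send({
        "type": "http.response.start",
        "status": 200,
        "headers": [(b"content-type", b"text/plain")],
    })
    await send({"type": "http.response.body", "body": b"Hello"})
```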
Will you know? Google said it may show you a message in Google Search Console when it switches to HTTP/2 crawling. Google wrote “when a site becomes eligible for crawling over h2, the owners of that site registered in Search Console will get a message saying that some of the crawling traffic may be over h2 going forward.” Google added that “you can also check in your server logs” for this.
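If you want to check your logs, a rough sketch like this one could count Googlebot requests that arrived over h2; the log path and combined log format are assumptions, so adapt them to your server.

```python
# hedged sketch: count Googlebot requests served over HTTP/2 in an access log
# the path and combined log format are assumptions; adjust for your server
import re

# matches request lines like "GET /page HTTP/2.0" on entries that mention Googlebot
pattern = re.compile(r'"[A-Z]+ \S+ HTTP/2(?:\.0)?".*Googlebot')

with open("/var/log/nginx/access.log") as log:
    h2_hits = sum(1 for line in log if pattern.search(line))

print(f"Googlebot requests over HTTP/2: {h2_hits}")
```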
Why we care. For larger sites, making crawling more efficient can ease the load on server resources and hosting budgets. It is also important to know how Googlebot is adapting and improving over time.