Should I serve Googlebot pages optimized for speed?
Summary: No. Serving Googlebot a faster, stripped-down page than regular users see is cloaking; any speed optimization should apply to everyone.
Matt's answer:
The answer to that question is no, and if I could reach through the screen and hit the Esc key, Ctrl+C, and Break, I would, because that's cloaking. You never want to do something completely different for Googlebot than you'd do for regular users; that's the definition of cloaking. If you have something in your code saying "if the user agent equals Googlebot, or if the IP address is Google's, do something different," that's cloaking.
Doing something different for Googlebot is the very definition of cloaking.
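To make the anti-pattern concrete, here is a minimal sketch of the kind of code the answer warns against, assuming a Flask app (the route and page contents are hypothetical):

```python
# WHAT NOT TO DO: serving Googlebot a different page than regular users.
from flask import Flask, request

app = Flask(__name__)

@app.route("/article")  # hypothetical route
def article():
    ua = request.headers.get("User-Agent", "")
    # Branching on the crawler's identity is the cloaking anti-pattern:
    if "Googlebot" in ua:
        # Stripped-down text page served only to the crawler.
        return "<html><body>Bare text version</body></html>"
    # Full page with CSS, JavaScript, and images for everyone else.
    return "<html><body>Full page with CSS, JS, images ...</body></html>"
```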
You might think: page load speed is a factor I should care about, so let's make things fast for Googlebot. But that is not the right approach, for two reasons:
- Google doesn't rely only on Googlebot to determine how fast a particular page or site loads, so speeding things up just for Googlebot wouldn't even work.
- If you change the content that gets indexed, people will look at the cached page, see that it's nothing but a text page, find that very strange, and complain that your site is cloaking.
Think about it: whenever you include CSS, JavaScript, or images, most of the time those resources are external, and Googlebot isn't necessarily going off to load them at that particular moment. Knowing that external resources exist doesn't mean Googlebot will fetch them all and incorporate them into the page.

But you do want to show the same page to users that you show to Googlebot. Don't do anything to try to speed things up only for Googlebot, because that's cloaking, and the risk of cloaking far outweighs whatever you'd gain by speeding things up for Googlebot alone.
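By contrast, here is a sketch of the safe approach, again assuming Flask (the route and header values are illustrative): apply speed optimizations such as caching and compression to every client, branching only on what the client supports, never on who it is.

```python
# The safe approach: one response for every client, with speed
# optimizations (caching headers, compression) applied uniformly.
import gzip
from flask import Flask, request, make_response

app = Flask(__name__)

@app.route("/article")  # hypothetical route
def article():
    html = "<html><body>The same full page for users and Googlebot</body></html>"
    resp = make_response(html)
    # Cache headers speed things up for *all* clients, crawler or not.
    resp.headers["Cache-Control"] = "public, max-age=3600"
    # Compress for any client that advertises gzip support; the check
    # is on capability (Accept-Encoding), never on identity.
    if "gzip" in request.headers.get("Accept-Encoding", ""):
        resp.set_data(gzip.compress(resp.get_data()))
        resp.headers["Content-Encoding"] = "gzip"
        resp.headers["Vary"] = "Accept-Encoding"
    return resp
```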
by Matt Cutts - Google's Head of Search Quality Team