It’s a done deal – Google has announced that it is now using site speed as one of the roughly 200 signals used in deciding search results. Apparently it had been testing this for a while, and made the announcement today.
I think that the focus on the user experience is a good idea, but depending on the details, this may not be the best implementation. Who knows what weight site speed carries among the 200 signals?
A couple of things come to mind:
- There are several great gardening blogs out there, http://www.gardeningblog.net/ for example. What makes them great? Not only the writing – but all of the great pictures! So is a blog that works to show lots of timeline growth photos going to be penalized to some degree? It sure sounds that way.
- Are blogs in general going to be punished? Think about any blog or journal out there. Most of them have a long home page – because it allows the user to scroll down and read several entries sequentially. Guess what – they will have a longer load time.
- A lot of web sites now rely on outside sources for content – e.g. YouTube videos and RSS news feeds. These can also slow down page loads, but overall they enhance the user experience.
- Your web host – some are faster than others, and you won’t really know until you try them. I recently set up a site where I paid for a year upfront, and then learned after the fact that the web host is slower than I would like, even though it is a popular hosting company.
- Reliance on Google’s own tools on your site can also slow down site speed – Google Analytics and Google AdSense, for example. It is unclear whether any allowance is made for Google’s own toolset. I hope Google doesn’t hold our left hand while slapping our right.
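On the web host point above: if you want a rough feel for a host’s speed before committing to a year upfront, one simple approach is to time full page fetches with `curl`. This is just a sketch – the URLs below are placeholders for pages on the hosts you’re comparing, and a single fetch is noisy, so run it a few times.

```shell
#!/bin/sh
# Rough host speed comparison: time a full fetch of the same kind of page
# on each candidate host. Replace these placeholder URLs with your own.
for url in "https://example.com/" "https://example.org/"; do
  # -o /dev/null: discard the body; -s: silent mode;
  # -w '%{time_total}': print the total transfer time in seconds
  t=$(curl -o /dev/null -s -w '%{time_total}' "$url")
  echo "$url took ${t}s"
done
```

Repeating the loop at different times of day matters for shared hosting, since your neighbors’ traffic can swing the numbers considerably.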
I would really like to see them use speed comparisons of similar websites, instead of possibly measuring against a benchmark of some sort.
Here are some highlights from the comments on the announcement:
Navi Arora said…
“Now i have to move my websites from Shared Hosting to VPS :(”
john bishop images said…
“…My site (johnbishopimages.com) also shares resources on my webhost (bluehost.com) with other sites because this is what I can afford. Are you going to penalize sites because they can’t afford dedicated hardware resources? Kinda flies in the face of many of Google’s other initiatives!
How important will this signal play in page ranking? If it plays a major role, you are penalizing the little guy who is trying to get a start and rewarding the larger corporate sites because they can afford lots of iron and content servers spread across the countryside – not exactly a level playing field and one that, on the surface, this signal seems to perpetuate.”
“I’m removing Google Analytics code from all my sites – it’s very slow and WMT always shows it as one area of improvement. Google AdSense code often renders excruciatingly slow. I guess, it’s gotta go, too.”
“The least they can do in my opinion is to tell a webmaster if a page ranking or website ranking is affected by its page loading times.”
“Shame. Officially, Google set a ranking factor not to increase relevancy but to reduce crawling cost.”
So what are your thoughts? Overhyped? Or are ya maybe overwhelmed?