Speed will likely soon matter not only to users expecting real-time information, but also to webmasters and SEOs who wish to rank well in the search results. In November last year, Google guru Matt Cutts revealed that page loading time is being discussed internally as a possible ranking factor for organic results. And since Google shortly afterwards beta-launched functionality in Webmaster Tools to analyze page loading time, it seems only a matter of time before it is implemented. So why is Google interested in a speedier web, and what does it mean for webmasters who can’t afford a more powerful hosting solution?
Hurry — Or Lose Your Rank
From a webmaster’s perspective, a fast site always makes sense: it provides a better overall user experience, enticing more clicks and page views and reducing the number of shopping carts abandoned by frustrated users. From Google’s point of view, more page views mean more advertising inventory for its AdSense program, while more conversions on Google-Checkout-connected sites mean more transaction fees. Both ultimately lead to more Google revenue and a rising stock price.
Since it’s hard to argue against the benefits of a speedier site, Google rewarding snappy sites might look like a no-brainer and a win-win for all involved. This isn’t the first time Google has considered speed as a relevant metric for search. If you’re an AdWords advertiser, site speed should already be part of your optimization process, as it’s a factor in the Quality Score algorithm that ultimately affects paid search campaign performance. For organic results, however, it’s a new concept. Some will argue that it creates an uneven playing field between larger and smaller sites and businesses by raising web hosting requirements.
What About The Little Guy?
The big question is how page loading time will be weighted in the organic ranking algorithm. Smaller sites run by local stores or niche specialists might be highly relevant for a competitive user query thanks to good work on traditional SEO (such as link building and content), but because of small budgets, these sites are often served from relatively slow shared hosting. Their larger competitors (all other SEO being equal), with budgets that can afford super-fast dedicated local servers, could gain an edge in the rankings. If site speed becomes a heavily weighted ranking factor, end-users may also lose out, if they would ultimately prefer more relevant information from a slightly slower site.
However, considering that site speed will be only one of several hundred factors that determine the natural rankings, it’s not likely to become a “small site killer.” One way Google could avoid this would be to weight site speed more heavily for sites and queries where it matters to all stakeholders. For a site running AdSense, where increased page views are key to driving revenue for both Google and the site owner, the speed factor could be up-weighted, compared with purely informational sources, where a difference within an acceptable speed range isn’t going to matter as much to either party or the end-user.
Site Speed Optimization
Page loading speed isn’t merely about hosting; a hosting upgrade is only one step in the site speed optimization process. Before considering one, it’s important to take a closer look under the hood. A poorly optimized site with bloated code, browser-resized images, and poor caching is likely doing more harm to loading speed and user experience than cheap hosting is.
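Poor caching in particular is easy to spot from response headers. As a minimal sketch (the header names are standard HTTP, but the checks and function names here are illustrative, not a definitive audit), a script could flag static assets that browsers are forced to re-download on every visit:

```python
# Minimal sketch: given a dict of HTTP response headers for a static
# asset, flag common caching problems. The checks are illustrative,
# not an exhaustive audit.

def caching_issues(headers):
    """Return a list of likely caching problems for a static asset."""
    issues = []
    # Normalize header names to lowercase for lookup.
    h = {k.lower(): v for k, v in headers.items()}
    if "cache-control" not in h and "expires" not in h:
        issues.append("no Cache-Control/Expires: browsers re-fetch on every visit")
    if "no-store" in h.get("cache-control", ""):
        issues.append("no-store on a static asset defeats caching entirely")
    if "etag" not in h and "last-modified" not in h:
        issues.append("no validator (ETag/Last-Modified): conditional requests impossible")
    return issues

# Example: a poorly configured response vs. a well-cached one.
bad = {"Content-Type": "image/png"}
good = {"Cache-Control": "public, max-age=31536000", "ETag": '"abc123"'}
print(caching_issues(bad))   # two issues flagged
print(caching_issues(good))  # []
```

Running a check like this across a site’s images, stylesheets, and scripts often reveals quick wins that cost nothing compared to a hosting upgrade.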
A good starting point is the aforementioned Webmaster Tools functionality, which benchmarks your site against a large sample and indicates the site’s relative performance along with page-specific issues. At the time of writing, this is the only indication available for guesstimating whether a site is likely to be affected by a speed-based ranking factor (at least negatively). If you’re running a blog or a smaller site (with no PPC, and where speed hasn’t previously been considered in depth), the benchmark might hold a surprise, since perceived loading time can differ greatly from what the Google crawler records in comparison to the competition.
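To sanity-check the Webmaster Tools numbers yourself, you can time raw page downloads, which is closer to what a crawler records than to what a visitor perceives (no rendering, no images). A rough sketch, with illustrative function names and several samples per page since a single measurement is too noisy:

```python
# Rough sketch: time repeated downloads of a page's HTML. This measures
# the raw fetch only, not rendering, so it approximates what a crawler
# records rather than perceived loading time.
import statistics
import time
import urllib.request

def fetch_timings(url, samples=5):
    """Time several full fetches of a URL, in seconds."""
    timings = []
    for _ in range(samples):
        start = time.monotonic()
        with urllib.request.urlopen(url) as resp:
            resp.read()  # include the full download, not just first byte
        timings.append(time.monotonic() - start)
    return timings

def summarize(timings):
    """Median and worst case, in milliseconds."""
    return {
        "median_ms": round(statistics.median(timings) * 1000),
        "max_ms": round(max(timings) * 1000),
    }

# Usage against your own site:
#   print(summarize(fetch_timings("http://example.com/")))
# Demo with canned sample timings:
print(summarize([0.42, 0.38, 0.55, 0.40, 0.47]))  # {'median_ms': 420, 'max_ms': 550}
```

Comparing the median across your key pages, and against competitors’ pages, gives a quick sense of where you stand before the benchmark data does.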
These tools can only help with what’s visible from the client side. If there’s inefficient server-side code, such as an old WordPress template, a boatload of widgets that pull fresh data on every page view, or a poor database structure, site owners will need traditional developer tools to debug and optimize.
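The widget case has a common fix: cache the fetched data for a short interval instead of hitting the backend on every page view. A minimal sketch of a time-based cache, where `slow_widget_data` is a hypothetical stand-in for whatever expensive call the widget makes:

```python
# Minimal sketch of a time-based (TTL) cache around an expensive call.
# slow_widget_data is a hypothetical stand-in for a widget's data fetch.
import time

def make_cached(fetch, ttl_seconds=300):
    """Wrap fetch() so repeat calls within ttl_seconds reuse the last
    result instead of hitting the backend on every page view."""
    cache = {"value": None, "fetched_at": None}

    def cached():
        now = time.monotonic()
        if cache["fetched_at"] is None or now - cache["fetched_at"] > ttl_seconds:
            cache["value"] = fetch()
            cache["fetched_at"] = now
        return cache["value"]

    return cached

# Demo: a counter shows how often the "backend" is actually hit.
calls = {"n": 0}
def slow_widget_data():
    calls["n"] += 1
    return "latest headlines"

get_data = make_cached(slow_widget_data, ttl_seconds=300)
for _ in range(100):   # 100 simulated page views within the TTL window
    get_data()
print(calls["n"])      # 1 -- one backend hit instead of 100
```

Five minutes of staleness is usually invisible to visitors, while the server does a fraction of the work per page view.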
Now is the time to consider your options and do the groundwork, and in the process you will create happier site visitors.