Google Caffeine ushers in a new era of SEO


A large team at Google has spent the last several months developing Google Caffeine, the next-generation architecture for Google’s web search. It is the first step in a development effort that will let Google push the envelope on index size, indexing speed, accuracy, comprehensiveness and other dimensions.

The new infrastructure appears to give increased weight to domain authority and to certain authoritative tag-type pages, perhaps slightly more weight to exact-match domain names, a somewhat better understanding of related words and synonyms, and reduced exposure for video and some universal search results.

SEO firms are studying and diagnosing unprecedented shifts in organic rankings that could change the outlook for both rankings and revenues. Although you should never be too heavily invested in one conversion path or marketing medium alone, the plain truth for many businesses is that organic traffic is one of their most cost-effective sources of conversions.

It is possible that a site that held top-10 positions for years could slip considerably and require a complete revamp of its links, off-page factors and on-page tweaks to stay in good standing with the new caffeinated algorithm.

Which SEO techniques remain relevant will depend on one significant factor: how natural authority, links and other search engine ranking factors can validate the originality of intent.

It is logical to expect that the “new caffeinated filters” will identify patterns of automation, parsing and purging the existing index by observing timestamps, IP ranges, semantic anchor-text clusters and other unique signatures.
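To make that idea concrete, here is a minimal sketch of how such pattern detection could work in principle. This is purely illustrative: the link records, the /24 grouping, and the thresholds (`min_cluster`, `max_spread`) are all assumptions of mine, not anything Google has disclosed about Caffeine's actual filters.

```python
from collections import defaultdict

# Hypothetical backlink records: (source IP, anchor text, crawl timestamp).
# Real signals would be far richer; these fields mirror the ones named above.
links = [
    ("203.0.113.5", "cheap widgets", 1000),
    ("203.0.113.6", "cheap widgets", 1001),
    ("203.0.113.7", "cheap widgets", 1002),
    ("198.51.100.9", "widget reviews", 5000),
]

def flag_automated_clusters(links, min_cluster=3, max_spread=10):
    """Group links by /24 IP range and anchor text, then flag clusters
    that appeared within a suspiciously narrow time window."""
    clusters = defaultdict(list)
    for ip, anchor, ts in links:
        subnet = ip.rsplit(".", 1)[0]  # crude /24 grouping
        clusters[(subnet, anchor)].append(ts)
    flagged = []
    for key, stamps in clusters.items():
        if len(stamps) >= min_cluster and max(stamps) - min(stamps) <= max_spread:
            flagged.append(key)
    return flagged

print(flag_automated_clusters(links))  # [('203.0.113', 'cheap widgets')]
```

Three identical anchors from neighboring IPs within seconds of each other look machine-made; a lone, differently worded link does not. A production filter would combine many such weak signatures rather than rely on any single one.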

As exciting as the news is, this could spell disaster for some businesses that thrive as a result of their organic positioning. Much of the SEO process depends on having a solid reference point from which to anchor and assess other related signals to find your bearings.

Without knowing which metrics to assess first, countless hours could be burned in an attempt to identify signals that produce consistent results. I wouldn’t call it a setback so much as a challenge for those who were resting on their success.

For sites that have consistently strived to create exclusive content, apply relevant internal links and follow best practices, I suspect you will start to see significant increases in the range and volume of monthly referrals from Google as a result of the new crawler technology. Essentially, you have nothing to fear; the less diligent gray areas are the most likely target of this revision, which attempts to reclaim the web from excessive amounts of spam.
