Crime and punishment – Google style

This is a commentary on selecting optimisation tactics for a website – what is “safe”, productive or cost-effective – in light of Google’s Search Essentials.

Some people say that search engine optimisation is not exactly the sort of business a fellow gets into looking for stability and consistency. I’d argue that it’s a lot more consistent than many trades, at least as long as you’re not one of those who likes to flirt with danger.

And as a consultant, flirting with danger doesn’t seem to me to be the best philosophy when the stakes are your client’s livelihood. For a site owner, who may be less familiar with the risk/reward of various tactics, it’s equally unwise.

The truth is that the “acceptable” practices for optimising a site’s visibility to search engines haven’t changed much in the relatively short life of the Internet. Certainly there have been exceptions, but I think it’s safe to say that any practice that was once allowed and is now frowned upon by search engines fell from grace because of abuse.

Injecting some common sense

Just a few of the previously acceptable practices that can now cause problems for a site include reciprocal links, exact-match domain names (EMDs), keyword-rich anchor text, and ads plastered all over a page (specifically, above the fold). That’s not to say those things can no longer be done at all – simply that excessive use is no longer tolerated. Unfortunately, the threshold of “excessive” is up for debate.

As search engines’ algorithms have evolved, their ability to spot patterns of behaviour has developed considerably. A reciprocal link or two with a couple of sites isn’t going to do you any harm. If a significant percentage of your links are reciprocal, however, that’s almost certainly a different story.

Of course, “significant” is highly subjective, and the search engines aren’t inclined to share the thresholds embedded in their algorithms. So all we can do is make educated guesses, based on our observations. Even then, there’s still an element of risk in employing any practices that go against the webmaster guidelines.

Since Google is the search engine we tend to deal with the most, its guidelines are the most often cited; and since they’re also the most stringent, they’re usually the ones we find ourselves crosswise of when we’ve been caught tempting fate.

I think most SEOs would agree that the “safest” way to proceed is simply to ensure you’re always operating within the acceptable limits of the gospel according to Google. I think most would also agree, though, that the safe method isn’t always the fastest. And for those of us who consult for clients, one of the most common complaints is that “it’s taking too long”.

So What’s the Best Course?

Depending upon circumstances, there may be a number of tactics that can get quicker results, but there’s usually some element of risk involved. The site owner needs to be fully aware of those risks and make the final decision on whether or not to employ any risky tactics.

Aside from the obvious risks of a penalty or filter affecting the site’s ranking and traffic, the client also needs to understand that short-fused results usually have a correspondingly short lifespan – lasting results simply take longer to realise.

An understanding of how algorithms work can be helpful in deciding how to proceed, too. At bottom, an algorithm is a mathematical formula applied to the pages being examined; a simple detection mechanism, by contrast, is essentially just a true/false test.

Google probably runs a number of true/false detection programs, but it’s a safe bet that the vast majority of its evaluation is considerably more sophisticated. That makes it probable that most of its algorithms render a pass/fail grade based on some sort of threshold. Whether those thresholds are based upon percentages or absolute numbers is a matter of conjecture, although observation suggests that most are percentage-based.
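To make that distinction concrete, here’s a minimal sketch in Python. The metric names, the hidden-text flag and the 5% threshold are all invented for illustration – nobody outside Google knows the real signals or numbers.

def boolean_detection(page):
    # A pure true/false test: the page either trips the signal or it doesn't.
    return page["uses_hidden_text"]

def threshold_detection(page, threshold=0.05):
    # A graded test: fail the page only once a percentage-based metric
    # crosses a line, rather than on the mere presence of the signal.
    ratio = page["reciprocal_links"] / page["total_links"]
    return ratio > threshold

page = {"uses_hidden_text": False, "reciprocal_links": 12, "total_links": 300}
print(boolean_detection(page))    # False
print(threshold_detection(page))  # False - 4% is under the assumed 5% line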

In addition, I think there’s an even more important consideration: even if a site falls short of the threshold in one regard, it may still be at risk if it also approaches the threshold in another. Here, I think that “the whole is greater than the sum of its parts” may apply.

Suppose, for instance, that the “allowable” threshold for reciprocal links in a site’s link profile is 5% and a site is at 3%, and that the threshold for “allowable” exact-match (EMD) anchor text is 10% and the same site is at 8%. Individually, both measures pass, yet the search engine might evaluate them in aggregate and find that the site has exceeded an acceptable level of deviation from its guidelines. Observation of many sites that have suffered significant losses of ranking suggests that this is probable.

If so, then even “colouring within the lines” can still be hazardous when a site is approaching the limits in two or more areas. It may also follow that the more areas in which a site is out of compliance with the guidelines, the less deviation is permissible in each, because together they create an aggregate score that can put the site over the collective threshold.
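If that’s right, the arithmetic might look something like the sketch below. The 5% and 10% thresholds come from the example above; the idea of scoring each metric as a fraction of its own threshold, and the 1.0 collective cap, are inventions of mine to illustrate the conjecture, not anything Google has confirmed.

# Hypothetical thresholds from the example above - pure conjecture.
THRESHOLDS = {"reciprocal_links": 0.05, "emd_anchor_text": 0.10}

def aggregate_score(site):
    # Score each metric as a fraction of its own threshold, then sum.
    # Every ratio below 1.0 passes in isolation, but the ratios add up,
    # so two near-misses can still breach the collective limit.
    return sum(site[metric] / limit for metric, limit in THRESHOLDS.items())

site = {"reciprocal_links": 0.03, "emd_anchor_text": 0.08}
print(aggregate_score(site))        # 0.6 + 0.8 = 1.4
print(aggregate_score(site) > 1.0)  # True - over the assumed collective cap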

Understand, this is only theory, based upon observation of many sites in various situations, some of which were out of compliance in several areas. But it is still conjecture, so if you care to test it, do so with caution.

Summary

As I said, some folks think Google makes us walk an optimisation tightrope, but I think that’s only true if you’re pushing the limits. Few practices have genuinely moved from acceptable to forbidden fruit, and in my opinion, those changes were all prompted by extensive abuse.

If you or your client want to stay safe, then play by the “rules” – the search engines’ guidelines. If, on the other hand, you can’t afford the luxury of slower progress, then at least be sure that you or your client are fully aware of the possible consequences. Be aware, too, that any tactic that’s presently allowed could suddenly be frowned upon if it sees widespread abuse.

It’s much easier to stay out of trouble than it is to get out of trouble.

Rob Jennings

When he found himself in a business conversation with someone talking about their ‘customer-centric core competencies’, he realised it was time to create a digital agency that was less about self-promoting buzzwords and more about the practical endeavour to assist clients in making effective use of the web.