Month: January 2013


Crime and Punishment – Google Style

This is nothing more than a commentary on selecting optimisation tactics for a website – what is “safe”, productive or cost-effective in light of Google’s Webmaster Guidelines.

Some people say that search engine optimisation is not exactly the sort of business a fellow gets into looking for stability and consistency. I’d argue that it’s a lot more consistent than many trades, at least as long as you’re not one of those who like to flirt with danger.

And as a consultant, flirting with danger doesn’t seem to me to be the best philosophy when the stakes are your client’s livelihood. As a site owner, possibly with less familiarity with the risk/reward involved in various tactics, it’s equally unwise.

The truth is, the “acceptable” practices for optimising a site’s visibility to search engines haven’t really changed that much in the relatively short life of the Internet. Certainly, there have been exceptions, but I think it’s safe to say that any practice that was once allowed and is now frowned upon by the search engines (SEs) fell from grace because of abuse.

Injecting Some Common Sense

Just a few of the previously acceptable practices that can now cause problems for a site include reciprocal links and EMDs (exact-match domain names), as well as keyword-rich anchor text and ads plastered all over a page (specifically, above the fold). That’s not to say, however, that those things can no longer be done at all – simply that excessive use is no longer tolerated. Unfortunately, the threshold of “excessive” is up for debate.

As the search engines’ algorithms have evolved, their ability to spot patterns of behaviour has developed considerably. A reciprocal link or two with a couple of sites isn’t going to do you any harm. If a significant percentage of your links are reciprocal, however, that’s almost certainly a different story.

Of course, “significant” is highly subjective, and the search engines aren’t inclined to share the thresholds embedded in their algorithms. So all we can do is make educated guesses, based upon our observations. Even then, there’s still an element of risk in employing any practices that go against the webmaster guidelines.

Since Google is the search engine we tend to deal with the most, their guidelines are the most cited, and frankly, as theirs are also the most stringent, they’re usually the ones we find ourselves on the wrong side of when we’ve been caught tempting fate.

I think most SEOs would agree, the “safest” way to proceed is to simply ensure you’re always operating within the acceptable limits of the gospel according to Google. I think most would also agree, though, that the safe method isn’t always the fastest. And for those of us who are consulting for clients, one of the most common complaints is that “it’s taking too long”.

So What’s the Best Course?

Depending upon circumstances, there may be a number of tactics that can get quicker results, but there’s usually some element of risk involved. The site owner needs to be fully aware of those risks and make the final decision on whether or not to employ any risky tactics.

Aside from the obvious risks of a penalty or filter affecting the site’s ranking and traffic, the client also needs to understand that short-fused results usually have a correspondingly short lifespan – lasting results simply take longer to realise.

An understanding of how algorithms work can be helpful in determining how to proceed, too. An algorithm is essentially a set of mathematical rules applied against the pages being examined; a simple detection mechanism, by contrast, is little more than a true/false test, and there’s a fine line between the two.

Google probably runs a number of true/false detection programs, but it’s a safe bet that the vast majority of their evaluation is considerably more sophisticated. That makes it probable that most algorithms render a pass/fail grade based on some sort of threshold. Whether those thresholds are based upon percentages or absolute numbers is a matter of conjecture, although observation suggests that most are percentage-based.

In addition, I think there’s an even more important consideration: even if a site falls short of the threshold in one regard, it may still be at risk if it also approaches the threshold in another. Here, I think that “the whole is greater than the sum of its parts” may apply.

Suppose, for instance, that the “allowable” threshold for reciprocal links in a site’s link profile is 5% and a site is at 3%, and that the threshold for “allowable” EMD anchor text is 10% and that same site is at 8%. The SE might then aggregate the two and find that the site has exceeded an acceptable overall level of deviation from its guidelines. Observation of many sites that have suffered a significant loss of ranking suggests that this is probable.

If so, then even “colouring within the lines” can still be hazardous when a site approaches the limits in two or more areas. It may also be that the more areas in which a site is out of compliance with the guidelines, the less deviation is permissible in each – the combined effect creating an aggregate score that puts the site over a collective threshold.
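To make the theory concrete, here’s a minimal sketch of how such an aggregate score might be computed. Every threshold, metric name and function here is a hypothetical illustration of the conjecture above – none of these values comes from any search engine.

```python
# A toy model of the aggregate-threshold theory described above.
# All thresholds and metric names are hypothetical illustrations,
# not values taken from any search engine.

def aggregate_deviation(metrics, thresholds):
    """Sum each metric's usage as a fraction of its individual threshold.

    A site that is under every individual limit can still exceed 1.0
    in aggregate, which is exactly the scenario the theory describes.
    """
    return sum(metrics[name] / thresholds[name] for name in thresholds)

# Hypothetical limits: 5% reciprocal links, 10% EMD anchor text.
thresholds = {"reciprocal_links": 0.05, "emd_anchor_text": 0.10}

# The example site from the text: 3% and 8% respectively.
site = {"reciprocal_links": 0.03, "emd_anchor_text": 0.08}

score = aggregate_deviation(site, thresholds)
print(round(score, 2))  # 0.6 + 0.8 = 1.4 – over an aggregate ceiling of 1.0
```

The point of the sketch is simply that a site sitting at 60% and 80% of two separate limits could, under an additive scoring scheme, register 140% of a collective limit – compliant in every individual respect, yet over the line overall.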

Understand, this is only theory, based upon observation of many sites in various situations, some of which were out of compliance in several areas. But, it is still conjecture. So if you care to test it, do so with caution.

Summary

As I said, some folks think Google makes us walk an optimisation tightrope, but I think that’s only true if you’re pushing the limits. There are few things that have really been significantly changed from acceptable to forbidden fruit, and in my opinion, those have all been changes that were prompted by extensive abuse.

If you or your client want to stay safe, then play by the “rules” – the search engines’ guidelines. On the other hand, if you can’t afford the luxury of slower progress, then at least be sure that you or your client are fully aware of the possible consequences. It’s also advisable to remember that any tactic that’s presently allowed could suddenly be frowned upon if it sees widespread abuse.

It’s much easier to stay out of trouble than it is to get out of trouble.


Shared Web Hosting vs Dedicated Hosting vs VPS Hosting

We can often become overwhelmed with the number of options available when it comes to web hosting. There are so many sellers, resellers, oversellers – how do we cut through to find out what we really need?

It mostly comes down to three kinds of hosting option – shared hosting, dedicated hosting and VPS hosting. These all refer to the way your site is physically hosted on the host’s server. When considering hosting, we’re thinking not only about the amount of space our website needs, but also about how many people can view it at once and how much data can be downloaded from it per month – that is, server space and bandwidth.
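A rough way to size that bandwidth requirement is to multiply average page weight by pages viewed per visit and visits per month. The figures and the function name below are made-up illustrations, just to show the arithmetic:

```python
# Back-of-envelope estimate of monthly bandwidth needs.
# The page size and traffic figures are hypothetical examples.

def monthly_bandwidth_gb(avg_page_mb, pages_per_visit, visits_per_month):
    """Rough monthly transfer in GB: page size x pages viewed x visits."""
    return avg_page_mb * pages_per_visit * visits_per_month / 1024

# e.g. 1.5 MB pages, 4 pages per visit, 10,000 visits a month:
print(round(monthly_bandwidth_gb(1.5, 4, 10_000), 1))  # about 58.6 GB
```

An estimate like this makes it easier to judge whether a cheap shared plan’s advertised limit leaves any real headroom, or whether you’re already shopping in VPS territory.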

Shared Web Hosting

Shared hosting is the most common way to have a website hosted. The server itself is shared amongst a number of websites, relying on the fact that many smaller websites never use their full quota of space and bandwidth. Often, hosts offering shared web hosting are, in fact, resellers of server space bought elsewhere – which explains the large number of cheap hosting providers who offer very similar packages.

Shared web hosting can be a great option for personal sites and small businesses – it’s a cheap way to get your small site hosted, especially if you’re looking for a web presence but don’t expect to transfer a lot of large files (e.g. media streaming) or have lots of people trying to access your site at the same time.

Shared web hosting is not a good fit for larger businesses or e-commerce sites, though. Larger businesses may find that they have issues with bandwidth – even though sites on shared hosts may be offered a particular bandwidth limit, in reality this is often oversold, as hosts count on much of each site’s allocation going unused. So even though you may think your full limit is available, if all the other sites sharing your server experience high demand at the same time, your clients may not be able to access your site when required.

The prevalence of reselling also means that you may not know the physical location of the server your website is hosted on. And because sites on a shared server typically share the same IP address, simply sharing space with a disreputable site could mean your SEO ranking is affected. It’s possible to have a site on a shared server with a unique IP, though it usually costs more.

Dedicated Hosting

Dedicated hosting is just as it sounds – one server dedicated to one website. Dedicated hosting offers more choice regarding the operating system and management of the server, better security – both security of your data and of the physical location of your data – and is good for larger e-commerce businesses, corporations and high-bandwidth websites (such as video streaming).

VPS – Virtual Private Server

A Virtual Private Server is a server partitioned so that each partition operates as if it is a dedicated server. It’s a compromise between shared and dedicated hosting: cheaper than dedicated hosting, as the hardware is shared – yet appears to the client as a dedicated host.

VPS (and related Cloud technologies) rely, in the same way as shared hosting, on clients not using their total quota of space and bandwidth (which is predominantly the case in the real world). A good host will manage their servers to avoid traffic ‘bottlenecks’ or too much traffic for the server to handle.

VPS is good for medium to large e-commerce businesses as it combines a lower price with private service – though it isn’t suitable for all clients, as some software doesn’t run well in a virtual environment (other virtualisers or emulators, for example).

Conclusion

As with anything: research, research, research and remember you get what you pay for!

At Explainafide we’ve used dozens of different web hosts from across the world for a variety of clients with different needs, budgets and locations. We’ve finally settled on an Australian web hosting company, Rack Servers, based in Brisbane, who provide great service, technical know-how and servers that work 99% of the time.

Check Rack Servers out here…*

If you would like to get unbiased web hosting reviews then the Whirlpool forums are packed full of tech savvy people far more knowledgeable than I, all discussing the good, the bad and the downright unethical of web host companies large and small.

*I like the guys at Rack Servers so much I’ve started writing for them*