November 8, 2015

Unlike, say, gravity, nuclear fusion and thermodynamics, SEO is not a law of nature. It follows no absolute rules that hold true for all time. When I am asked how much traffic my optimization methods could generate, I offer an answer that will most likely be proven wrong: I will either be wildly successful – or not.

Luckily, I’ve mostly erred on the side of wildly successful, blowing most estimates completely out of the water…but there is always the fear of not succeeding as much as I think I should have.

I’ve encountered situations where I quadrupled traffic, and felt like a loser, because I expected twice that. My error was not in optimizing the website: my error was in the estimate.

In SEO, there are no guarantees.

Google, the 800 lb gorilla of search engines, defines its own laws of search engine physics. And those laws can change at any time. “Cloaking”, or serving the search engines content that may differ from what the user sees, was just an SEO technique that worked – until it didn’t.

The shot fired across the bow of the SEO world came in 2006, when BMW’s German site was banned from Google results for cloaking. There was a collective “Whoa!” heard across the SEO world. A major website was suddenly unfindable. It also revealed the power that Google wielded. Like, who would actually type in “BMW.DE”? Not even a dotcom domain. That’s right: nobody. Or maybe just internal BMW folks. Can we just round that down to zero?

That marks the moment SEO officially became hard. All the black-hat techniques were now suspect, and using anything that wasn’t squeaky-clean white-hat SEO put websites, and clients, at risk.

In SEO, everything is fair game – until it’s not. Thin content was just an SEO technique, employed by the likes of the eHows of the world – until it wasn’t. Until Google decided it wasn’t. The rules of Google are mutable and mysterious, which grants SEO professionals the mantle of “artist,” as in “SEO is the art of optimizing for search engines.”

I don’t think that’s particularly correct. I think it’s better to be a scientist about it: create a hypothesis, then test rigorously. Don’t trust what Google puts out in press releases and blog posts. For example, their post on being able to crawl Ajax websites was not accurate. I personally experienced this on a client website. Developers were telling me that Google had stated its search engine bot could now read Ajax, and that many articles said the same. Emphatically, I responded:

‘That is incorrect. I know you read that, but in practice that is incorrect.’

The same goes for human-readable words in URLs. I know from experience that human-readable URLs are a primary ranking factor. That strings of incomprehensible IDs just don’t fly, McFly. I’ve been challenged by developers on this point as well; they would cite that ‘…numerous documents state this…’ But if you think about it in terms of human factors, it makes sense.

Google has no problem reading URLs that are incomprehensible gibberish to the rest of us – why would it? It’s simply code. But humans need words to communicate. Words describe – and a URL that is easy to say, easy to spell, and reads like a sentence has human factors that encourage interaction.
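To make the contrast concrete, here is a minimal sketch (the domain, paths, and slugify helper are all hypothetical) of an opaque, ID-based URL versus a human-readable slug built from the page title:

```typescript
// Hypothetical example: turning a page title into a readable URL slug.
function slugify(title: string): string {
  return title
    .toLowerCase()
    .normalize("NFKD")                 // split accented characters
    .replace(/[\u0300-\u036f]/g, "")   // drop the accents
    .replace(/[^a-z0-9]+/g, "-")       // non-alphanumerics become hyphens
    .replace(/^-+|-+$/g, "");          // trim stray hyphens
}

// Opaque: readable only to the CMS.
const byId = "https://example.com/index.php?cat=7&id=83471";

// Readable: easy to say, easy to spell, reads like a sentence.
const bySlug = `https://example.com/guides/${slugify("How to Estimate SEO Traffic")}`;
// -> https://example.com/guides/how-to-estimate-seo-traffic
```

Both point at the same content; only one of them means anything to the person who has to read, remember, or share it.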

Google is like Pinocchio: a toy who wants to be a boy. In our case, Google is a machine trying to think like a human being, trying to figure out what matters most to human beings. An SEO strategist, on the other hand, is a human being trying to think like a machine that’s trying to think like a human being. Which metrics would translate to human factors? Time on page implies content valuable enough to take the time to scroll, read and digest. Page views imply content depth – interesting enough to encourage a clickthrough. A bounce implies thin content, and a lack of depth not worth further viewing. Playing to those metrics then plays to optimization techniques.
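A rough sketch of how those human-factor signals might be captured on the client side (the /metrics endpoint and field names are my own assumptions, not anything Google or a specific analytics tool prescribes):

```typescript
// Track time on page and scroll depth, then report them when the reader leaves.
let maxScrollDepth = 0;
const start = performance.now();

window.addEventListener("scroll", () => {
  const scrollable = document.documentElement.scrollHeight - window.innerHeight;
  if (scrollable > 0) {
    maxScrollDepth = Math.max(maxScrollDepth, window.scrollY / scrollable);
  }
});

// sendBeacon is designed to survive page unload, so the last signal isn't lost.
document.addEventListener("visibilitychange", () => {
  if (document.visibilityState === "hidden") {
    const payload = {
      page: location.pathname,
      secondsOnPage: Math.round((performance.now() - start) / 1000),
      scrollDepth: Math.round(maxScrollDepth * 100), // percent of page seen
    };
    navigator.sendBeacon("/metrics", JSON.stringify(payload)); // assumed endpoint
  }
});
```

A page that holds readers will show long dwell times and deep scrolls; a thin page will show neither, which is exactly the bounce pattern described above.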

You have to read between the lines of recommendations. For example:

Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link

Why? Why text links? Why not javascript hide/show links, or flash links or…whatever?

One reason is to serve the blind and visually impaired. We are not in a completely “abled” world. Screen readers for the blind stumble on javascript-only links. So that super-advanced website, all Ajaxy and shit, might be invisible to screen readers. Great, you’ve just made the blind more blind. How does that feel? There are other reasons beyond ADA adherence, but being a decent website steward should be enough; ranking lower is just the stick to the ranking-higher carrot.
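Here is a hypothetical sketch (paths and labels made up) of what a static text link buys you compared to a script-only one:

```typescript
// Good: a plain anchor with an href. Crawlers can follow it and screen
// readers announce it as a link, whether or not JavaScript runs.
const goodLink = document.createElement("a");
goodLink.href = "/services/technical-seo";       // assumed example path
goodLink.textContent = "Technical SEO services";
document.body.appendChild(goodLink);

// Risky: a div that only navigates via a click handler. There is no href to
// crawl and no link role for assistive technology, so if the script fails or
// never runs, the page behind it is effectively invisible.
const riskyLink = document.createElement("div");
riskyLink.textContent = "Technical SEO services";
riskyLink.addEventListener("click", () => {
  window.location.assign("/services/technical-seo");
});
document.body.appendChild(riskyLink);
```

The anchor works for everyone by default; the div only works for users running your script, and says nothing to anyone else.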

The goal of Google can be stated as: “No more crappy website results.” If you steer clear of a “traffic-at-all-costs” mentality, you may find that there are plenty of so-called “techniques” available to rank higher.

The problem is: Google doesn’t tell you the rules of its game. It’s not in their best interest to do so. Once they let it be known that links were key to their ranking algorithm, search engine optimizers suddenly got into linking “techniques,” later known as “schemes.” SEOs are nothing if not opportunists. We don’t ask philosophical questions like: Why are they called apartments, when they’re all stuck together? We just live in them.

We are pragmatists: we use optimization techniques, all the while trying to stay on the right side of the law – the search engine algorithms. We try to future-proof websites so that they do not fall prey when Google decides that what was once a “legal” technique is now “illegal.” At its best, our pragmatism extends to preventing damage as much as to accumulating traffic.

In that sense, I guess it is an art: tiptoeing the knife-edge of the thin grey line separating decent search results from sucky ones. Google is the caretaker of results, with the power to withhold traffic, reserving the right to taketh away what it has so generously giveth.

For the SEO, the goal is to set aside the fear of traffic loss and proceed with effective strategies.

When a client asks what kind of traffic they can expect, I give them my best estimate based on past experience, weighing the quality of their website against my mental list of comparable websites, the competitive environment and the state of Google at the moment. I state it knowing that, in the end, I will most likely be wildly wrong, one way or the other.
