03 Apr 2012

Is White Hat SEO Dead?

Several years ago, when I started learning about search engine optimization (SEO) in detail, the emphasis was always fundamentally on improving a website for the human visitor. A site owner should strive to provide great content, make that content easily found, and, in the spirit of the web, link out to other resources to further enhance the user’s experience in searching for information.

This approach to site design obviously had implications for search engines. Providing great content meant that each page had to be thematically consistent and relevant to the overall theme of the site. The content also had to be accessible for the search engine to consume, which primarily meant emphasizing text-based content (the form most easily consumed by search engine crawlers) rather than relying heavily on binary media such as images, audio, and video.

Making content easily found meant that the architecture and navigation schemes were important considerations for a site. For example, burying content many levels deep in a directory structure implied that it was much less important than content located closer to the site root. Using GUIDs in folder and file names conveyed no helpful information about the content they contained. On the other hand, using an ample number of intra-site links between relevant pages buttoned up the content within a site nicely, helping users really benefit from a well-thought-out site content plan.
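As a hypothetical illustration (the paths and anchor text below are invented, not taken from any real site), compare two intra-site links, one pointing at a shallow, descriptively named page and the other at a deeply buried, GUID-named file:

    <a href="/widgets/blue-widget-care-guide/">Blue widget care guide</a>
    <a href="/content/2009/q3/a7/3f8a9c2e-1b42-4d7e-9c31-0d2d41f6b8aa.html">Click here</a>

The first tells both a visitor and a crawler what to expect before the page even loads; the second tells them nothing.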

Lastly, linking to other pages extended beyond the intra-site scope. The web is a vast place of interconnected sites, sharing information and value among one another for the benefit of human readers. And of course, the quality and relevance of those outbound links were important. In turn, the quality, relevance, and, to a degree, the number of links your site receives from other sites reflect the perceived value of your site to others on the web.

Of course, there are hundreds upon hundreds of factors that go into how a site earns rank. But since search engine crawlers are stupid compared to people, webmasters can make provisions, such as employing useful metadata text, to describe the content on the page to search engines. This is part of the process through which keywords are established for evaluating a page’s relevance to a query.
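As a simple sketch (the page name and wording here are invented for illustration), that kind of descriptive metadata typically lives in the document head:

    <head>
      <title>Blue Widget Care Guide | Example Widgets</title>
      <meta name="description" content="How to clean, store, and repair a blue widget, with a printable maintenance checklist.">
    </head>

A crawler can read that text directly, which is exactly the sort of provision for less-than-clever crawlers described above.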

So SEO is fundamentally the practice of helping search engines understand what a site is all about, giving them information for establishing how the site might be relevant to a user’s query, and demonstrating why the content on the site should be judged valuable.


Defending against web spammers

Of course, as soon as such a framework for helping search engines understand the contents of a site more accurately appeared, there were those ready to actively exploit it for malicious manipulation. Web spammers wanted instead to cheat their way to the top rank, to earn the greatest visibility for their pages in search regardless of their relevance to the query. The search engines had to respond to that threat to the integrity of their services. Their goal is to provide the best possible results for a user’s query, and their stack-ranked results have to be relevant, informative, and valuable in order to keep searchers coming back. Thus the web spam wars began.

In days past, it was pretty clear what was white hat (legitimate) versus black hat (malicious) SEO. Yes, the border between the two could be a bit blurry and gray at times, but the intent of specific techniques such as keyword stuffing, hidden text, cloaking, and many others was clearly malicious manipulation of the search engine index and ranking algorithms.


It’s a new day in SEO – or is it?

On March 10 at the SxSW conference in a session called “Dear Google & Bing: Help Me Rank Better!,” Matt Cutts of Google uncharacteristically pre-announced a coming algorithm update. This change, as Cutts explained, is intended “to sort of make that playing field a little bit more level” for sites in search. It was presented as a penalty for over-optimization, presumably a diminishment of page rank for sites that do SEO to excess, for the benefit of those who do not do SEO.

The problem here is that there’s no clear definition of what is meant by “over-optimization” (not yet, and who knows if there ever will be). Will Google actually penalize a site for having clear, concise, and well-formed <title> tag text? Will the repetition of a keyword one too many times trigger this new penalty? How many times is too many? More than once? Will having descriptive <img> alt attribute text knock a page off of Page 1 of the search engine results pages (SERPs)? Cutts tossed out possible examples of over-optimization: “…whether they throw too many keywords on the page, or whether they exchange way too many links, or whatever they are doing to sort of go beyond what a normal person would expect in a particular area.” (Who is this normal person, anyway?)
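To make the contrast concrete (an invented example, since Google has offered no definition of the threshold), compare a descriptive alt attribute with a keyword-stuffed one:

    <img src="blue-widget.jpg" alt="Blue widget shown next to its storage case">
    <img src="blue-widget.jpg" alt="blue widget widgets cheap blue widget buy blue widgets online">

The first is exactly the kind of white hat markup this article is worried about; the second is presumably the sort of excess a “normal person” would not expect.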

I fully understand Google’s reluctance to go into genuine detail here. The black hats would pounce all over any specificity they reveal. But my bigger question is whether we may be seeing the elimination of what was commonly known as white hat SEO – legitimate efforts to help clarify the definition of the page content, provide descriptive keywords for relevance, and the like. Is it Google’s intention to penalize a page for even having <title> tag text? Do we really want to go back to the days when a sizable percentage of <title> tags used the text “Page 1”? That level of incomplete page development was certainly not useful to people scanning the blue links of SERPs.


I’m skeptical

I frankly can’t believe Google would do something as radical as penalizing well-formed, well-written page development practices in favor of amateurish ones. All other things being equal (such as having great content), SEO was intended to help great pages be found, crawled, indexed, and served by search. The Google Panda algo updates have been hammering pages whose content Google deems not up to snuff. I see any such future over-SEO penalty as being related to this concept, where strong SEO cannot mitigate the inadequacy of lightweight content.

To back up my assertion, listen to the recording of the SxSW session where Cutts says his piece, rather than just reading the snippet posted about the event (it gets to this topic about one-third of the way through). The session was hosted by Danny Sullivan of Search Engine Land, with Duane Forrester of Bing also in attendance.

After Cutts made his big announcement, Sullivan asked him whether Google was “going to release an algorithm that was necessarily designed to hurt the SEO kinds of things you encourage people to do?” Forrester chimed in that he heard Cutts say, “If you’re doing SEO, you’re screwed in two months.” Cutts emphatically denied this. He said that it is “definitely not the case” that Google hates SEO. He even referred to an older Google Webmaster video stating that Google does not hate SEO (and perhaps this Web 2.0 Expo video from 2008 is relevant as well).

Cutts went on to say that SEO can be very helpful: it makes a site more crawlable and more accessible, and, because of the effort put into keyword research and planning, it can make a site more user friendly, all of which is good. He further stated that the algo change is a response to white hat SEOs complaining about the ranking success that black hat SEOs achieve. The basis for this change is similar to that of the above-the-fold, ad-heavy algo change from earlier this year (which reportedly affected only 1% of searches): will searchers who land on a page using excessive SEO tactics be unhappy with the results? He finished with the key statement, “If you keep that in mind, you should be in good shape no matter what.”

Yeah, Cutts seemed to bounce back and forth on the issue, and his comments were nebulous, but that’s pretty much all we can expect while the web spam wars rage on.


What can we conclude?

Ultimately, as Forrester stated, if SEO is used to point to relevancy, then it is good and proper (he even recently published a blog post about how Bing likes SEO). He also stressed the importance of social signals and how a social campaign can affect an associated site’s rank, whether or not the site owners are participating in the conversation themselves (which is why it is important to not only participate in, but lead, that social conversation!). Google has long had Webmaster Guidelines in place that warn against the use of malicious web spam techniques, and it has actively penalized sites found to be in violation.

The question, still to be answered, is whether this coming algo change is merely a strictly limited, tighter lockdown on familiar black hat SEO techniques, or something larger and more antagonistic to general users of traditional SEO methods. Time will tell whether there is no longer such a thing as white hat SEO as we knew it. What do you think?