Search Engines and the Notorious Nofollow Debacle
About The Author: Maura Stouffer is Co-Founder and President, Marketing & Operations for SEO Engine. SEO Engine is the first Search Engine to publicly reveal its scoring engine via a navigable and transparent user interface. Ms. Stouffer is a graduate of Cornell University and brings a unique perspective and personal touch to the high-tech company. Her career began in Silicon Valley during the dot-com era, and her diverse experience in marketing, design, management, and business ranges from entrepreneurial endeavors to design firms and technology start-ups. For more information about SEO Engine visit www.seoengine.com or email firstname.lastname@example.org.
“The future of SEO will be defined as a battle of robots vs. robots, and the industry must continue to adopt, support, and develop advanced software and tools that enable humans to see precisely how Search Engines view the world.”
One of the most interesting things about the Search Engine Optimization industry is the constant cat-and-mouse chase between Search Engines and their dependents – SEO companies, Web professionals, and Web-based businesses. There exist two worlds, reluctantly co-dependent and in constant tension with one another. The first is filled with autonomous robots (backed by multi-billion-dollar companies and developed by top engineers and scientists) that crunch massive numbers of calculations per second and decipher the Internet through the eyes of 1s and 0s. The other world, for the most part, is populated by non-engineers and non-scientists (though this is quickly changing), who are constantly chasing the robots in an effort to make their living. In this article, I will detail my interpretation of one of the latest debates – whether or not to use the “nofollow” attribute on Links – a debate that is causing a lot of confusion within the SEO industry, and I will discuss how it is viewed from a Search Engine’s perspective.
For years now, Google and other major Search Engines have been attempting to level the playing field between humans and robots. In the early days of SEO, “black-hat” techniques were used to cloak Webpages – showing one version to humans and another to robots. Link Farms were deployed in an attempt to artificially inflate the perceived popularity of certain Webpages. For the better part of the early 21st Century, Search Engines continually improved their algorithms to penalize the Webpages and Websites that attempted to hijack the Search Engine Results Pages (SERPs). The Search Engines’ goal was to eliminate artificial techniques so that the ‘online world’ closely represented the ‘real world’.
In 2005, Google announced, via the Official Google Blog, that it would recognize a new attribute value on Links, called “nofollow”, in an attempt to combat spam on social media Websites. According to the blog post, Yahoo! and MSN followed suit. Placing a rel=“nofollow” attribute value on a hyperlink, they said, would prevent “credit” (PageRank) from flowing to the Webpage being linked to, preserving that value for the other Links on the linking Webpage. (PageRank is Google’s trademarked and patented public Webpage ranking indicator and algorithm.) The idea was that comments left on blogs, for example, could not be “vouched for”, and thus a method was created to flag spam and untrustworthy information for the Search Engines. Other implementations adopted by the SEO industry included preserving Link Flow and preventing Link Loss within a Website containing “Paid Links” or repetitive non-editorial Links such as footers, nav bars, etc.
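In markup terms, the change is a single attribute on the anchor tag (the URL here is purely illustrative):

```html
<!-- An ordinary, followed Link: crawlers may pass "credit" to the target -->
<a href="http://example.com/some-page">Some Page</a>

<!-- The same Link with rel="nofollow": the author does not vouch for the target -->
<a href="http://example.com/some-page" rel="nofollow">Some Page</a>
```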
As with all things in nature (especially within this dynamic ecosystem), every action is followed by an equal and opposite reaction. In escalating their attack on spammers and “black-hat” techniques, Search Engines had inadvertently created a whole new industry within SEO, called “PageRank Sculpting”. They had created exactly what they were fighting against – the ability to artificially change what the robots were seeing versus what the humans were using. With “nofollow”, PageRank could be strategically manipulated to flow to the specific Webpages that needed it most, even though the underlying Link architecture told an entirely different story.
After years of “nofollow” misuse, Google and other Search Engines apparently rethought their original strategy, and quietly implemented changes to their internal algorithms to stop the abuse. The newer algorithms treat “nofollow” as a way to decrease Webpage penalties, not as a way to sculpt PageRank. Ranking value is no longer redistributed to the other Links on the Webpage; instead, it simply “evaporates”. Links tagged with a “nofollow” attribute become “Dangling Links” – essentially a PageRank “black hole”. This solved a Search Engine’s business requirements very nicely. Putting “nofollow” on an editorial Link would be discouraged, because why not distribute the Link Flow that would otherwise be lost anyway? And putting “nofollow” on a non-editorial Link would be encouraged, since non-editorial Links were already penalized, with very little Link Flow value passing through. In addition, “nofollow” helped eliminate some of the penalties associated with too many non-editorial External Outgoing Links.
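The difference between the two eras can be sketched with simple arithmetic. This is a toy model, not Google's actual algorithm – the function name, policy labels, and URLs are all hypothetical – but it captures the shift: under the old behavior, nofollowed Links' shares were redistributed to the followed Links (the sculpting loophole); under the newer behavior, every Link still consumes an equal share, and the nofollowed shares simply evaporate.

```python
def link_flow(page_rank, links, nofollow_policy="evaporate"):
    """Toy model of how a page's PageRank is split across its outgoing Links.

    links: list of (url, is_nofollow) tuples.
    nofollow_policy:
      "redistribute" -- the old behavior exploited by PageRank Sculpting:
                        nofollowed Links consume no share, so their value
                        flows to the followed Links instead.
      "evaporate"    -- the later behavior: every Link consumes an equal
                        share, but nofollowed shares vanish (Dangling Links
                        as a PageRank "black hole").
    Returns a dict of {url: flow} for followed Links only.
    """
    n = len(links)
    if n == 0:
        return {}
    followed = [url for url, is_nofollow in links if not is_nofollow]
    if nofollow_policy == "redistribute":
        share = page_rank / len(followed) if followed else 0.0
    else:  # "evaporate"
        share = page_rank / n
    return {url: share for url in followed}

# A page with 1.0 unit of rank and four Links, two of them nofollowed:
links = [("/products", False), ("/spammy-comment", True),
         ("/about", False), ("/terms", True)]

old = link_flow(1.0, links, "redistribute")  # sculpting era: 0.5 per followed Link
new = link_flow(1.0, links, "evaporate")     # 0.25 per followed Link; 0.5 is lost
```

Under "evaporate", nofollow no longer concentrates value on chosen Links; it only reduces the total flowing out of the page, which is exactly why sculpting with it stopped working.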
And so yet again, Search Engines ruled the universe. Another loophole was closed, and another strategy flattened. But not to fear – “PageRank Sculpting” goes far beyond “black-hat” techniques. The new rule on “nofollow” simply means that Link Flow needs to be sculpted as part of a Website’s natural Link architecture. The companies that stand to benefit most from these changes are those who earn their living showing humans how a Search Engine views a Website's Link architecture, and how to modify it by naturally adding and removing Links within each Webpage to achieve the overall goal. “PageRank Sculpting” will now entail a more organic technique. Every day it becomes more difficult for humans to compete with the increasing intelligence of Search Engines. The future of SEO will be defined as a battle of robots vs. robots, and the industry must continue to adopt, support, and develop advanced software and tools that enable humans to see precisely how Search Engines view the world.