Walking in the Search Engines' Shoes
You know the old saying about not criticizing someone until you walk in their shoes. But have you noticed how critical many SEOs and Webmasters are when it comes to the search engines, when they've never walked "in their shoes"? Spam is a perfect example.
The search engines have publicly stated that over 90% of submissions made through free add-URL pages are spam. A few years ago, a representative of the FAST search engine showed me some examples of spam: how they tracked spammers, the sheer amount of spam coming from just one spammer, and so forth. Can you imagine how much better their strategies for catching spammers are now?
It's almost impossible for us on this side of the fence to fathom the spam problem from the search engines' side of the fence.
Suffice it to say that spam is a major issue with the search engines in terms of time, resources, and ultimately money. New filters are added by the engines to combat spam, and the Web community finds new ways to cheat their way to the top. Algorithms are changed, and sites get banned. Webmasters scream about how unfair it is, and the saga continues.
Let's STOP the Nonsense!
Is it fair to complain about "injustices" such as being banned, the sandbox, or the supplemental index, if your site was banned or penalized for not following the engines' guidelines?
Google, in particular, has clearly stated that it wants content of value to the Web audience, natural link building, no spam, and other quite logical Webmaster guidelines.
Take a Walk in Google's Shoes
Your "Role": Honorary Google Editors
Let's take a walk in Google's shoes. What would it be like to be an editor of a search engine? What would it be like to have to deal with spam from their side of the fence?
How lenient should you be? At what point would your patience wear out? Would you make exceptions to the rules? Why, and when?
Remember: You've posted spam and Webmaster guidelines on the Internet, but the guidelines are continually ignored. The envelope is being pushed to its limits.
As a search engine, you want your SERPS to be relevant. If they're not relevant, you lose your market share.
We conducted some research in this area by putting on some editor shoes. It was an amazing learning experience.
One of the keyword phrases we chose was Viagra. We wanted a highly competitive keyword phrase in terms of the number of searches performed on a daily basis, the number of competing pages, and the likelihood that a large number of SEOs were actively competing for the phrase. This keyword met the criteria.
At the time of our research, there were 47,800,000 competing pages in Google, with a 24-hour potential for traffic in Google (using Wordtracker numbers) of 3811. The popularity figure for the keyword across all of the major engines was 6226 (over a 90-day period, across all the engines that Wordtracker gets its data from).
The site we chose was actually #10 out of the top 10 results, but it's no longer in that position (more details to follow).
More About the Site
The site was a one-page sales letter with content coming straight from the manufacturer. I found 572 other sites on the Web that were using the same exact sales copy.
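Google has never published exactly how it detects reused copy, but a common textbook technique for flagging near-duplicate pages is shingling: break each page into overlapping word n-grams and compare the two sets with Jaccard similarity. The sketch below is my own illustration, not Google's method; the function names and the choice of 4-word shingles are arbitrary.

```python
def shingles(text: str, k: int = 4) -> set:
    """Return the set of k-word shingles (overlapping word n-grams) of a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a: str, b: str, k: int = 4) -> float:
    """Jaccard similarity of two texts' shingle sets: |A ∩ B| / |A ∪ B|."""
    sa, sb = shingles(a, k), shingles(b, k)
    union = sa | sb
    # Two texts too short to form any shingle are treated as identical.
    return len(sa & sb) / len(union) if union else 1.0
```

A page that pastes in the same manufacturer sales letter verbatim would score at or near 1.0 against any of the other 572 copies, which is exactly the kind of signal an engine could use to discount or collapse those results.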
The keyword appeared 56 times in the visible body text. In other words, the keyword density was far above any reasonable level.
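There is no official "safe" density figure, but the arithmetic behind the claim is simple: density is the number of phrase occurrences divided by the total word count. The sketch below assumes that definition and a hypothetical page length; the 1-3% "reasonable" range is a common rule of thumb of the era, not anything the engines have confirmed.

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Rough keyword density as a percentage: phrase occurrences / total words.

    This is a simple substring count, so it is only a sketch, not a
    reproduction of any engine's actual measurement.
    """
    words = re.findall(r"[a-z']+", text.lower())
    occurrences = len(re.findall(re.escape(phrase.lower()), text.lower()))
    return occurrences / len(words) * 100 if words else 0.0

# Hypothetical example: 56 occurrences on an 800-word sales letter
# works out to 56 / 800 * 100 = 7%, well above the 1-3% rule of thumb.
```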
There were no inbound links whatsoever. None, or at least none that were recognized by Google.
So, what do we have? Keyword overuse, duplicate content, and no off-page factors.
However, the site actually read well. From a user-experience standpoint, a searcher who typed "viagra" into the search box would land on a page that sold the product. It didn't look or sound spammy in any way. It was written by the manufacturer as a sales letter.
You have to assume that some people who put "viagra" into the search box are looking to buy the product. If they clicked on this site, their query would match their expectations.
How did this Page Get in the Top Results?
It's important to note that the page slipped to #35 within a few days after we found it at #10, and within a week, it slipped out of the top 100 listings. So, its rankings didn't last.
When you optimize pages, don't you want your rankings to last, to stand the test of time? You should!
There's a reason this page landed on the first page of results, but the reason is immaterial. The trick didn't work, and by mentioning it here, I'd be opening the door for people to abuse it in an attempt to get top rankings. Again, the reason doesn't matter: it didn't work.
Looking at the Current Top 10 Results
Studying the top 10 SERPs at the time of this writing, we see Google showing a wide variety of results. The first two results are the main Viagra.com site. Very appropriate.
The next two results are from the FDA.gov site. Again, very appropriate. All of these are informational sites about the product.
Other sites are devoted to real people's experiences, other medical-related sites, risks related to the pill, etc. Some of these sites may have places where you can order the medication, but they're more informational sites.
Of course, Google AdWords offers sites where you can purchase the medication.
So, you can see how truly relevant the top 10 results are. This is a very important point to understand.
What Else did We Learn?
Spam is in the "eyes" of the editor. We found a site that actually read well and would serve up relevant results to the end user.
Yet, it was using what I consider to be "spam" strategies. It was overusing its keyword phrase, and it was using duplicate content shared by hundreds of other Web sites.
However, is this the same as using AP copy across the Net for news-related sites?
The site was of low quality, but does that make it spam?
The Webmaster guidelines don't say anything about low quality Web sites. In fact, Google's Webmaster guidelines say this:
"Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you'd feel comfortable explaining what you've done to a website that competes with you. Another useful test is to ask, 'Does this help my users? Would I do this if search engines didn't exist?'"
This Web site was simply using promotional material to promote its product. It helped the end users by selling the product they were searching for. Yes, the promotional material "overused" the keyword phrase and had been used all over the Web.
This site proves that spam is subjective, and Google editors really have their hands full. It's much harder than it looks.
Though we may think we all know what constitutes spam, it's subjective in many cases. It's not all "black and white."
Matt Cutts has discussed on his blog how innocent Web site owners often use strategies without realizing they're spam. Google takes a different approach to that type of "spammer" than to the more sophisticated spammers who are deliberately trying to trick and cheat.
So, spend some time walking in the search engine's shoes. Think about the sheer amount of spam they get, and how they have to protect their engine from it. Make sure your own sites are playing it straight, and you'll be helping us all.
By Robin Nobles
Robin Nobles conducts live SEO workshops (http://www.searchengineworkshops.com) in locations across North America. She also teaches online SEO training (http://www.onlinewebtraining.com). Sign up for SEO tips of the day at mailto:firstname.lastname@example.org.
Copyright © 2006 Web World directory