Search Engine Cloaking
Tired of the search engine optimization game? Lots of webmasters are. Today the Internet is more a big shop than the information library it first became popular as, which of course means there are hundreds, if not thousands, of sites competing for the same customers.
Search engines play a very big part in whether company A or company B gets a visitor and potential customer. Webmasters and Internet marketers know this, so competition for search engine traffic is fierce. These days it's almost impossible to keep up with the search engines: one day your site could be near the top, and the next day your competition could be there while you're gone completely.
However, one particular method is being used by webmasters to make their sites rank high and stay high. The method is highly controversial and risky: it's called search engine cloaking.
So what is search engine cloaking?
Search engine cloaking is a technique webmasters use to gain an advantage over other websites. It works on the idea that one page is delivered to the various search engine spiders and robots, while the real page is delivered to real people. In other words, browsers such as Netscape and MSIE are served one page, while spiders visiting the same address are served a different page.
The page the spider sees is a bare-bones HTML page optimized for the search engines. It won't look pretty, but it will be configured exactly the way the search engines want it to be in order to rank high. These 'ghost pages' are never actually seen by any real person, except of course the webmasters who created them.
When real people visit a site that uses cloaking, the cloaking technology (which is usually based on Perl/CGI) sends them the real page, which looks good and is just a regular HTML page.
The cloaking technology can tell the difference between a human and a spider because it knows the spiders' IP addresses, and no two IP addresses are the same. When a visitor requests a page from a site that uses cloaking, the script compares the visitor's IP address against its list of search engine IPs. If there's a match, the script knows a search engine is visiting and sends out the bare-bones HTML page set up for nothing but high rankings.
So once a list of all the search engine spiders' IP addresses has been compiled, it's simply a case of writing a script that says something like:
If requesting IP = Google spider IP, then show googlepage.html
If requesting IP = unknown (other user), then show index.html
So when the Google spider comes to visit the site, it is shown a page that is optimized with keywords, meta tags and optimized content. Because this page is never seen by a casual user, design is not an important issue. When a regular user comes to the site, the server performs the same check, finds that the IP address does not match any in its list, and shows the standard page.
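The scripts that did this were usually written in Perl/CGI, as noted above, but the idea can be sketched in a few lines of Python. The spider addresses and file names below are made up purely for illustration; a real script would maintain an up-to-date list of known search engine IPs.

```python
import os

# Hypothetical spider addresses (illustrative only, not real
# search engine IPs).
SPIDER_IPS = {"192.0.2.10", "192.0.2.11"}

def page_for(visitor_ip):
    """Return the optimized 'ghost' page for known spiders and
    the regular page for everyone else."""
    if visitor_ip in SPIDER_IPS:
        return "googlepage.html"
    return "index.html"

# Under CGI, the web server puts the visitor's address in the
# REMOTE_ADDR environment variable.
visitor_ip = os.environ.get("REMOTE_ADDR", "203.0.113.5")
chosen_page = page_for(visitor_ip)
```

The whole trick rests on that one dictionary lookup: an address on the list gets the ghost page, and every other address gets the page meant for human eyes.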
Cloaking is also a great way of protecting the source code that's earning you high search engine rankings. Ever read a search engine ranking tutorial that recommends modeling your keyword density, layout and so on on pages that already rank high?
Well, technically that's stealing, and your competition might want to do it to you some day. With cloaking, however, you can protect your code: when your competition visits, they will be sent the regular page and not the page that's earning you those precious rankings.
Different types of cloaking
There are two types of cloaking. The first is called User Agent cloaking and the second is called IP-based cloaking (discussed above). IP-based cloaking is the better method, as IP addresses are very hard to fake, so your competition won't be able to pretend to be one of the search engines in order to steal your code.
User Agent cloaking is similar to IP cloaking, except that the cloaking script compares the User Agent string sent with each page request against its list of search engine names, and then serves the appropriate page.
The problem with User Agent cloaking is that agent names can easily be faked. Imagine Google introducing a new anti-spam method to beat cloakers: all it needs to do is change its User Agent string and pretend to be a normal person using Internet Explorer or Netscape. The cloaking software will then show Google's bot the non-optimized page, and your search engine rankings will suffer.
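The same check, keyed on the User Agent string rather than the IP address, might look like the following Python sketch. The spider name fragments and file names are assumptions for illustration, not an actual spider list.

```python
# Hypothetical fragments of search engine spider names to look
# for in the User Agent string (illustrative only).
SPIDER_NAMES = ("googlebot", "slurp", "scooter")

def page_for_agent(user_agent):
    """Return the optimized page when the User Agent looks like
    a known spider, otherwise the regular page."""
    ua = user_agent.lower()
    if any(name in ua for name in SPIDER_NAMES):
        return "optimized.html"
    return "index.html"
```

Note how trivially this is defeated: a spider that simply reports itself as an ordinary browser string falls straight through to the regular page, which is exactly the weakness described above.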
User Agent cloaking is much riskier than IP-based cloaking, and it's not recommended.
Search engine cloaking is not as effective as it used to be. This is because the search engines are becoming increasingly aware of the different cloaking techniques being used by webmasters, and they are gradually introducing more sophisticated technology to combat them. That said, cloaking can still benefit your search engine rankings; it's just a matter of being careful.
I would recommend you read the article entitled SE Optimization and try regular search engine optimization first. If, after a few months, you are still not seeing good results, then you could consider using cloaking technology.
Article by David Callan - email@example.com
David is the webmaster of http://www.akamarketing.com.
Visit his site for free internet marketing articles, advice, ebooks, news and lots more.