Predicting Search Engine Algorithm Changes
With moderate search engine optimization knowledge, some common sense, and a resourceful, imaginative mind, you can keep a web site in good standing with search engines even through the most significant algorithm changes.

The recent Google update of October/November 2005, dubbed “Jagger”, is what inspired me to write this, as I saw some web sites that previously ranked in the top 20 results for extremely competitive keywords suddenly drop to the 70th page. Yes, the ebb and flow of search engine rankings is nothing to write home about, but when a web site fails to regain most of its lost positions after such a drop, it suggests that the SEO done on the site had some long-term flaws. In this case, the SEO team did not do a good job of predicting the direction the search engine would take with its algorithm.
Impossible to predict, you say? Not quite. The ideas behind Google’s algorithm come from the minds of fellow humans, not supercomputers. I’m not suggesting that it’s easy to “crack the code,” so to speak, because the actual math behind it is extremely complicated. However, it is possible to understand the general direction a search engine algorithm will take by keeping one principle in mind: any component of SEO that can be manipulated to an abnormal extent will eventually be weighted less, and finally rendered obsolete.
One of the first such areas of a web site that started to get abused by webmasters trying to raise their rankings was the keywords meta tag. The tag allows a webmaster to list the web site’s most important keywords so the search engine knows when to display that site as a result for a matching search. It was only a matter of time until people started stuffing the tag with irrelevant words that were searched for more frequently than relevant words in an attempt to fool the algorithm. And they did fool it, but not for long.
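To make this concrete, here is what the keywords meta tag looks like in a page’s HTML head, first used legitimately and then “stuffed.” The sites and keyword lists are hypothetical examples, not taken from any real page:

```html
<head>
  <!-- Legitimate use: a short list of terms that actually describe the page -->
  <meta name="keywords" content="search engine optimization, SEO, algorithm updates">

  <!-- Keyword stuffing: popular but irrelevant terms added to catch
       unrelated, high-volume searches -->
  <meta name="keywords" content="SEO, free music downloads, celebrity photos,
    lottery numbers, cheap flights, video games">
</head>
```

Because the tag is invisible to visitors and trivially easy to fill with anything, it was a natural first target for abuse, which is exactly why, following the principle above, search engines eventually stopped trusting it.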
More By Jase Dow