What to Do if Your Site Has a Penalty or Ban - Keep your Robot in Check
If your site has fallen in the rankings or disappeared entirely, your first action should be to check your robots.txt file. If you have recently renovated your website, chances are you forgot to update this file, and it could be blocking search engine spiders from crawling certain sections of your site. For instance, while you were testing your pages, you may have specified in robots.txt that spiders shouldn't crawl your beta pages; it wouldn't do to have a visitor run a search and land on an unfinished page. That could turn them off your site forever.
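As a quick sanity check, you can test your robots.txt rules programmatically with Python's standard urllib.robotparser module. The rules and paths below are hypothetical, standing in for a file where an old "beta" rule was copied over a directory that has since gone live:

```python
from urllib import robotparser

# Hypothetical leftover robots.txt from a redesign: the old testing
# rules now block a directory (/products/) that is live.
leftover_rules = """\
User-agent: *
Disallow: /beta/
Disallow: /products/
"""

parser = robotparser.RobotFileParser()
parser.parse(leftover_rules.splitlines())

# The wildcard rule shuts every spider, including Googlebot,
# out of the live /products/ section.
print(parser.can_fetch("Googlebot", "https://example.com/products/widget.html"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/about.html"))            # True
```

Pointing the parser at your real file (via `parser.set_url(...)` and `parser.read()`) and testing your most important URLs takes only a few minutes and catches this class of mistake before the rankings do.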
Redesigning a site can be hectic work, and it is easy to forget to reconfigure your robots.txt rules, so be sure to check there first. Additionally, you may wish to check your robot exclusion meta tags as well.
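The meta tag equivalent is just as easy to leave behind after a redesign. A leftover tag from the testing phase might look something like this (the values shown are only an example):

```html
<!-- Leftover from testing: if this ships to production, search
     engines will drop the page from their index and ignore its links. -->
<meta name="robots" content="noindex, nofollow">
```

Search for stray "noindex" and "nofollow" values across your templates; a single shared header file can quietly de-index an entire site.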
While we are on the subject of website renovations, keep in mind that if you include newer technologies on your website, the search engine spiders might not be able to crawl them. A good example would be if you required cookies on your website. Those spiders don't like cookies; they do, after all, have to watch their figures.
Another issue that can cause spiders to ignore your website is if you require a user to log in. Obviously, and thankfully, a spider cannot log into your website.
Finally, other issues can cause spiders to ignore pages on your site. These include non-SEO-friendly URL structures, such as those created by certain content management systems, and navigation schemes that cannot be crawled (e.g., image-based navigation with no alternate text, or form-based navigation). Also, make sure your site contains no hidden elements, such as text or hyperlinks rendered in the same color as the background.
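To make the last two points concrete, here is a hypothetical sketch of the fix for image-based navigation, followed by the kind of hidden text to avoid:

```html
<!-- Crawlable image navigation: the alt text gives spiders (and
     screen readers) something to follow. -->
<a href="/products/"><img src="nav-products.png" alt="Products"></a>

<!-- What NOT to do: white text on a white background is invisible to
     visitors but visible to spiders, and can earn a penalty. -->
<p style="color: #ffffff; background-color: #ffffff;">keyword keyword keyword</p>
```

Even if hidden text was left over innocently from an old design rather than planted deliberately, search engines cannot tell the difference, so it is worth hunting it down.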
More By James Payne