Designing Your Web Site to Maximize Traffic (Part 2)
By: Brendon Turner
Designing your site to maximize traffic from the search engines is not a difficult task but it does require you to think ahead and plan your SEO strategy carefully. If you have not yet built your web site and are still in the initial planning stages then you may have an easier time of it. If you already have an existing web site, then you may need to take the time to read up on these SEO strategies and make some changes to incorporate them into your web site.
I will discuss 13 ways in which you can improve on your existing web site or boost a brand new web site into the stratosphere of high rankings. These are not really SEO tricks but rather tried, tested and true methods that we know to work effectively. We all know that in reality there are no real SEO tricks. True success is achieved through hard work, research and implementation of a thorough and complete SEO strategy.
Without further ado here is a checklist of important items to consider and implement into your SEO strategy.
Using Style Guidelines Effectively
If you are using CSS style rules, do not embed them in your actual web page source code. You don't want search engine spiders to wade through 100 lines of unreadable styling before they reach your actual content. Instead, place your style rules in a separate CSS file and reference it with a single line of code between your <head> and </head> tags:
<link rel="stylesheet" href="replace-with-style-file-name.css">
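As a hypothetical before-and-after sketch (the file name styles.css and the rules themselves are placeholders), the idea is simply that everything inside the <style> block moves into the external file:

```html
<!-- Before: style rules embedded in the page, pushing content down -->
<head>
  <style>
    body { font-family: Arial, sans-serif; }
    h1   { color: #003366; }
    /* ...dozens more lines a spider must wade through... */
  </style>
</head>

<!-- After: one line in the head; the rules now live in styles.css -->
<head>
  <link rel="stylesheet" href="styles.css">
</head>
```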
Primary Keyword Layout
Examine your web site from a source-code point of view and ensure that your primary keywords or phrases will be spidered first. When search engine spiders read your page, they read the source code the way we read a book: left to right, top to bottom. Search engines place higher relevancy on keywords and phrases that appear closer to the top of a page. So if a large table full of graphics sits at the top of your source code, ahead of your primary keywords, you can often achieve higher rankings simply by adjusting your layout and placing a well-written, search engine optimized paragraph above that table full of graphics.
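One sketch of this idea, assuming the kind of table-based layout the paragraph describes (the company name and file names below are made up for illustration): the optimized paragraph comes first in the source, so spiders encounter it before the graphics.

```html
<body>
  <!-- Spiders read this first: a keyword-rich introductory paragraph -->
  <p>Acme Widgets offers discount blue widgets and professional widget
     repair services for homes and businesses nationwide.</p>

  <!-- The large graphics table now appears after the optimized text -->
  <table>
    <tr>
      <td><img src="banner.gif" alt="Acme Widgets"></td>
      <td><img src="logo.gif" alt="Acme Widgets logo"></td>
    </tr>
  </table>
</body>
```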
Spiderable Text Present on Each Page
Many times I have seen very pretty web sites whose chances of ranking high for any relevant keywords have been dashed by the use of graphics alone, with little or no text on the pages. It is very important to your SEO strategy that you take the time to write quality textual content for your pages. Don't write nonsensical text filled with blatant spam. Instead, take a few extra minutes and write 4-5 quality paragraphs that clearly explain the theme of your site and the particular page you're writing for. If you feel you haven't got the time, ambition or skills to write your own content, don't stress over it. There are other options. You could visit the SEO Copywriting forum at FreeSEOAdvice.com to ask for help and get some valuable tips. Or you could hire a professional team to handle the project for you. Check out eTrafficJams.com for help with your SEO strategy and copywriting needs.
Proper Use of robots.txt File
On several occasions I have performed an analysis of a client's web site only to discover that they had inadvertently blocked spider access to their web site by incorrectly formatting their robots.txt file. It is critical that you know what you're doing when you use a robots.txt file. If you are unsure of the correct syntax when modifying or creating a robots.txt file, I recommend you not use a robots.txt file at all. This may sound counterproductive, but it's better to be safe than sorry. Accidentally blocking the spiders can result in a loss of all your rankings. It would almost be like starting over again to repair the damage. For help on correctly formatting your robots.txt file, visit robotstxt.org.
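To illustrate the kind of mistake the paragraph warns about, here are two separate robots.txt files: one that inadvertently blocks every spider from the whole site, and one correctly scoped to a single directory (robotstxt.org documents the full syntax):

```
# WRONG: a lone "Disallow: /" tells every spider to stay out of
# the ENTIRE site - all rankings can be lost
User-agent: *
Disallow: /
```

```
# RIGHT: block only a private directory; everything else stays crawlable
# (the directory name /private/ is just an example)
User-agent: *
Disallow: /private/
```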
Dead Links and 404 Errors
If you are not checking for broken links on your web site, start immediately and make this a part of your SEO strategy. You can never be 100% sure of your link integrity, especially when your site has 100 or more pages. Aside from losing potential customers to a vortex of 404 errors, you risk more from a search engine perspective. When a search engine spider visits your web site and finds broken links, the impression left is that your site is not regularly maintained and updated. Not much is known about how the engines weigh this, but your site may be assigned a low crawl priority; in other words, the spiders may not visit as frequently as they visit sites with 100% link integrity. So download some link-checking software and begin a regular schedule of verifying your link structure.
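If you'd rather roll your own than download software, the core of a link checker is small. Here is a minimal sketch using only Python's standard library; extract_links and check_url are illustrative names I've made up, not part of any particular product. It pulls the href values out of a page and issues a HEAD request per URL to spot 404s.

```python
# Minimal link-checker sketch (standard library only).
from html.parser import HTMLParser
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

class LinkExtractor(HTMLParser):
    """Collects the href value of every <a> tag fed to it."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    """Return all href values found in an HTML string, in order."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

def check_url(url, timeout=10):
    """Return the HTTP status code for url, or None if unreachable.

    A 404 here means a dead link that should be fixed or removed.
    """
    try:
        req = Request(url, method="HEAD")
        with urlopen(req, timeout=timeout) as resp:
            return resp.status
    except HTTPError as e:
        return e.code          # e.g. 404 for a dead link
    except URLError:
        return None            # DNS failure, refused connection, etc.
```

Running extract_links over each page of your site and check_url over each collected link, on a regular schedule, gives you the "100% link integrity" the spiders reward.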
About The Author
Brendon Turner is a certified search engine optimization specialist and Senior SEO Consultant with eTrafficJams.com, a professional search engine optimization company that specializes in getting targeted, eager-to-buy traffic to your site! Brendon also maintains FreeSEOAdvice.com, a progressive free SEO forum advocating access to accurate and timely information related to the industry. Want some tips on how to improve your web site's optimization? Call 877-785-9977 or click to request a free search engine optimization analysis.