How does your site look to search engine bots?
Search engines rely on automated software agents called spiders, crawlers, robots, or bots. These bots are information seekers on the Internet.
For search engines to index information, they need these bots to visit websites. The bots navigate a site, figure out what it is about, and then add the data to the engine's index. Search engine bots can follow links, gather information, and report back to the engines. If a bot does its job properly, the search engine builds a good, valuable database, or index, and delivers relevant results to a visitor's query.
Many variables factor into a successful website. One thing we all need to concentrate on is making sure the search engine bots find what they are looking for when they visit our sites. Bots look for two major things when they spider a site: text and links.
Giving the bots what they are looking for will truly increase a website's success in the long run.
If you want to see what the bots see, try this cool program called Spider Viewer:
http://www.webuildpages.com/seo-tools/spider-test/
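As a rough illustration (not the Spider Viewer tool itself), here is a minimal Python sketch of the idea: reduce a page to the two things bots care about, text and links, while ignoring markup, scripts, and styles. The `SpiderView` class and the sample HTML are hypothetical, and real crawlers are far more sophisticated.

```python
from html.parser import HTMLParser

class SpiderView(HTMLParser):
    """Collect the visible text and followable links from an HTML page,
    roughly approximating what a search engine bot 'sees'."""

    def __init__(self):
        super().__init__()
        self.text = []
        self.links = []
        self._skip = 0  # depth inside <script>/<style> blocks

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1  # bots ignore code and styling
        elif tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)  # a link the bot could follow

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        # Keep only visible, non-empty text outside script/style blocks.
        if not self._skip and data.strip():
            self.text.append(data.strip())

# Hypothetical sample page for demonstration.
html = """
<html><head><title>Demo</title><style>p { color: red }</style></head>
<body><p>Welcome to our <a href="/products.html">product pages</a>.</p>
<script>trackVisit();</script></body></html>
"""

viewer = SpiderView()
viewer.feed(html)
print(viewer.text)   # the visible text a bot would index
print(viewer.links)  # the links a bot would follow
```

Note how the CSS rule and the JavaScript call never appear in the output: as far as the bot is concerned, the page is only its text and its links.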