I was chatting with a friend, who was telling me about an ex-client of his.

The client had had his site redesigned by a family member, and though he had done a lot of link building, the website wasn’t ranking anywhere. The site wasn’t even listed in Google. Was it banned? On inspection, John Zumwalt found that the designer had put a robots.txt file on the site, blocking all robots. A simple mistake, but difficult to spot if you don’t know what you’re looking for. Sometimes professional help is what’s needed.

If your site is not being crawled by Google or other search engines, here’s a simple checklist to follow:

Robots.txt may exclude spiders:

Check to see if you have a file called robots.txt. Crawlers only obey the copy in the root of your domain (e.g. example.com/robots.txt), so that’s where to look. Either remove the robots.txt or make sure it conforms to the robots.txt standard and isn’t blocking pages you want crawled.
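To see how a single line can lock out every crawler, here’s a small sketch using Python’s standard-library robots.txt parser. The rules and URLs are placeholders; the first rule set is the kind of “block all robots” file the designer in the story left behind, and the second is a wide-open file:

```python
from urllib import robotparser

# A robots.txt that blocks ALL robots from the ENTIRE site --
# the mistake described above.
BLOCKING_RULES = """\
User-agent: *
Disallow: /
"""

# A robots.txt that allows everything (an empty Disallow blocks nothing).
OPEN_RULES = """\
User-agent: *
Disallow:
"""

blocked = robotparser.RobotFileParser()
blocked.parse(BLOCKING_RULES.splitlines())

allowed = robotparser.RobotFileParser()
allowed.parse(OPEN_RULES.splitlines())

# Googlebot can fetch nothing under the blocking rules,
# and anything under the open rules.
print(blocked.can_fetch("Googlebot", "https://example.com/index.html"))  # False
print(allowed.can_fetch("Googlebot", "https://example.com/index.html"))  # True
```

The difference between the two files is a single `/` after `Disallow:` — which is exactly why the problem is so easy to miss.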

No inbound links: 

Search engines crawl the web, following links from page to page. If no page already in Google’s index links to your site, Google is much less likely to find it. Submit your site to a directory, ask a friend for a link, or beg, borrow, or buy one. It pays to get links from reliable sources, as opposed to link farms, which Google may discount.

The site may have technical issues: 

The server may be set up incorrectly, your site may contain code that makes crawling difficult, etc. Luckily, Google offers a reporting tool in the form of Webmaster Central. Use Sitemaps and the Site Status Wizard to help determine potential problems.
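One concrete way to help Google crawl you is to submit a Sitemap through Webmaster Central. As a sketch (the example.com URLs are placeholders), here’s how you might generate a minimal Sitemap file with Python’s standard library:

```python
from xml.etree import ElementTree as ET

# Sitemap protocol namespace (sitemaps.org).
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

# Placeholder page list -- a real sitemap enumerates your own URLs.
pages = [
    "https://www.example.com/",
    "https://www.example.com/about.html",
    "https://www.example.com/contact.html",
]

urlset = ET.Element("urlset", xmlns=NS)
for page in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page

# Write sitemap.xml with an XML declaration, ready to upload to your root.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```

Upload the resulting sitemap.xml to your site’s root and submit it in Webmaster Central, which will then report any URLs it couldn’t fetch.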

No deep crawl:

Google crawls the site but doesn’t find many pages. Check your linking structure to ensure that important pages are well linked. You may wish to use a pyramid site structure to help organize your site thematically. Remove, or alter, duplicate content. Increase the quality of inbound linking, and avoid poor-quality outbound linking. See Matt Cutts’ comments roughly three-quarters of the way down.

Flash, Scripting:

Google can have problems following animated and scripted links. If you use Flash, it is safest to provide an all-HTML version of your site. Google is getting a lot better at following scripted links; still, check with Webmaster Central if problems persist.

Site Ban:

It’s unlikely, but possible, that your site has been banned. Check with Webmaster Central, and if a ban is in place, try submitting a re-inclusion request. Here’s the definitive guide on submitting a re-inclusion request, straight from the horse’s mouth. Essentially, Google wants to know that the problem has been corrected and won’t happen again.