Likes and dislikes of Googlebots


What's a Googlebot? It's one of the automated web-crawling spiders I talked about in the last section. And these spiders have definite preferences, so you want to make sure your content is good spider food.

Spiders like:

  • Neat code - fewer lines of code than lines of text (in other words, more lines of text than lines of code).
  • Normal keyword densities of 3-7%.
  • Lots of backlinks - pages elsewhere that link back to your home page. (Top sites average about 300 backlinks.)
  • Original content not found anywhere else.
  • Sites that download quickly, which means not loading up your pages with dynamic URLs to other sites.
  • Site maps.
  • ALT tags for images.
  • Link partners who are contextually relevant to your page (i.e., if your page is about buying real estate, links might be about how to get loans, how to prospect for deals, how to start a corporation…but not about pet gerbils, latest fashions, or cell phones.)
  • New content every time the spider comes to check up on your site.
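The keyword-density range above (3-7%) is easy to check for yourself. Here's a minimal sketch in Python; the `keyword_density` helper and the sample page text are my own illustration, not any official Google tool:

```python
import re

def keyword_density(text, keyword):
    """Return the keyword's share of the total word count, as a percentage."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

# A short sample page: "estate" appears 3 times out of 18 words.
page = ("Buy real estate with confidence. Our real estate guides "
        "cover loans, deals, and everything real estate buyers need.")
print(round(keyword_density(page, "estate"), 1))
```

Anything well above the 7% mark is the "keyword stuffing" the spiders punish, as the next list explains.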

Spiders do not like:

  • More lines of code than text.
  • Nested tables.
  • Super-high keyword densities, which they call "keyword stuffing".
  • "Doorway pages" that act as a portal and which just happen to have super-high keyword densities.
  • Too many backlinks to your home page from within your domain.
  • Duplicate content from another site, regardless of who stole what from whom.
  • Lots of dynamic URLs that cause a site to take forever to download.
  • Repeating the exact same words in your linking text, which the spider will interpret as automated link swapping. (Interestingly, it's fine for the spiders to be fully automated, but they hate it when we do that!)
  • Stale content that never changes.


