Show Bots the Content to Index

There are a few ways to tell bots where the content is on your web page. With Google, you simply add comments around the content to include or exclude. To include a section, wrap it in these comments:
<!-- google_ad_section_start -->
Here's my content
<!-- google_ad_section_end -->
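These markers come from AdSense section targeting. To exclude a section instead, Google's documented variant adds (weight=ignore) to the start marker (the content below is illustrative):

<!-- google_ad_section_start(weight=ignore) -->
Here's some navigation or ad boilerplate
<!-- google_ad_section_end -->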

With hAtom you add classes (and sometimes extra markup) to identify the meaning of certain pieces of content. See hatom-examples on the Microformats Wiki for examples:

<div class="news">
 <div class="hfeed hentry story">
  <h1 class="entry-title">China confirms new bird flu outbreaks</h1>
  <span class="byline">Last Updated
  <abbr class="published" title="2005-11-23T04:26:18Z">Tue, 22 Nov 2005 23:26:18 EST</abbr>

  </span>
  <address class="vcard author"><span class="text"><a class="url" href="/news/credit.html">CBC News</a></span></address>

  <div class="text">
   <p class="entry-content">...</p>
  </div>
 </div>
 ...
</div>

Yahoo takes the opposite approach to hAtom: you add a class to any HTML elements you don't want indexed. For example:

<div class="robots-nocontent">This is a section where ads are displayed on the page. Words that show up in ads may be entirely unrelated to the page contents.</div>

For more background reading, see the ySearchBlog post Introducing Robots-Nocontent for Page Sections.

All of these methods appear to be compatible with each other: put comment markers around sections you do or don't want indexed, apply hAtom classes to elements with specific meanings, and apply the robots-nocontent class to elements you don't want indexed.
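As a sketch of how the three techniques could coexist on one page (the class names follow the examples above; the content itself is illustrative):

<!-- google_ad_section_start -->
<div class="hfeed">
 <div class="hentry">
  <h1 class="entry-title">Example story</h1>
  <p class="entry-content">The story text that should be indexed.</p>
 </div>
</div>
<!-- google_ad_section_end -->

<div class="robots-nocontent">
 Sidebar ads that should not be indexed.
</div>

The Google comments and the Yahoo class mark out the same include/exclude boundaries for their respective crawlers, while the hAtom classes sit inside the included region without interfering with either.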

