2/07/2006 09:14:00 AM
Posted by Andrey Stroilov, Google Engineering
The new features released yesterday include a list of common words in your site's content and in external links to your site. In some cases, these common words may not match what you expect from your current content. The common words are calculated based on the results of the Googlebot crawler. This can affect the data in a number of ways:
- Googlebot hasn't crawled all pages on your site. If words on particular pages are missing, make sure that those pages are being successfully crawled. If those pages are not yet in your sitemap, adding them will help guide Googlebot to those portions of your site. Also, make sure that you link to the pages of your site from within your site (for instance, by using an HTML site map).
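If some pages are missing from your sitemap, each one needs its own entry in the Sitemap file. A minimal sketch of such a file listing two pages (the www.example.com URLs and the exact dates are placeholders, and the 0.84 schema namespace is the one in use at the time of writing):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
  <!-- One <url> entry per page you want Googlebot to discover. -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-02-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/products.html</loc>
    <lastmod>2006-01-15</lastmod>
  </url>
</urlset>
```

Only `<loc>` is required for each entry; `<lastmod>` is optional but helps indicate which pages have changed.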
- Your site has changed since we last crawled it. If you have just redesigned your site, made significant content changes to an existing site, or purchased an existing domain and changed the contents, the data will not be updated until Googlebot has successfully crawled the new or changed pages.
- Googlebot is unable to re-crawl modified pages. Googlebot may be unable to access your server due to a network error, or may encounter a server error when trying to load your pages. Make sure that your server is responding properly to incoming requests.
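One quick way to confirm that your server responds properly is to request a page and check the HTTP status code yourself. A minimal sketch in Python (the function name and the throwaway local server used for the demo are illustrative, not part of any Google tool):

```python
import http.server
import threading
import urllib.error
import urllib.request


def fetch_status(url):
    """Return the HTTP status code for url, or the error code on an HTTP error."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        # The server answered, but with an error status (e.g. 404, 500).
        return e.code


# Demo against a throwaway local server so the sketch is self-contained;
# in practice you would point fetch_status at your own site's URL.
server = http.server.HTTPServer(
    ("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler
)
threading.Thread(target=server.serve_forever, daemon=True).start()

status = fetch_status("http://127.0.0.1:%d/" % server.server_address[1])
server.shutdown()
```

A status of 200 means the page loaded successfully; 5xx codes indicate the kind of server error that would prevent Googlebot from re-crawling the page.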
- Your site is not being crawled. New sites may take some time to be fully crawled. Read through our information for webmasters for more information about our crawling processes. Also, make sure your site doesn't violate the Webmaster Guidelines.
If you are seeing unexpected data and none of these apply to you, please let us know by posting in our Google Group. We always appreciate user feedback and are working to improve Google Sitemaps.