HTML and CSS Reference
The XML sitemaps protocol allows a webmaster to inform search engines
about website resources that are available for crawling. The sitemap.xml file
lists the URLs for a site with additional information about each URL: when
it was last updated, how often it changes, and its relative priority in relation
to other URLs on the site. Sitemaps are an inclusionary complement to the
exclusionary robots.txt protocol that helps search engines crawl the Web more
intelligently. The major search engine companies—Google, Bing, Ask.com,
and Yahoo!—all support the sitemaps protocol.
Sitemaps are particularly beneficial on websites where some areas of the
website are not available through the browsable interface, or where rich AJAX,
Silverlight, or Flash content, not normally processed by search engines, is featured.
Sitemaps do not replace the existing crawl-based mechanisms that search
engines already use to discover URLs. Using the protocol does not guarantee
that web pages will be included in search engine indexes or be ranked better in
search results than they otherwise would have been.
The content of a sitemap file for a website consisting of a single home page
looks something like this:

<?xml version='1.0' encoding='UTF-8'?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/</loc>
    <lastmod>2006-11-18</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
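A sitemap like this can also be generated programmatically. The following is a minimal sketch using only the Python standard library; the URL and the lastmod, changefreq, and priority values are the placeholders from the example above, not values from any real site.

```python
# Sketch: build a one-URL sitemap with xml.etree.ElementTree.
# The namespace is the one required by the sitemaps protocol.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)  # serialize as the default namespace

urlset = ET.Element(f"{{{NS}}}urlset")
url = ET.SubElement(urlset, f"{{{NS}}}url")
ET.SubElement(url, f"{{{NS}}}loc").text = "http://example.com/"
ET.SubElement(url, f"{{{NS}}}lastmod").text = "2006-11-18"
ET.SubElement(url, f"{{{NS}}}changefreq").text = "daily"
ET.SubElement(url, f"{{{NS}}}priority").text = "0.8"

# Serialize with an XML declaration, ready to save as sitemap.xml.
xml_bytes = ET.tostring(urlset, encoding="UTF-8", xml_declaration=True)
print(xml_bytes.decode("UTF-8"))
```

For a site with many pages, the same loop of `SubElement` calls would simply run once per URL inside the single `urlset` element.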
In addition to the file sitemap.xml, websites can provide a compressed ver-
sion of the sitemap file for faster processing. A compressed sitemap file will
have the name sitemap.xml.gz or sitemap.gz. There are easy-to-use online
utilities for creating XML sitemaps. After a sitemap is created and installed on
your site, you notify the search engines that the file exists, and you can request
a new scan of your website.
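Producing the compressed sitemap.xml.gz variant is a one-step operation with standard tools. Below is a small sketch using the Python standard library; it writes a stub sitemap.xml first only so the snippet is self-contained, and that stub content is a placeholder, not a real sitemap.

```python
# Sketch: compress sitemap.xml into sitemap.xml.gz.
import gzip
import shutil
from pathlib import Path

# Stub file so the example runs on its own; on a real site,
# sitemap.xml would already exist in the document root.
Path("sitemap.xml").write_text(
    "<?xml version='1.0' encoding='UTF-8'?>\n"
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"/>\n',
    encoding="UTF-8",
)

# Stream the file through gzip, producing sitemap.xml.gz alongside it.
with open("sitemap.xml", "rb") as src, gzip.open("sitemap.xml.gz", "wb") as dst:
    shutil.copyfileobj(src, dst)
```

Both files can then be uploaded to the site root; search engines that fetch the .gz version decompress it transparently.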