Big three search firms ally behind Sitemaps

Microsoft, Google and Yahoo have agreed to work together on Sitemaps, the site-indexing specification for webmasters developed by Google.
In a rare show of solidarity, the three fiercely competitive web portals announced Sitemaps.org, a site where the latest version of the spec, and instructions for adhering to it, can be found.
Sitemaps was created by Google about 18 months ago and released under a Creative Commons license. The version announced yesterday is numbered 0.9, a sign that it's not quite finished yet.
That said, the endorsement from Yahoo and Microsoft suggests adoption has already been broad, as webmasters across the internet seek to improve their Google rankings.
Sitemaps itself does not automatically give your pages a higher ranking. Rather, it tells search engine crawlers which pages you have available for indexing and how frequently they are updated.
In that regard, Sitemaps is roughly the inverse of robots.txt, the much older standard for telling search engines and other scripts to keep out of parts of your server.
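For contrast, a robots.txt file lists what crawlers should *not* fetch, while a sitemap lists what they should. A minimal sketch (the paths here are hypothetical):

```
# robots.txt, placed at the root of the server.
# Asks all crawlers to skip two directories; everything else is fair game.
User-agent: *
Disallow: /private/
Disallow: /cgi-bin/
```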
"You just publish a sitemap, and every engine is instantly able to read and use the data to more effectively index your site," Ken Moss general manager of Microsoft's Live Search wrote on his blog.
A sitemap is a straightforward XML file that you place in the root directory of your web server. Each URL you include can carry up to three optional values: the last time the page was modified ("lastmod"), how frequently it changes ("changefreq"), and its relative priority ("priority"). The "changefreq" value can be set to "daily", "weekly" or "never", for example, indicating how often you would like the search engine to crawl the page. The "priority" value is a number from 0.0 to 1.0, with higher numbers indicating more important pages.
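A minimal sketch of such a file, using the 0.9 schema namespace published at Sitemaps.org (the domain and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Front page: changes daily, highest priority on this site -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-11-16</lastmod>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
  <!-- Archive page: never changes, low priority -->
  <url>
    <loc>http://www.example.com/archive/</loc>
    <changefreq>never</changefreq>
    <priority>0.3</priority>
  </url>
</urlset>
```

Only the "loc" element is required; the other three values are hints that each crawler is free to weigh as it sees fit.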
The system is not designed for hacking the search engine, however. The protocol is a one-way channel. You tell the crawler what you would like it to do, but each individual search engine is free to disregard that instruction.
A "daily" updated page that is never updated may not get crawled daily for long, for example. And the "priority" value is relative to only pages on your domain, not to pages at other sites. While the spec calls for search engines to enable SiteMaps to be submitted with a simple HTTP request, each firm is also rolling out sites to streamline the process.
Yahoo has already added Sitemaps support to its search engine; sitemaps can be submitted on the same page normally used for adding RSS feeds. Microsoft said it is still testing the spec internally.

