Google sitemap - weighing benefits

Looking at creating a sitemap under Webmaster Tools. Will this help a new site get recognized and indexed by Google?

I’m asking this due to the time it takes to create one for a number of sites.

Just trying to weigh the benefits. Is it most helpful for dynamic sites rather than static ones, and will it make updated pages get indexed and ranked faster?

Yes - this is its primary purpose - to aid discovery and indexing of new/existing/updated content on the web.

No - URLs in a sitemap have no influence over how well that page ranks in the results.

Yes, creating a sitemap is one of the first things to do once a website is online. It is a road map for Google to visit your website, and because of it Google can index your website as soon as possible.

Yes, I would say that sitemaps are very important. You can read all about them on the Google site: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=156184

If you want your site indexed and ranked faster, I suggest reading the Google Webmaster Guidelines: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=35769

Following these guidelines will help Google find, index, and rank your site.

Then you’d reference the sitemap(s) within htaccess at the same time? I’ve wondered how much this technique helps.

That makes three parts to the sitemap setup:

a) the internally linked sitemap(s)
b) the sitemap submitted to Google
c) sitemap(s) within htaccess

In the past, I’ve relied on the internally linked sitemap(s).

Question: for sites like job boards/forums/blogs, etc., where timeliness is more important, does Google not only index the updated and new pages faster with a submitted map, but also give them some priority in ranking?

Are you mistaking htaccess for robots.txt? I haven’t seen any reference to htaccess in this thread. You can, however, refer to your XML sitemap in your robots.txt file. Since this is the first file a crawler visits to see whether it has permission to proceed with discovering, crawling, and indexing your site, pointing out where your XML sitemap resides is a really good way to ensure it gets included.
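For example, a minimal robots.txt along those lines might look like this (the domain and filename are just placeholders):

# Allow all crawlers and point them at the XML sitemap
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml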

You may see the QDF (Query Deserves Freshness) algorithm come into play on forums and blogs due to the time-sensitive nature of the content. This, however, usually fades after a while.

I’ve found that creating a sitemap with a sitemap generator, coupled with a good .htaccess that redirects everything to the www address of the domain (including any other domains for the same website) and a robots.txt, then submitting it all to Google via the Google Webmaster Tools application and registering with Google Analytics, is the best route to efficient indexing of a site. It shouldn’t take more than 15 minutes per site once it becomes part of a build routine.
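The .htaccess part of that routine is just a canonical-host redirect; a rough sketch, assuming Apache with mod_rewrite enabled and a placeholder example.com domain:

# Send any non-www request to the canonical www hostname (301 = permanent redirect)
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]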

Are you mistaking htaccess for robots.txt? I haven’t seen any reference to htaccess in this thread. You can, however, refer to your XML sitemap in your robots.txt file.
I added it since it has to do with sitemaps/indexing.

http://en.wikipedia.org/wiki/Robots_exclusion_standard - see the “Sitemap” heading on this page. This shows XML maps - but can you refer to multiple internal HTML sitemaps similarly?

I still can’t find any reference to .htaccess :shifty:

And while you might get some success with referring to HTML sitemaps, it’s intended for XML ones, since those contain the required directives.

seriocomic,

apologies, it is robots.txt, I did mistake it. Am I ever off-track.

Okay then - I’m still confused. Why does Wikipedia show this directive pointing to Google:

Sitemap…hostednews/sitemap_index.xml (google tag)

Wouldn’t it be more something like ‘mysitepath.xml’ (my site’s XML tag)?

IMHO sitemap.xml files are all benefit. The time it takes to create one is insignificant when you use a sitemap generator script to make it.

The time it takes to create one is insignificant when you use a sitemap generator script to make it.
Where might these precious things be found?

LMGTFY - http://lmgtfy.com/?q=xml+sitemap+generator

datadriven, from the robots exclusion standard, what I’ve gathered is that you can indeed use multiple Sitemap: references. It’s worth pointing out, though, that like Allow:, Sitemap: is a proprietary extension and isn’t technically part of the original standard (so support may be spotty); the main search engines all support it, so you’ll cover the big names - it’s just that you shouldn’t rely on it alone. Google, as far as I’m aware, looks by default for a sitemap.xml in the base directory (like robots.txt), so offering it as such would probably gain the best recognition. If you do want multiple sitemaps, why not just follow the sitemaps specification and use its inclusion mechanism for multiple sitemap documents (via XML references - see “Using Sitemap index files (to group multiple sitemap files)”)?

http://www.sitemaps.org/protocol.php
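For reference, a sitemap index that groups multiple sitemap files looks roughly like this, following the sitemaps.org protocol (filenames and dates are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each <sitemap> entry points at one child sitemap file -->
  <sitemap>
    <loc>http://www.example.com/sitemap-pages.xml</loc>
    <lastmod>2011-01-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-posts.xml</loc>
    <lastmod>2011-01-15</lastmod>
  </sitemap>
</sitemapindex>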

Good point. Just because you must have a siteindex.xml file if you have over 50,000 URLs
sitemap.xsd

.....
<xsd:element name="urlset">
  <xsd:annotation>
    <xsd:documentation>
      Container for a set of up to 50,000 document elements.
      This is the root element of the XML file.
    </xsd:documentation>
  </xsd:annotation>
  <xsd:complexType>
    <xsd:sequence>
      <xsd:element ref="url" maxOccurs="unbounded"/>
    </xsd:sequence>
  </xsd:complexType>
</xsd:element>
.....

doesn’t mean you can’t use one with fewer.
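And for anyone starting small, a urlset that fits well under that 50,000 cap only needs a handful of url entries - a rough sketch (URLs and values are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Only <loc> is required; changefreq and priority are optional hints -->
  <url>
    <loc>http://www.example.com/</loc>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
    <changefreq>monthly</changefreq>
  </url>
</urlset>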