Google XML Sitemap Generator Automation Stopped Working!

I noticed while scanning Webmaster Tools that my sitemap continually fails unless it is rebuilt manually, in which case it works fine.

After that, as soon as any update is made to a post and the automatic rebuild begins, it fails again with this message:

[B]The last run didn’t finish! Maybe you can raise the memory or time limit for PHP scripts. Learn more

The last known memory usage of the script was 111.75MB, the limit of your server is 256M.

The last known execution time of the script was 35.02 seconds, the limit of your server is 600 seconds.[/B]

The memory limit in php.ini is set to 128M and max_execution_time to 60 seconds. Both should be more than sufficient.
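For reference, these are the two php.ini directives the error message is talking about. The values below are just the limits quoted by the plugin in this thread, not a recommendation; adjust them for your own server:

```ini
; php.ini - raise these if sitemap generation runs out of headroom
memory_limit = 256M
max_execution_time = 600
```

Note that some hosts override php.ini per-site (e.g. via .htaccess or a control panel), which may explain why the script reports a 256M limit while php.ini says 128M.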

Any ideas?

It sounds like you may have way too many entries in your sitemap.xml file.

A Sitemap file can contain no more than 50,000 URLs and must be no larger than 50MB when uncompressed. If your Sitemap is larger than this, break it into several smaller Sitemaps. These limits help ensure that your web server is not overloaded by serving large files to Google.

If this is the case, you should try a “Sitemap Index”.
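A sitemap index is just a small XML file that points at the individual sitemap files, so the crawler fetches several small files instead of one huge one. A minimal sketch per the sitemaps.org protocol (the filenames and domain here are made up for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemap-posts-1.xml</loc>
    <lastmod>2014-01-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-posts-2.xml</loc>
    <lastmod>2014-01-01</lastmod>
  </sitemap>
</sitemapindex>
```

You submit the index file to Webmaster Tools once, and the individual files can then be regenerated independently.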


Last file size uncompressed was 1,999,484 (not sure what that works out to?)
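Assuming that figure is a byte count (which is how most tools report raw file size), it works out well under the 50MB limit:

```python
# Convert the reported size to megabytes, assuming it is a byte count.
size_bytes = 1_999_484
size_mb = size_bytes / (1024 * 1024)
print(f"{size_mb:.2f} MB")  # roughly 1.91 MB, far below the 50MB sitemap cap
```

So on file size alone the sitemap is fine; the 50,000-URL limit is the one worth checking, given the post count in the error.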

Latest error message was “The script stopped around post number 23229 (+/- 100)”

Not sure if this is the problem, or why the manual method would still work?

@AmishPatel - my understanding is that you are most likely using some kind of auto-posting, with sitemap regeneration enabled on each new post. So if 5 or 10 posts are generated automatically, the plugin tries to rebuild the sitemap after each one, which puts a heavy load on the server and causes the generation to fail. Try either regenerating the sitemap on a daily basis, or reducing the number of auto-posts at a time, and see if that works - it most likely will.
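If the plugin lets you disable the per-post rebuild, the once-daily rebuild can be scheduled from the server instead. A hypothetical crontab entry, assuming your plugin exposes some rebuild URL (the one below is a placeholder - check your plugin's documentation for the real trigger):

```
# Rebuild the sitemap once a day at 03:00, instead of on every post.
# The ping URL below is a placeholder, not the plugin's actual endpoint.
0 3 * * * curl -s "https://example.com/?sg_rebuild=1" > /dev/null
```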

I agree with @Mittineague - it sounds as if you’ve reached a limit.

I suggest a few things:

  1. Split your sitemaps into two or more files
  • perhaps include, say, 15,000 entries in each file, and/or
  • use separate files for different content types to spread the load
  2. Look at your sitemap data output
  • are there any XML tags that can be removed to reduce file size? Even removing one tag could save you many MB
  3. Remove old content from the sitemap
  • if you have a lot of old, dated content, search engines will have indexed it by now, so you don’t really need it in the sitemap
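The splitting in point 1 can be sketched roughly like this - a minimal example, assuming you can get your URLs as a plain list (the 15,000 chunk size matches the suggestion above; filenames and domain are illustrative only):

```python
# Split a list of URLs into several sitemap files of at most
# CHUNK_SIZE entries each; each file would then be listed in a sitemap index.
CHUNK_SIZE = 15_000

def chunk_urls(urls, size=CHUNK_SIZE):
    """Yield successive slices of at most `size` URLs."""
    for start in range(0, len(urls), size):
        yield urls[start:start + size]

def write_sitemaps(urls):
    """Return a list of (filename, xml_body) pairs - names are illustrative."""
    files = []
    for i, chunk in enumerate(chunk_urls(urls), start=1):
        name = f"sitemap-{i}.xml"
        body = "\n".join(f"  <url><loc>{u}</loc></url>" for u in chunk)
        files.append((name, f"<urlset>\n{body}\n</urlset>"))
    return files

# Example: 23,229 posts (the count from the error above) -> 2 files
demo = [f"https://example.com/post/{n}" for n in range(23_229)]
print([name for name, _ in write_sitemaps(demo)])  # ['sitemap-1.xml', 'sitemap-2.xml']
```

Each resulting file stays well under both the 50,000-URL and 50MB limits, and a failure while rebuilding one chunk no longer takes the whole sitemap down with it.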

@Mittineague and @bluedreamer - I do not think it’s an issue with a limit, because manually generating the sitemap works fine, as per @AmishPatel. Only the automated generation fails, and I think that is most likely because the sitemap is being regenerated too frequently, creating high resource usage, which is why the automatic generation falls over.

…that’s the point: if the process is using a lot of server resources due to the size of the sitemap, it will fall over.