I need help with sitemaps for Google Webmaster Tools.
Is it enough to submit a sitemap in txt format to Google Webmaster Central if we have more than 45,000 URLs, or do we need to submit an XML sitemap to get all of our web pages crawled?
It seems to lay out the situation pretty clearly. A text file (sitemap.txt) of no more than 50,000 URLs and no bigger than 10MB in size is fine.
It does seem to indicate that an XML version is better for other search engines, though.
(Warning: I just did my first real sitemap today, so I’m no expert! )
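For reference, here is a minimal XML sitemap of the kind being discussed, following the sitemaps.org schema. The URLs and dates are placeholders, not taken from anyone's actual site:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per page; <lastmod> is optional -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2012-05-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about</loc>
  </url>
</urlset>
```

The txt format is just one URL per line, but the XML format lets you add optional hints like `<lastmod>` that some search engines can use.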
Even though we have submitted an XML sitemap to Google Webmaster Tools, the number of indexed pages shown in Webmaster Tools is dropping every day.
Please suggest why this is happening. A week ago 15,000 pages were indexed, but now it shows only 13,000 pages.
Agree with ralph.m. Your sitemap should not have more than 50,000 links, and the file should not be more than 10MB. Most developers use .xml for the sitemap and .txt for the robots file. Regarding your previous post, I think you have put more than 50,000 links in your sitemap, and that is why you are losing indexed pages. If you need to submit more than 50,000 links, split the sitemap into parts and create a sitemap index file that points to them. Read the Webmaster Guidelines until you find a solution; they will give you an exact answer.
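The splitting step above can be sketched in a few lines. This is a minimal illustration, not a production tool: the `example.com` base URL and `sitemap-N.xml` file names are placeholders you would replace with your own.

```python
SITEMAP_URL_LIMIT = 50000  # Google's per-sitemap URL limit

def split_into_sitemaps(urls, base_url="https://example.com"):
    """Split a URL list into chunks that fit the per-sitemap limit
    and build a sitemap index document referencing each chunk."""
    # Break the full URL list into pieces of at most 50,000 entries.
    chunks = [urls[i:i + SITEMAP_URL_LIMIT]
              for i in range(0, len(urls), SITEMAP_URL_LIMIT)]
    # Hypothetical file names for each piece.
    sitemap_files = [f"sitemap-{n + 1}.xml" for n in range(len(chunks))]
    # The index file just lists the location of every child sitemap.
    index_entries = "\n".join(
        f"  <sitemap><loc>{base_url}/{name}</loc></sitemap>"
        for name in sitemap_files
    )
    index_xml = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{index_entries}\n"
        "</sitemapindex>"
    )
    return chunks, index_xml

# e.g. 120,000 URLs split into 3 sitemaps plus one index file
urls = [f"https://example.com/page{i}" for i in range(120000)]
chunks, index_xml = split_into_sitemaps(urls)
```

You would then upload the child sitemap files plus the index file, and submit only the index file in Webmaster Tools.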