Fetch as Google Problem

Hi, are you familiar with “Fetch as Google”?

Is there any way to submit more than 500 URLs in a month?

My website has more than 1k URLs that I want to submit this month.

Please let me know if you can help with this.

Thank you :smile:

For your information, Google has a weekly quota of 500 fetches. But I would suggest fetching only those pages that aren’t getting indexed in Google, rather than the complete website.

Are you sure this quota is weekly?

I have read somewhere that it’s 500 monthly.

As far as I can see, Fetch as Google has nothing to do with submitting a URL. It is a tool for you to see how Google fetches your page, nothing more.

Google states it is 500 per week.

See https://support.google.com/webmasters/answer/6066468?hl=en for more information.

This is true only for the first step of the tool, “Fetch” and “Fetch and Render”, which lets you see how Google sees your code before submission. The next step is “Submit to Index”, once you are happy with how the page is fetched. This submits the URL to Google and requests a crawl or re-crawl of new or updated content.

Ask Google to re-crawl your URLs

If you’ve recently made changes to a URL on your site, you can update your web page in Google Search with the Submit to Index function of the Fetch as Google tool. This function allows you to ask Google to crawl and index your URL.

I find it very effective at getting fresh content indexed quickly.

It is 500 per month for single URLs, and 10 per month for a URL and its direct links:

Crawl only this URL to submit one individual URL to Google for re-crawling. You can submit up to 500 individual URLs per month in this way.

Crawl this URL and its direct links to submit the URL as well as all the other pages that URL links to for re-crawling. You can submit up to 10 requests of this kind per month.
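For the 1k URLs in the original question, that works out to about two months of single-URL submissions (2 × 500). Alternatively, you can stretch the 10 “URL and its direct links” requests by pointing them at hub pages, such as the home page or category indexes, that link out to many of the others.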

You will need to be patient and rely on Google’s natural crawling process. If your site has a proper link structure, the bots will follow the links from one page to the next and eventually crawl all the linked pages on the site, though there is no guarantee they will index every page they crawl.

Have you created an XML sitemap listing all the pages?
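If not, that’s the first thing I would do. A sitemap is just an XML file listing your URLs, one <url> entry per page. Here’s a minimal sketch in Python that writes one out; the example.com addresses are placeholders, and in practice you’d pull the page list from your CMS or database:

```python
# Minimal sitemap generator sketch. The URLs below are placeholders;
# replace them with your site's real pages.
from datetime import date
from xml.sax.saxutils import escape

pages = [
    "http://example.com/",
    "http://example.com/about",
    "http://example.com/products/widget-1",
]

entries = "\n".join(
    "  <url>\n"
    f"    <loc>{escape(url)}</loc>\n"
    f"    <lastmod>{date.today().isoformat()}</lastmod>\n"
    "  </url>"
    for url in pages
)

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>\n"
)

# Write sitemap.xml; upload it to your site root, then submit it
# in Search Console under Crawl > Sitemaps.
with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)
```

A single sitemap can hold up to 50,000 URLs, so your 1k fit easily, and submitting it in Search Console lets Google discover them all in one go instead of you fetching them one by one.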
