Google Webmaster Issues Need to be Fixed

While checking Google Webmaster Tools, I clicked on “Blocked Resources” and noticed that most of my important pages (52 pages) are showing errors.

Can anyone help me fix this?

Errm, you want your wp-admin pages to be public?

1 Like

Can you please explain? What exactly do you mean?

Well, according to the image you posted, it shows that your wp-admin pages can’t be accessed by Google.

Assuming they are password protected so that only you, or any other admins you may have for the site, can access them, this is desired, isn’t it?

Would you really want anyone to be able to go to your ACP pages?

My robots.txt file is as:

User-agent: *
Disallow: /wp-admin/

So that might be the reason behind it, right?

Exactly the reason and exactly as it should be.

There is nothing there you want Google or anyone else to see.

Even if you took that out of the robots.txt file, you would get Forbidden messages in GWT.

IMHO it’s better to have Google not bother trying than to try and fail.

So what is the final solution? Do I keep it the same, or remove the /wp-admin/ line from robots.txt?

If you don’t want the blocked message, open up your site so that anyone can do whatever they want with it until your host shuts it down as a security risk.

Else understand that it is normal, expected, and desired, and don’t worry about the message for wp-admin pages.

Now I understand. I am keeping it. But I was worried about this message because of Google’s statement.

To prevent Google from showing the errors I think it is possible to prevent pages from being crawled.

Take a look at GWT → Search Appearance → Sitelinks

1 Like

Yes, those - CSS, JavaScript, and images - if they should be available to site visitors, then they should also be available to Google.

But “secure” files such as Admin pages, database configuration files, “premium” PDF / image files etc. should not be.

I agree with you. Then what should I do?

Do I go with your 1st statement or your 2nd?

Both

How is that possible?

If I allow access to wp-admin, then the 2nd statement becomes false (because we cannot keep our files secure).

If I keep wp-admin disallowed, then Google cannot access the important files like CSS, JavaScript, etc.

Those files should not be under the wp-admin folder. Typically they’re under the wp-includes folder.
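For example, a common WordPress robots.txt sketch (not your exact file; the Allow line for admin-ajax.php is a frequent addition, since some themes call it from the front end) keeps the admin area blocked while leaving wp-includes and theme assets crawlable simply by not disallowing them:

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```

With this, Google can still fetch CSS and JavaScript served from wp-includes or wp-content, because nothing there is disallowed.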

That reference isn’t related to pages you don’t want crawled. It’s to do with the way Google displays a site in the search results.

To block a page from search results, the only sure way (as far as I know) is to add the “noindex” meta tag. https://support.google.com/webmasters/answer/93710?hl=en
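For a publicly accessible page you want kept out of the results, the tag described in that support article goes in the page’s head section, something like:

```html
<meta name="robots" content="noindex">
```

Note the page must not also be blocked in robots.txt, or Google will never crawl it and never see the tag.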

2 Likes

So according to you, I have to allow access to wp-admin, right?

No, wrong. There should be no need for anybody other than site admins to have access to that.

What @Mittineague is saying is that you should ensure files which need to be accessed are not in that folder, and then allow access to them.

[quote=“Mittineague, post:15, topic:222488”]
Those files should not be under the wp-admin folder. Typically they’re under the wp-includes folder.
[/quote]So you would allow access to the wp-includes folder, but not wp-admin.

1 Like

If a site’s robots.txt file disallows crawling these resources, it can affect how well Google renders and indexes the page, which can affect the page’s ranking in Google search.

The admin files are no one else’s business but your own. You do not want them to be indexed or to rank, they should be kept private. So this warning does not matter, it’s saying they won’t index something that you don’t want indexed.

2 Likes

I got your point.