Dec 16, 2009, 11:01 #1
Site Indexing and Folder Security
I have excluded a directory in my robots.txt and site map. I would like Google to index those pages, but where the pages link to secure documents on my server, such as PDF/Word files, I do not want that directory accessible. As an interim solution I am not indexing a large part of my member-only site. How can I secure the folder yet still have the member pages indexed? I have read about the solution of putting my /folder/ outside the Web-accessible folder, but I do not know how to configure this. My member pages have an authentication script which checks for cookies and passes variables. Will the Google spider get to these pages regardless? I have an IIS server.
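As I understand it, the arrangement that solution describes is something like this (the folder names here are just placeholders):

    D:\inetpub\wwwroot\        <- web root: everything under here has a URL
        members\               <- member pages (cookie-checked, indexable)
    D:\private_docs\           <- outside the web root: no URL reaches this
        report.pdf             <- served only through a script that checks the cookie

-=SunnaH=-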
Dec 16, 2009, 15:42 #2
Put the files you don't want Google to access in a folder, and disallow that folder in your robots.txt. It's those files you want to prevent indexing, not the pages that might link to them.
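For example, if the documents sit in a folder called /securedocs/ (a placeholder name), the robots.txt entry would be:

    User-agent: *
    Disallow: /securedocs/

Member pages outside that folder remain crawlable, and compliant spiders such as Googlebot will not request anything under /securedocs/.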
If you already have authentication set up to prevent direct access to those files without being logged in, then neither Google nor any other search engine can access those files regardless of your robots.txt.
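One thing to keep in mind on IIS: a cookie check inside your member pages does not protect static files such as PDFs, because IIS serves those directly without running any of your script code. That is why the "outside the web root" approach works: the files have no URL at all, and a small gatekeeper script checks the cookie and streams the file itself. Here is a minimal sketch of that idea (written in Python purely for illustration; on an IIS site this would normally be a classic ASP or ASP.NET handler, and the cookie name, folder path, and file parameter are all made up):

    import os
    from http.cookies import SimpleCookie
    from urllib.parse import parse_qs

    DOC_ROOT = r"D:\private_docs"  # assumption: a folder OUTSIDE the web root

    def app(environ, start_response):
        # Reject anyone without the (hypothetical) login cookie.
        cookies = SimpleCookie(environ.get("HTTP_COOKIE", ""))
        if "member_login" not in cookies:
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"Members only."]

        # File name comes from the query string, e.g. ?file=report.pdf
        name = parse_qs(environ.get("QUERY_STRING", "")).get("file", [""])[0]
        path = os.path.realpath(os.path.join(DOC_ROOT, name))
        # Refuse anything that escapes DOC_ROOT (blocks ../ tricks) or is missing.
        if not path.startswith(os.path.realpath(DOC_ROOT) + os.sep) or not os.path.isfile(path):
            start_response("404 Not Found", [("Content-Type", "text/plain")])
            return [b"No such document."]

        # Stream the file ourselves; IIS never exposes DOC_ROOT by URL.
        with open(path, "rb") as f:
            data = f.read()
        start_response("200 OK", [("Content-Type", "application/octet-stream"),
                                  ("Content-Disposition", 'attachment; filename="%s"' % name)])
        return [data]

    if __name__ == "__main__":
        from wsgiref.simple_server import make_server
        make_server("localhost", 8000, app).serve_forever()

A member page would then link to the gatekeeper (e.g. download?file=report.pdf) instead of linking to the document directly.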
Dec 16, 2009, 19:38 #3
Just to clarify:
The pages and the directory of files I do not want Google to spider are both declared in my robots.txt file. The pages have an authentication script that only checks whether the login cookie is present. I want to make sure that if I allow the Google spider to index just the protected pages, the files in the folder that those pages link to will not be exposed to the spider as well. In other words, is it safe to allow indexing of the protected pages without giving access to the files in the folder?
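One check I can run myself (the URL is just an example) is to request a document with no login cookie, the way a spider would, and see whether the server refuses it:

    curl -I http://www.example.com/securedocs/report.pdf

If that returns the file (a 200 response) rather than a 403 or a redirect to the login page, then the document is exposed to anything that finds the link, robots.txt or not.

-=SunnaH=-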