SEO help is needed here!

Hi All…

I have a question about my sitemap in Webmaster Tools. It gives me these results:

Submitted URLs 3,450
2,405 URLs in web index

Can you please explain the following doubts:
1- Why are about 1,000 URLs not in the web index?
2- These figures keep changing - what does that mean?
3- Do I need to add every URL to my sitemap whenever I make a new page, if that new page is not linked from the homepage?
4- How can I improve on the above numbers?

Help me out, guys. Thanks in advance for your help.

It will take time for Google to index all your pages. I'd suggest updating the sitemap every time you add new pages, and resubmitting it after each update.
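For reference, adding a new page to an XML sitemap looks roughly like this - the URL and date below are made-up placeholders, not your real pages:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/texas/houston.aspx</loc>
    <lastmod>2011-06-15</lastmod>
  </url>
  <!-- one <url> entry like this per page; add a new entry whenever you create a page, then resubmit the file in Webmaster Tools -->
</urlset>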

Google has clearly stated that even after you submit an XML sitemap, not all of the submitted URLs will get indexed. This is natural, of course, because Google has its own way of evaluating a page's strength before indexing it. Building links to the website and keeping the content updated with fresh material can certainly help.

If your pages can only be reached via a Javascript-activated link, you are at risk of excluding customers as well as search engines, and it's a huge accessibility problem. You should always ensure that there is a route from the home page to each and every internal page using only simple <a href="folder/page.htm"> style links. You might not want to use that as your main navigation, but you have to have it there as a back-up for people who haven't got Javascript available or turned on.
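To make that concrete (the folder and page names below are just invented examples), a crawlable route means the home page links to a state page with an ordinary anchor, and the state page links to every city page the same way:

<!-- on the home page -->
<a href="states/texas.htm">Texas</a>

<!-- on states/texas.htm -->
<ul>
  <li><a href="texas/houston.htm">Houston</a></li>
  <li><a href="texas/dallas.htm">Dallas</a></li>
</ul>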

We have linked our internal deep pages through dropdowns only, not in text - all the cities under each state are linked through a dropdown. There is a Javascript onchange handler with a setTimeout set on the dropdowns. Is this what's stopping the bots from reading the deep internal pages? What can I do at this point?

Should I link to the city pages with plain text links only? My site is totally dynamic, using ASPX. How can I improve the internal linking structure of my website?

Please reply so that I can work on it. Thanks a lot.

A good internal link structure, passing PageRank to the internal pages, and maybe some external links to the inner pages is about all you can do.
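As a rough sketch (the city URLs here are invented examples, not your real pages), you can keep the kind of onchange/setTimeout dropdown you described for visitors, but also output plain text links underneath that the bots can actually follow:

<!-- Javascript dropdown for visitors -->
<select onchange="var url=this.value; if(url) setTimeout(function(){ window.location.href = url; }, 100);">
  <option value="">Choose a city</option>
  <option value="/texas/houston.aspx">Houston</option>
  <option value="/texas/dallas.aspx">Dallas</option>
</select>

<!-- plain text links to the same city pages, for search engines and non-JS users -->
<ul>
  <li><a href="/texas/houston.aspx">Houston</a></li>
  <li><a href="/texas/dallas.aspx">Dallas</a></li>
</ul>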

What other ways are available to help get these 1,000 pages indexed? Please help me.

Tell me how I can help bots crawl my inner pages.

thanks a lot…

A sitemap is just one way to help Google index your site faster… that's normal… you can try other ways to help bots crawl your inner pages, like deep linking.

Hi Buddy,

If you add new links, chances are that Googlebot will index your pages automatically. Sometimes Googlebot even indexes pages that are not linked from any other page but just sit in a directory. So there's no need to worry.

But if you wish to add them, you can add them to the sitemap manually, or regenerate the XML.

Google doesn’t always index all your pages, that’s all it is.

Yeah, what Stevie said, they’re called ‘redundant text links’ and you should be providing them for accessibility reasons anyway.