Results 1 to 10 of 10
Jun 21, 2002, 05:03 #1
worksdev | Join Date: Mar 2002 | Location: Central, PA - originally from Monterey, CA | Posts: 497
Search engines read the first *K of the page?
I read an article or a thread somewhere on SitePoint that mentioned that some search engines may only read the first *K of data on a page.
I haven't been able to find that info again. Does anyone know if this is true, and if so, what is the typical amount, and do you know of any reference to verify it?
Thanks in advance,
worksdev
Jun 21, 2002, 05:24 #2
Aspen | Join Date: Aug 1999 | Location: East Lansing, MI USA | Posts: 12,937
This is one of those things that WAS true. Now that bandwidth and storage are cheaper, many engines will read the entire page.
Which ones do and which ones don't? I honestly don't know, except that I know Google at least reads the whole page.
Chris Beasley - I publish content and ecommerce sites.
Featured Article: Free Comprehensive SEO Guide
My Guide to Building a Successful Website
My Blog|My Webmaster Forums
Jun 21, 2002, 05:41 #3
worksdev | Join Date: Mar 2002 | Location: Central, PA - originally from Monterey, CA | Posts: 497
Thanks Aspen.
That’s good to know.
I was thinking about this issue in light of the amount of JavaScript contained in the head of an HTML page.
At first, I was thinking that if a search engine only reads the first *K of a page, then it should be a definite practice to move the JavaScript into a .js file referenced with the src attribute.
I am now wondering if too much JavaScript will still affect keyword phrase relevancy by pushing the important content down the page a bit.
Do you think having a lot of JavaScript in the page head will affect the relevancy of keyword phrases contained in the body?
Best Regards,
worksdev
Jun 21, 2002, 07:04 #4
Aspen | Join Date: Aug 1999 | Location: East Lansing, MI USA | Posts: 12,937
You should always put JS in a separate file. It's very easy to do, so even if it only helps a little bit, or not at all, it's better to be safe than sorry.
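The advice above can be sketched in markup. This is just an illustrative before/after; the file name script.js is a placeholder, and the type attribute reflects the HTML conventions of the time:

```html
<!-- Before: inline JavaScript in the head pushes the body content
     further down the document -->
<head>
  <title>Example page</title>
  <script type="text/javascript">
    // hundreds of lines of rollover/menu code here...
  </script>
</head>

<!-- After: the same code moved to an external file via the src attribute -->
<head>
  <title>Example page</title>
  <script type="text/javascript" src="script.js"></script>
</head>
```

The browser behaves the same either way, but the document itself stays small, so any content a spider reads early in the file is actual page text rather than script.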
Jun 21, 2002, 08:15 #5
Fizzleboink | Join Date: Mar 2002 | Location: Manitoba, Canada | Posts: 50
Actually, Google will only spider the first 100k of a page. That should be ample space though...
Michael Saganski | Living-Your-Life.com <- Earn Residual Income, Improve Your Health, and Increase Free Time!
Jun 21, 2002, 09:03 #6
worksdev | Join Date: Mar 2002 | Location: Central, PA - originally from Monterey, CA | Posts: 497
Originally posted by Fizzleboink
Actually, Google will only spider the first 100k of a page. That should be ample space though...
Thanks,
worksdev
Jun 21, 2002, 09:30 #7
Fizzleboink | Join Date: Mar 2002 | Location: Manitoba, Canada | Posts: 50
There have been several discussions about it on WebmasterWorld.com, although I don't think there has been any official statement about it by Google.
It's easy to verify, though: make a page that is 150k, and you'll notice that Google only caches the first 100k of it. The rest is ignored.
One such thread on this discussion that I quickly found can be found here:
http://www.webmasterworld.com/forum3/3482.htm
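The 150k test described above can be sketched in a few lines of Python. The 100 KB figure is this thread's working assumption rather than anything Google has documented, and the helper names are invented for illustration:

```python
# Sketch: what a spider with a fixed read limit would keep of a page.
LIMIT_BYTES = 100 * 1024  # the 100k figure discussed in this thread

def spidered_portion(html, limit=LIMIT_BYTES):
    """Return only the bytes a size-limited spider would read."""
    return html[:limit]

def over_limit(html, limit=LIMIT_BYTES):
    """True if part of the page would be ignored by such a spider."""
    return len(html) > limit

# A 150k page: everything past the 100 KB mark is dropped.
page = b"<html><body>" + b"x" * (150 * 1024) + b"</body></html>"
print(over_limit(page))             # True
print(len(spidered_portion(page)))  # 102400
```

Comparing a page's cached copy against its full source, as the thread suggests, is the empirical version of the same check.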
Jun 21, 2002, 10:10 #8
Aspen | Join Date: Aug 1999 | Location: East Lansing, MI USA | Posts: 12,937
So it seems... If your pages are that long, though, you should definitely do something about it - people don't like to scroll that much. I put an entire book chapter on a single page and still rarely go above 60k, and I couldn't find a single page on my site over 100k. I'm sure there are a couple, though.
Jun 21, 2002, 11:15 #9
worksdev | Join Date: Mar 2002 | Location: Central, PA - originally from Monterey, CA | Posts: 497
Good info, Fizzleboink.
Thanks.
Jun 21, 2002, 12:22 #10
Fizzleboink | Join Date: Mar 2002 | Location: Manitoba, Canada | Posts: 50
Yeah, no problem. Very few sites would ever have to worry about that 100k limit, and if a page ever goes over 100k, simply split it into two separate pages. Google seems to like shorter pages (so I've heard), and people tend to prefer short, manageable chunks of text to read. People may be put off by a page with too much text and move on elsewhere.
I don't know about other spiders like Inktomi, though; the limit is probably around the same.