Dynamic Site SEO Tips and Hints
Dynamic sites require search engine marketing strategies quite different from those used for static sites. It’s still hard to get dynamic sites indexed unless they’re properly optimized. Search engines say they now index dynamic sites, and they do, but it often doesn’t happen without a little help. And the positioning of those pages is another issue altogether.
There are a number of strategies that can be used to convert your dynamic URLs into search engine-friendly URLs. Before we get into that, let’s look at how the dynamic databases used by ecommerce sites and other large sites are created, and why they’re hard to index.
What Keeps Dynamic Sites Hidden?
Dynamic pages are created on the fly with technologies such as ASP, ColdFusion, Perl and the like. These pages work well for users who visit the site, but they don’t work well for search engine crawlers.
Why? Because dynamically generated pages don’t actually exist until a user selects the variable(s) that generate them. A search engine spider can’t select variables, so the pages don’t get generated, and can’t be indexed.
The big problem is that crawlers such as Google’s can’t read an entire dynamic database of URLs, which contain a query string (?) or other database characters (#, &, *, !, %) known to be spider traps. Because search crawlers have problems reading deep into a dynamic database, they’ve been programmed to detect and ignore many dynamic URLs.
We recently increased a client’s search engine-visible pages from 6 to 659. Considering that Google originally saw only half a dozen pages, we think hundreds of optimized pages will significantly increase our client’s search engine visibility.
Making Dynamic Sites Visible
There are a few dynamic-page optimization techniques that can facilitate the indexing of dynamic sites. The first that comes to mind is to make use of static pages. There are also several methods for converting dynamic URLs into search engine-friendly URLs. Another good way to achieve wider visibility is to use paid inclusion and trusted feed programs, which guarantee either the indexing of dynamic sites or a specific number of click-throughs.
Static Pages
Place links to your dynamic pages on your static pages, and submit the static pages to the search engines manually, according to each engine’s guidelines. This is easily done with a Table of Contents page that displays links to the dynamic pages. While the crawlers may not index every dynamic page, they will index most of the content.
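For illustration only, here is a minimal sketch of such a Table of Contents page; the file names and categories are hypothetical, not taken from any particular site. The point is simply that the links are plain HTML anchors a crawler can follow into the dynamic catalog.
<!-- Hypothetical sketch: a static HTML page whose links point at dynamic URLs -->
<html>
<head><title>Product Catalog: Table of Contents</title></head>
<body>
<h1>Product Catalog</h1>
<ul>
<li><a href="/catalog.asp?cat=widgets">Widgets</a></li>
<li><a href="/catalog.asp?cat=gadgets">Gadgets</a></li>
<li><a href="/catalog.asp?cat=gizmos">Gizmos</a></li>
</ul>
</body>
</html>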
Active Server Pages (ASP)
XQASP from Exception Digital Enterprise Solutions is an excellent tool for converting dynamic ASP pages into search engine-compatible formats.
For example, the following URL contains both "?" and "&," making it non-indexable:
http://www.planet-source-code.com/vb/scripts/ShowCode.asp?lngWId=3&txtCodeId=769
Below, it has been made search engine-friendly (all "?" and "&" and "=" characters replaced with alternate characters):
http://www.planet-source-code.com/xq/ASP/txtCodeId.769/lngWId.3/qx/vb/scripts/ShowCode.htm
Once you’ve converted the URL, don’t forget to use search engine optimization techniques to modify the HTML tags and the content within the tags, before submitting all your pages in accordance with each search engine’s submission guidelines.
ColdFusion
This might be an easy fix. Reconfigure ColdFusion on your server so that the "?" in a query string is replaced with a "/", which passes the values through the URL path instead. Of course, you’ll still have to optimize your pages and make sure your site responds quickly when a crawler does come by for a visit.
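As a rough sketch only (the template and parameter names below are hypothetical, and your ColdFusion setup may differ), once the URLs are rewritten this way a page can read its values from CGI.PATH_INFO rather than the query string:
<!--- Hypothetical sketch: a request for /catalog.cfm/cat/widgets instead of /catalog.cfm?cat=widgets --->
<cfset pathParts = ListToArray(CGI.PATH_INFO, "/")>
<cfset category = "">
<cfif ArrayLen(pathParts) GTE 2 AND pathParts[1] EQ "cat">
    <cfset category = pathParts[2]>
</cfif>
<cfoutput><p>Category: #category#</p></cfoutput>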
CGI/Perl
PATH_INFO and SCRIPT_NAME are environment variables in a dynamic application that contain the complete URL address, including the query string information. The solution is to write a script that strips out everything before the query string, treats the remaining information as variables, and then uses those variables in your URL address. Again, optimization is required to show up in the top editorial listings.
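As a minimal sketch (the script name, parameters, and URL layout below are hypothetical examples, not a prescribed format), a Perl CGI script can pull its values out of PATH_INFO so the page can be requested with a crawler-friendly URL:
#!/usr/bin/perl
# Hypothetical sketch: answer a crawler-friendly request such as
# /cgi-bin/catalog.pl/cat/widgets/id/769 by reading the values from
# PATH_INFO instead of the query string.
use strict;
use warnings;

my $path_info = $ENV{'PATH_INFO'} || '';   # e.g. "/cat/widgets/id/769"
$path_info =~ s{^/}{};                     # drop the leading slash
my %param = split m{/}, $path_info;        # pairs: cat => widgets, id => 769

my $category = $param{'cat'} || '';
my $id       = $param{'id'}  || '';

print "Content-type: text/html\n\n";
print "<p>Category: $category, item: $id</p>\n";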
Apache Software
The Apache server has a rewrite module (mod_rewrite), available for Apache 1.2 and beyond, that converts requested URLs on the fly. You can rewrite URLs that contain query strings into URLs that can be indexed by search engines. The module doesn't come with Apache by default, so find out from your Web host whether it's available for your server.
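A minimal .htaccess sketch, assuming mod_rewrite is enabled and using made-up script and parameter names, might look like this; the rule maps a static-looking URL back onto the real dynamic one so crawlers only ever see the clean address:
# Hypothetical sketch only: adjust the pattern and target to your own scripts.
RewriteEngine On
# /catalog/widgets/769.html is served by /cgi-bin/catalog.pl?cat=widgets&id=769
RewriteRule ^catalog/([^/]+)/([0-9]+)\.html$ /cgi-bin/catalog.pl?cat=$1&id=$2 [L]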
Paid Inclusion Programs
Most major search engines now offer paid inclusion and trusted feed programs based on refresh indexing or cost-per-click. Engines that offer such programs include AltaVista, AskJeeves, FAST, Inktomi, LookSmart, Lycos, and Teoma.
These programs alone are not enough for good search engine positioning. To have your dynamic site indexed through an XML feed, first ensure the site is properly optimized using professional optimization techniques. Good search engine optimization (SEO) contractors have access to Web-based automated feeds, which offer creation and management applications that generate XML-optimized feeds for multiple search engine inclusion programs. Professional SEO technicians can map any large ecommerce site's entire catalog, generating an automated XML-optimized feed.
The key to this XML procedure is keyword matching between the dynamic site content and various search engine databases. Using special filters and parameters, this process then generates thousands of keywords with page-specific meta information. The result is a distinctive representation of each product page on the target search engine, and a more accurate representation of your dynamic site, services, products, etc.
Key to Search Engine Visibility
Internet users search with mind-boggling combinations of your strategic keywords. Users at various search levels must find you before they find your competition. That's why keyword analysis and research is so important to the success of your SEO campaign. Professional optimization techniques such as copywriting, coding, and editorial linking are also important for top positioning on major search portals.