Think about it in a more logical fashion – first, what is the actual difference between the dynamic URLs and the static one?
/yourdomain.com/index.php?pageName
or
/yourdomain.com/pageName.php
vs.
/yourdomain.com/pageName.html
Do you REALLY think search engines are so **** stupid they can't figure that out? That they magically treat them differently? OF COURSE NOT. So on that front it's ALWAYS been nothing more than people flapping their gums about things they haven't taken the time to rub a couple of brain cells together on.
That just leaves users actually typing in a URL by hand… how often do YOU do that? I'll type the domain name occasionally, sometimes a subdirectory like /forums – but apart from that? Like I'm even going to remember it! So that claim is nonsense as well if you take the time to think about it. Bookmarks, speed dial, favicon-based bookmarklets (like those built into Opera) – these exist precisely so people don't have to remember URLs.
I'd also point out that with Apache – and plenty of other server software – you can use things like mod_rewrite to turn those long query-string requests into something simple. Take a look at my personal garbage site for example:
http://www.deathshadow.com/
and a few sub-pages:
http://www.deathshadow.com/pakuPaku
http://www.deathshadow.com/canvasDemo
http://www.deathshadow.com/glKernedFont
Those are ALL being routed through a single index.php using this .htaccess:
RewriteEngine On
RewriteRule !\.(gif|jpg|png|css|js|html|ico|zip|rar|pdf|xml|mp4|mpg|flv|swf|mkv|ogg|avi|woff|svg|eot|ttf)$ index.php
That whitelists the extensions it's OK for Apache to handle on its own and sends every other request to my index.php, which then parses $_SERVER['REQUEST_URI'] to figure out which page content to show – or throws a nicely styled error if it can't find a match.
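A bare-bones sketch of that kind of router – assuming page templates live in a ./pages/ directory, with every name and path here purely illustrative rather than what the live site actually runs – looks something like this:

<?php
// bare-bones router sketch -- assumes clean URLs map straight
// onto template files sitting in ./pages/
$path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
$page = trim($path, '/');
if ($page === '') $page = 'home';

// only allow simple names so nobody can traverse out of ./pages/
if (!preg_match('/^[a-zA-Z0-9_-]+$/', $page)) $page = '';

$template = __DIR__ . '/pages/' . $page . '.php';
if ($page !== '' && is_file($template)) {
    include $template;
} else {
    http_response_code(404);
    include __DIR__ . '/pages/notFound.php'; // the "nicely styled error"
}

The nice part of routing everything through one file is that there's exactly one entry point to secure, and the URLs visitors see never have to change no matter how the back end gets rewritten.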
http://www.deathshadow.com/library/common.php
for example – that's pointing at a file that actually exists on the server, but since .php isn't in that whitelist, the request gets handed to my index.php, which has no page by that name and throws the error instead. So users can't run my library files directly from the web – just a bit of extra added security. You can do this even faster if you have a static content server running in front of Apache, since then anything the static server doesn't nab must be a code request.
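As a rough illustration of that front-end idea – assuming nginx on port 80 with Apache parked on 8080, both picked purely for the sake of the example – the nginx side boils down to a couple of location blocks inside the server config:

# sketch only: serve anything that exists on disk directly,
# hand everything else back to Apache/PHP
location / {
    try_files $uri @apache;
}
location @apache {
    proxy_pass http://127.0.0.1:8080;
}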
It's not even a matter of static vs. dynamic at that point, it's a matter of how well written it is. You could even FAKE a .html extension on them, then strip it back off when parsing REQUEST_URI – basically requesting files that don't exist and letting the mod_rewrite handler serve up the appropriate content. (You'd just need to drop html from the whitelist above so those requests actually reach index.php.)
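Tacked onto the router sketch from earlier – again purely as an illustration, not the live code – that's one extra line before the lookup:

// treat /pakuPaku.html as an alias for /pakuPaku --
// peel the fake extension off before mapping the path to a template
$page = preg_replace('/\.html$/', '', $page);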