Evil HashBangs

http://isolani.co.uk/blog/javascript/BreakingTheWebWithHashBangs

With JavaScript disabled, these sites redirect to a generic URL like /?noscript=true, essentially breaking every link on the site.

history.pushState solves this problem of navigating an Ajax-powered site.

But, what do we do without that feature?

a) Live with the URL not changing
b) Don't load content with Ajax
c) Use a hashbang
d) Make JavaScript a required technology
e) Something else

I can’t think of a bulletproof solution that loads content with Ajax without using the new history API. Can you?

What I see is, if this pattern spreads, Google will be able to access Ajax’d content while competing search engines can’t.

It just figures that someone had to work out how to break simple URLs with JavaScript too : (

Sounds like #! means: stop, do not try to access this web site, as we are morons who don’t have a clue how to build an accessible web site, and our site will probably be down when you try to access it, because we built in as many ways for it to fail as we possibly could.

Turning JavaScript off for such web sites would appear to be the only way to get consistent results.

The reason hashed URLs are used is to make the user experience better.

The way I chose to implement this is the same as @balupton’s https://github.com/balupton/History.js/

Make the site work without JS: every URL returns the full page plus content.
When a page is requested with an XMLHttpRequest request header, return only the content.
Use pushState in browsers that support it.
Use a hash fallback (/#/) everywhere else.
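The server-side and client-side decisions above can be sketched as two small functions. This is a minimal sketch, not History.js itself: it assumes the server sees a plain lowercased header object, and it relies on the common (but non-standard) convention that Ajax libraries like jQuery set an X-Requested-With header on XMLHttpRequest calls.

```javascript
// Server side: decide whether a request came in via XMLHttpRequest.
// Most JS libraries set X-Requested-With on Ajax calls; this is a
// convention, not part of any spec, so treat it as a hint.
function isAjaxRequest(headers) {
  return (headers['x-requested-with'] || '') === 'XMLHttpRequest';
}

// Client side: pick a navigation strategy from detected capabilities.
// 'fullPage'  -> no JS at all: plain links, full page loads (step 1)
// 'pushState' -> history API available (step 3)
// 'hash'      -> JS but no history API: /#/ fallback (step 4)
function chooseStrategy(hasPushState, hasJs) {
  if (!hasJs) return 'fullPage';
  if (hasPushState) return 'pushState';
  return 'hash';
}
```

In a real page, hasPushState would come from a feature test such as checking that history.pushState exists, and the 'pushState' branch would call history.pushState and listen for popstate.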

The only way this pattern fails is when someone tries to access a hashed URL without JavaScript enabled. That is not great, but I think the better user experience is worth that one drawback. The number of real users who would see this is negligible.

All in all - it’s not necessary - but if you want to do it, this is the best way I have found.

I got a NoScript update today. By default a tab opens to their page where they list changes. #2 on the list:

Automatic fallback for some types of AJAX-rendered web pages (e.g. on Gawker’s sites) via Google’s escaped_fragment recommendation.
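Google’s recommendation maps a #! URL to a plain URL carrying a _escaped_fragment_ query parameter, which a crawler (or a fallback like NoScript’s) can fetch directly. A rough sketch of that mapping, assuming a full URL string as input (note: encodeURIComponent escapes slightly more characters than the published scheme strictly requires, which is safe but not byte-identical to what a real crawler produces):

```javascript
// Map a hashbang URL to its _escaped_fragment_ form, roughly the way
// Google's AJAX crawling scheme describes crawlers doing it.
function escapedFragmentUrl(url) {
  var i = url.indexOf('#!');
  if (i === -1) return url; // no hashbang, nothing to rewrite
  var base = url.slice(0, i);
  var fragment = url.slice(i + 2);
  // Append with '?' or '&' depending on an existing query string.
  var sep = base.indexOf('?') === -1 ? '?' : '&';
  return base + sep + '_escaped_fragment_=' + encodeURIComponent(fragment);
}
```

So a URL like http://example.com/#!/recent becomes http://example.com/?_escaped_fragment_=%2Frecent, and the server is expected to answer that plain URL with rendered content.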

I notice some sites, like DuckDuckGo, will offer different URLs depending on whether you have JavaScript on.
However, most webmasters won’t bother with anything but JavaScripted URLs. Looking at how 99% of sites are built today, I predict I will be proven right.

The number of real users who would see this is negligible.

Well, everyone already knows how I feel about stickin’ it to minorities just because they are unpopular minorities…

I’ve been wondering if any vendors have actually been considering this. There are probably a lot of good reasons not to, but I have wondered.