Cross-site Ajax in Firefox 3

By Matthew Magain

John Resig has posted a good summary (including demo code) of how one might implement cross-site XMLHttpRequest calls, a feature currently implemented in the beta 2 release of Firefox 3.

In a nutshell, there are two techniques you can use to enable cross-site requests: specifying a special access-control header for your content, or including an access-control processing instruction in your XML.
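For context, the opt-in syntax these Firefox 3 betas implemented came from the W3C Access Control working draft of the time; later revisions of the spec changed the syntax, so treat the following as a historical sketch rather than something to rely on today. To mark a response as readable from any origin, the server could send an HTTP response header:

Access-Control: allow <*>

For XML documents, the equivalent opt-in was a processing instruction at the top of the file:

<?access-control allow="*"?>

In both cases, a specific domain could appear in place of the asterisk to restrict which sites are allowed to read the content.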

What’s particularly exciting is how little code is required to take advantage of this feature. For example, to request an HTML file from a remote domain, you might do the following (you’ll need to download Firefox 3 first, of course):


var xhr = new XMLHttpRequest();
xhr.onreadystatechange = function() {
  if ( xhr.readyState == 4 ) {
    if ( xhr.status == 200 ) {
      document.body.innerHTML = "And the winner is... " + xhr.responseText;
    } else {
      document.body.innerHTML = "ERROR";
    }
  }
};
// The URL below is a placeholder for the remote domain you are requesting from
xhr.open("GET", "http://example.com/winner.html", true);
xhr.send();

Look familiar? Aside from the inclusion of the domain in the URL parameter of the open function, this code is identical to the standard Ajax calls that you are probably already making.

Of course, whether cross-site Ajax requests are a Bad Thing™ or not is a debate that will no doubt rage until enough browsers support the functionality for it to be actually useful. Once we reach that point, it’s my bet that a whole world of new mashups, apps and other services will open up (and, yes, people who don’t understand it will no doubt do stupid things with it, as they did when Ajax became the new hotness a couple of years ago).

Read the official documentation on the Mozilla Development Center for more information (and maybe check out the documentation for some of the other features to come while you’re there).

  • Ruben K.

Is this a thing the Firefox team came up with? Will this be implemented in browsers other than Firefox? If not, this is sort of useless if it’s a Firefox-only thing.

  • It’s browsers doing this kind of thing that prompts other browser makers to follow suit though. Except IE of course, they will just go and do their own thing as usual, and be a complete pain in the process.

Is this a thing the Firefox team came up with? Will this be implemented in browsers other than Firefox?

    The functionality implements the Access Control spec, which is a working draft from the W3C, so no, it’s not some random new tangent. In fact, the XMLHttpRequest working draft explicitly states “A future version or extension of this specification will most likely define a way of doing cross-site requests.”

    That said (as Stormrider mentioned) how do you think we came to be using XMLHttpRequest in the first place? It was an IE-only object before it caught on; the W3C then had to play catch up (and still are).

  • McKenna

There’s a typo in the Firefox 3 for developers link. You need to remove the final forward slash:

  • Thanks McKenna, all fixed (grumble grumble something about stoopid weird wiki systems breaking proper URI syntax)… :-)

  • php_penguin

So why don’t you just use PHP (or another server-side language) as a proxy so it works with everything?

I have to agree with Ruben. If this is not yet supported across most browsers (not even Firefox 2), then it is years off from practical use, am I wrong?

  • Thierry
  • Dan

I’d have to agree with php_penguin… I don’t see any advantage to this other than simplicity. (Not that the alternative is really that hard…) By the way, there is a great PHP class called Snoopy (on SourceForge) that ties in nicely with what we’re talking about…

  • But doing that in PHP means *you* use, and pay for, double the bandwidth of each request (download it from the remote server then serve it to your user). If you could do it with JS alone, the client takes the burden of the bandwidth bill. And it also prevents the 3rd party site from shutting all your users out by banning your IP — each request is coming from the individual browser.

  • Dan

@Dan Grossman – Great point on the bandwidth issue. However, I don’t see the latter point as being all that strong. Correct me if I’m wrong, but it seems to me that if you were utilizing a third-party site/service, it would most likely be open and/or you’d have permission to use it. Still a relevant point, though.

If you delve into the linked article, it looks like they are working on implementing some simple access control for XML files, so that you can allow either anyone or only specific domains. Pretty cool… I haven’t seen that before.

  • nathj07

Desktop software has been able to do this sort of thing for a while now. I think this is really a great step forward. As a developer of many sites that often share information, it would no doubt make my life easier. An obvious implementation is to have a page that returns XML with the standard data in it. The receiving page would then read this and format it using its own CSS.

  • dnix

Some of you guys are being fanboyish… IE has a rich administrative feature set that allows the cross-domain issue to be managed atomically. Firefox got this wrong the first time around, and enough folks with real jobs have complained about it all over the web to make it an issue.

  • anonymous

The cross-site XMLHttpRequest feature is not supported anymore; Firefox 3 build 5 doesn’t support it!

  • Chief Odie

    I don’t see any advantage to this other than simplicity.

To add to what Dan said, it’s also a case of security… You may be consuming a service that requires your clients to log in individually. This allows the service to manage its own security without your site being the “man in the middle”.

