DNS and Browser caching

Hi everyone,
I am trying to understand the ‘order’ of caching, and caching in general in a bit more detail, and I have a real-world example.

We control a number of websites. A few weeks ago some of the smaller websites were down for a day or so for maintenance. The main website was still live, so to give users a better experience, we changed the DNS records of all the websites that would be unavailable to point to a maintenance page on the main website.

The TTL was set to 24 hours and the redirect worked well. After the downtime we reverted the DNS records to point back to the now-working websites.

However, the issue I have now is that even two weeks on, some people are still being redirected to the maintenance page.

I understand that browsers, providers, DNS servers and so on all cache DNS. However, the TTL on the change has long since expired.

I also understand that browsers cache static files (CSS, images, etc.) from pages that have been visited previously, which are reused if they have not expired.

So I think my questions are:

  1. Any ideas why users would still be seeing the maintenance page?
  2. In what ‘order’ does a browser request content? Does it check its DNS cache and work up the chain until it finds a DNS record to use, and then, once it has that, check its local cache and serve any stored assets that are still valid?

A small number of users are having trouble seeing the correct content, and I am unsure why.

Many thanks for any information or advice.

How exactly did you do that? There is no way to redirect to a specific URL through DNS. Was there some form of HTTP redirect in play here? If so, was it a 302 or a 301?
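The distinction matters because a 301 (Moved Permanently) tells browsers the move is permanent, and some will cache it for a very long time regardless of any DNS TTL, while a 302 (Found) is treated as temporary. A quick local sketch (standard library only; the `Location` URL is a placeholder) of how to inspect the raw status code of a redirect without following it:

```python
import http.client
import http.server
import threading

# A tiny local server that answers every request with a 301 redirect,
# standing in for the maintenance-page redirect in this example.
class RedirectHandler(http.server.BaseHTTPRequestHandler):
    def do_HEAD(self):
        self.send_response(301)
        self.send_header("Location", "https://example.com/maintenance")
        self.end_headers()

    def log_message(self, *args):  # keep the demo quiet
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# http.client does not follow redirects, so we can see the raw status.
conn = http.client.HTTPConnection("127.0.0.1", port)
conn.request("HEAD", "/")
resp = conn.getresponse()
print(resp.status, resp.getheader("Location"))
server.shutdown()
```

Against the live site, something like `curl -I https://your-site/` should show the same status/`Location` pair the browsers were given during the maintenance window.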

In many cases the TTL is treated as a suggestion rather than a rule. AFAIK browsers in general will honor the TTL, but providers, VPNs, etc. may not (and may cache for longer, or shorter, periods of time).

Two weeks seems excessive, though.

Is there some commonality between the users experiencing this issue? Do they all work at the same company, use the same internet provider, use the same browser, or … ?

Perhaps try adding a random parameter to the stylesheet and/or images, which may defeat the cache because the asset URLs would have changed.

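For example, a version query string appended to each asset URL (a sketch; the paths and version value here are placeholders) makes browsers treat the assets as new resources:

```html
<!-- Bumping the ?v= value changes the URL, so cached copies are bypassed -->
<link rel="stylesheet" href="/css/styles.css?v=20240301">
<script src="/js/app.js?v=20240301"></script>
<img src="/img/logo.png?v=20240301" alt="Logo">
```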
I’ll get the specifics from the team that did it - the DNS was changed but I’ll find out how the traffic was routed to the specific URL on the site.

Seems that Safari is the biggest issue. I need to dig around in analytics a little more… the hard part is finding willing customers to share that information :slight_smile:

Thanks John

The issue is that people are seeing an up-to-date page, but they should never get that far; they should be seeing a page on an entirely different website.