What does the output of time() depend on?

time() returns a Unix timestamp, but on what basis is this timestamp produced? Is it the system time, or some other time source? And if it is the system time, and the system time is set to a timezone like PST, would that influence the output of time()? (What if your php.ini is set to CST as the default timezone but your system is set to PST?)

I essentially want to know under what circumstances the output of time() would be impacted.

It’s a true UNIX timestamp, which is always timezone independent and based on UTC (Coordinated Universal Time). This is equivalent to Greenwich Mean Time.

http://php.net/time
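
A quick way to see this for yourself (a minimal sketch; the exact values depend on your clock and configured timezone):

```php
<?php
// The timestamp itself carries no timezone; only its
// human-readable rendering does.
$ts = time();

echo $ts, "\n";                          // the raw second count, the same everywhere
echo gmdate('Y-m-d H:i:s', $ts), "\n";   // rendered as UTC
echo date('Y-m-d H:i:s', $ts), "\n";     // rendered in the configured default timezone
```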

Hmm… I’m hesitant to use the word ‘true’ UNIX timestamp. It’s a true Unix timestamp RELATIVE to the current system clock. If the system clock is off (i.e. it thinks UTC is currently 12:00:05 when it’s really 12:00:10), the timestamp will be off. (That’s why NTP exists.)

That is what I am really concerned with. If time()'s UNIX timestamp output is relative to the system clock, then if the system clock is wrong, it will be wrong. But what if the system clock is set to another timezone like PST? (I’m on Windows, but I need to know for Linux as well.) Does time() calculate UTC from the system’s PST time? If the system is set to the wrong timezone, would the calculation of UTC also be off? This is what I imagine is happening, but I don’t know for sure.

There is an additional concern about the calculation of UNIX time from UTC (if that is even what is happening). UTC and UNIX time are nearly identical, but I have read that they may diverge by a few seconds because leap seconds are added to UTC. So is UNIX time based on UTC without taking these leap seconds into account (so that UNIX time simply equals UTC), or is UNIX time calculated from UTC while accounting for the difference?

The PHP manual page on time() is surprisingly lacking in details!

IIRC Linux internally tracks time as a Unix timestamp; conversions to other timezones are applied afterward. So yes, your timestamps will be off if your system clock is off, but changes to the timezone settings will not affect them. Also keep in mind that most modern OSes sync their clocks to an outside time server these days, so it’s rare for an internet-connected device to be off by more than a second, if that.
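
To illustrate that last point about timezone settings, here’s a minimal sketch (the zone names are just examples): changing PHP’s timezone mid-script does not move time(), because the offset is only applied when formatting.

```php
<?php
// time() reads the system clock directly; the timezone setting
// only affects formatting functions such as date().
date_default_timezone_set('America/Los_Angeles');
$before = time();

date_default_timezone_set('UTC');
$after = time();

var_dump($after - $before);   // int(0): no eight-hour jump from the zone change
```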

There are 86400 seconds in a day that doesn’t have a leap second added and 86401 seconds in those days where one is added.

By disregarding leap seconds and simply counting 86400 seconds per day, UNIX time will be a second out between the moment a leap second is applied and the moment UNIX time is adjusted back a second to allow for it. It would only ever be more than a second out if it weren’t reset over a span of years long enough for an additional leap second to be applied.

If no adjustments had ever been made for leap seconds, then UNIX time would currently be 25 seconds slow.
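
To make the 86400-seconds-per-day rule concrete, here is a small worked example (the timestamp is just an illustrative value):

```php
<?php
// Unix time pretends every day is exactly 86400 seconds long,
// so day arithmetic is plain integer division; leap seconds
// never appear in the count.
$ts = 1420070400;                        // 2015-01-01 00:00:00 UTC

echo (int) ($ts / 86400), " days\n";     // 16436 whole days since 1970-01-01
echo gmdate('Y-m-d H:i:s', $ts), "\n";   // "2015-01-01 00:00:00", leap seconds ignored
```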

Right, that’s what I found by reading about the differences. But I can’t find whether time() makes a UNIX timestamp based on UTC, or just a pure UNIX timestamp based on the computer’s current time (or something else unknown to me).

There are two potential problem points if UTC is involved in calculating a timestamp - the system’s time zone must be set correctly so UTC can be calculated correctly, and UTC’s leap seconds may or may not be counted.

And there is one potential problem point if a UNIX timestamp is made directly from the system time, and that is just that the system’s time zone must be correct.

It’s possible something like this is even going on when time() is called: the system time is converted to UTC based on the system’s time zone, and then the output is given according to php.ini’s timezone setting (which could be set differently from the system’s time zone).

I am just trying to find all the ways time() might be influenced so I can account for them. I don’t fully understand what will happen if I switch servers or work on time-sensitive code, because I don’t understand the mechanics of time().

time() and microtime() always return a timestamp that is timezone independent. Always.
date() takes leap seconds into account when doing the conversion to a human-readable format. The date() function defaults to the current timezone when it gives its output, but you can override the default during the call.
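
For example (a sketch; ‘America/Chicago’ is just an illustrative zone), one way to render the same timestamp in the default timezone and in an explicitly chosen one is via DateTime:

```php
<?php
// One instant, two renderings: date() uses the default timezone,
// while DateTime lets you pick a timezone per call.
$ts = time();

echo date('Y-m-d H:i:s T', $ts), "\n";    // default timezone (php.ini or date_default_timezone_set())

$dt = new DateTime('@' . $ts);            // '@' timestamps are parsed as UTC
$dt->setTimezone(new DateTimeZone('America/Chicago'));
echo $dt->format('Y-m-d H:i:s T'), "\n";  // the same instant shown as US Central time
```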

If you need something more accurate than what these functions provide you’d be well served to use a more reliable programming language than PHP.

I’ve tested this and found it to be the case for the timezone setting in php.ini: if I change the timezone, the output of time() stays the same. But even though time()'s output may be timezone independent - and this is where I am murky - the current time in the current timezone must be determined before PHP can perform the proper offset to convert the current system time into UTC (I am speculating) or something else. If this is what happens, then the output of time() is a timestamp representing the current UTC time (calculated from the current system time plus the timezone offset). So far, so good (unless your system’s clock or time zone is wrong). But the next issue is whether the timestamp given by time() simply is UTC, leap seconds and all, or whether it is Unix time, without leap seconds factored in.

I am just speculating with the above, so anyone please feel free to correct or update me if I am in error.

I didn’t know date() took leap seconds into account. I will look into that.

PHP does no such conversion. As I keep trying to tell you and you keep ignoring, on a *nix system the system time is a Unix timestamp. Again, the OS itself does not store dates and times according to a timezone setting - it uses a Unix timestamp; that’s why it’s called a UNIX timestamp in the first place. Conversion to the local timezone occurs at display time, by whatever program is doing the displaying.

time(), like most low-level PHP functions, is only a thin wrapper around the OS service that provides the data.
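
A rough way to check the thin-wrapper claim on a Unix-like machine (this assumes a POSIX date command is on the PATH and that shell_exec() is not disabled):

```php
<?php
// PHP's time() and the shell's `date +%s` both read the same
// kernel clock, so they should agree to within a second.
$php = time();
$sys = (int) trim(shell_exec('date +%s'));

var_dump(abs($sys - $php) <= 1);   // bool(true) on a healthy system
```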

I am certainly not ignoring what you say, but your posts leave open questions that I am trying to address or am not understanding. You may be assuming I am working only on Unix derivatives, when I indicated I want to understand how time() works on Windows as well. I get that Unix time is just Unix time on a Unix machine, and that Unix time simply is the system time there. (What it is calibrated to is a question I am still looking into.)

But how is the Unix timestamp used by PHP initially determined on Windows? Does it use a base system time (without a timezone, similar to a Unix timestamp), or does it use the current system time (with a timezone) and do the calibration internally based on that?

Windows also ultimately stores its timestamps as Unix timestamps.
