How to measure time with microsecond precision on a 32-bit system?

I have a simple way to measure the execution time of some PHP code: I simply get the values of microtime(true) before and after and subtract one from the other. For example:


$st = microtime(true);

for ($i=0; $i<10000; $i++) {
	$a = $i;
}

$dur = microtime(true) - $st;
echo $dur;

However, on a 32-bit Windows system the result is very inaccurate: every time I run this script I get either 0.015625 or 0, nothing in between. I suppose this is due to floats having limited precision on 32-bit systems. On a 64-bit Linux production server I get values like 0.0025119781494141.

How do I measure duration accurately? I’m trying to find the simplest solution; so far the best I can think of is to use microtime() without true, split and concatenate the string parts, and use the bc functions for the calculation. Are there any better ways?

Is this for roll-your-own (RYO) benchmarking? If so, there may be a package out there that has everything already figured out,
e.g. (old) http://pear.php.net/package/Benchmark

An external PEAR library is too much for my needs; occasionally I just want to inject 2-3 lines of simple code to benchmark a piece of code without any external dependencies. But I looked at the source of this package and it uses the same method of measuring time that I was contemplating earlier - the textual microtime() result plus bcsub() to calculate the difference. So I tried this method in my code:


function countDuration($start, $end) {
	// microtime() returns "msec sec", e.g. "0.12345600 1234567890";
	// glue the parts into one decimal string like "1234567890.12345600"
	$segm = explode(' ', $start);
	$timeStart = $segm[1] . substr($segm[0], 1);
	
	$segm = explode(' ', $end);
	$timeEnd = $segm[1] . substr($segm[0], 1);
	
	// subtract as arbitrary-precision strings, keeping 6 decimal places
	return bcsub($timeEnd, $timeStart, 6);
}


$start = microtime();

for ($i=0; $i<10000; $i++) {
	$a = $i;
}

$dur = countDuration($start, microtime());
echo $dur;

And guess what? The same problem persists on the 32-bit Windows system - I get either 0.015625 or 0. I’m inclined to think that floats are not the problem; more likely microtime() itself is reporting time at a very coarse resolution in the first place. What can I do then?
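
In the meantime the only workaround I can think of is to repeat the measured code enough times that the total run spans many clock ticks, and then divide. A rough sketch (the repetition count of 1000 is arbitrary):

$reps = 1000;
$st = microtime(true);

for ($r = 0; $r < $reps; $r++) {
	// the code under test - same trivial loop as above
	for ($i = 0; $i < 10000; $i++) {
		$a = $i;
	}
}

// average duration of a single pass
$dur = (microtime(true) - $st) / $reps;
echo $dur;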

I have had a look back: I did some speed tests a few years ago using XAMPP, probably on a 32-bit XP setup - I don’t think I had Vista at that time, but if I did, it would also have been 32-bit - and the results were:
1st run
0.299100001653 average time
2nd run
0.298720002174 average time
3rd run
0.298156666756 average time
4th run
0.298293328285 average time
5th run
0.298613333702 average time

Which are not "0.015625 or 0".

I cannot try it now as I am using 64-bit Windows 7.

I have had a look back and the code I used was:


for ($i = 0; $i <= 9; $i++) {

	$mtime = microtime();
	$mtime = explode(' ', $mtime);
	$mtime = $mtime[1] + $mtime[0];
	$starttime = $mtime;

	exec("convert $image -resize 100x100 command.jpg");

	$mtime = microtime();
	$mtime = explode(' ', $mtime);
	$mtime = $mtime[1] + $mtime[0];
	$endtime = $mtime;
	$totaltime = ($endtime - $starttime);

	// Record the time
	write_results($save, "$totaltime\r\n");
}

I’ve never worried much about such small amounts of time and so never looked much into it.
Interesting stuff.

It does indeed seem to be related to how time is handled, e.g. this comment about NTP (Network Time Protocol) at http://php.net/microtime:

All these timing scripts rely on microtime(), which relies on gettimeofday(2).

This can be inaccurate on servers that run ntp to synchronise the server’s time.

And there’s SNTP (Simple NTP), and Windows uses its own version: http://en.wikipedia.org/wiki/Network_Time_Protocol

Windows Time service

All Microsoft Windows versions since Windows 2000 and Windows XP include the Windows Time service (“W32Time”), which has the ability to sync the computer clock to an NTP server. The version in Windows 2000 and Windows XP only implements Simple NTP, and violates several aspects of the NTP version 3 standard. Beginning with Windows Server 2003 and Windows Vista, a compliant implementation of full NTP is included.

EDIT
When I did some code with microtime() a while back, there was more processing involved between “start” and “end”.
Rubble’s exec() would also be more resource-hungry than a simple for loop.

Maybe there’s no way to test “fast” code accurately?

Rubble, your case is different because you are measuring image manipulation, which takes almost 1/3 of a second, so it’s obvious you wouldn’t get 0.015625 or 0 - your conversion spans roughly 19 ticks of 0.015625 s, so the granularity barely shows. I’m measuring a for loop that executes much more quickly, and it looks like its duration is right around the borderline of microtime()’s accuracy.

Anyway, I’ve found this bug: Bug #64633 :: microtime regression - resolution reduced to 64 ticks per second. Due to some changes in the PHP code, the resolution of microtime() was reduced to 1/64 of a second, which is exactly 0.015625. So it looks like there’s no way around it on Windows systems, at least on Windows 7. They don’t mention anything about 32-bit vs 64-bit versions, so I’m not sure whether that makes any difference. They also mention there are different API calls on Windows 8, but it’s also uncertain whether the problem is resolved there.
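
By the way, to see what resolution microtime() actually gives on a particular machine, a quick probe like this can be used - just a sketch that records the smallest non-zero step the reported time makes:

$prev = microtime(true);
$minStep = null;

for ($i = 0; $i < 1000000; $i++) {
	$now = microtime(true);
	if ($now > $prev) {
		$step = $now - $prev;
		if ($minStep === null || $step < $minStep) {
			$minStep = $step;
		}
		$prev = $now;
	}
}

// on the affected Windows builds this should come out around 0.015625
echo $minStep;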

Good find. It seems 0.015625 isn’t very precise for something one would think would be. For humans it’s not noticeable, but for processors it sure is.
What is misleading is offering float precision when the data it’s working with isn’t that precise. Gads!

It seems to be true on Windows systems. Being able to time such small intervals is not critical, but sometimes it’s useful - for example, you could take a piece of code out of a large loop and benchmark it on its own without any loss of precision.

But this limitation has its side effects - for example, uniqid() suffers from the same lack of accuracy if not used with the more-entropy option - on my system I was able to get 313 identical uniqid() values in a loop!
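
Something like this quick check shows the effect - just a sketch, the loop count of 1000 is arbitrary:

$plain = array();
$entropy = array();

for ($i = 0; $i < 1000; $i++) {
	$plain[]   = uniqid();         // based only on the current time in microseconds
	$entropy[] = uniqid('', true); // second argument requests more entropy
}

echo (1000 - count(array_unique($plain))) . " duplicates without more entropy\n";
echo (1000 - count(array_unique($entropy))) . " duplicates with more entropy\n";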

Anyway, I doubt it’s impossible to get accurate microsecond time on Windows - I’m sure any OS needs far more accuracy than 1/64 s for its proper functioning! How many internal operations can be executed within 1/64 s? I suppose quite a lot on modern machines…
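
Out of curiosity, a crude way to get a feel for it is to count loop iterations until the reported time changes - most of the cost here is the microtime() call itself, so this only gives a lower bound (just a sketch):

$start = microtime(true);
$count = 0;

// spin until microtime() reports a new value, i.e. one clock tick passes
while (microtime(true) == $start) {
	$count++;
}

echo $count . " iterations within one tick\n";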

Couldn’t you use xDebug to profile your code? Create a very simple test.php that does what you want, profile it and look at the call hierarchy/stack?

I suppose that is an option, but here I wanted the simplest possible method without any dependencies on xDebug or other extensions, and I wanted it to be portable across different remote installations where I often don’t have xDebug.

Apart from that, doesn’t xDebug skew the results? I mean, if you turn profiling on, some additional execution must get in the way of the normal script flow to take the measurements. I once had a problem with a piece of PHP code running slowly because the xDebug extension was enabled (unfortunately, I don’t remember the details).

That’s fair enough. And a very good reason to rule it out.

It shouldn’t really skew the results, as the injection it does should be before and after, not during, the execution. You would likely see a decrease in execution speed from an end-user standpoint because it is collecting the additional data (if you were to leave it enabled). That is why I use extensions that let me quickly turn it on or off, so I’m not impacted by it unless I know I want it. Nonetheless, it shouldn’t affect the actual time it takes to execute the code block (but again, I’ll usually run the same code block a few times - 10, for example - and take the average after removing the first and last execution).
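
Roughly what I mean, as a sketch - timeBlock() here is just a hypothetical stand-in for whatever block is being measured:

function timeBlock() {
	$st = microtime(true);
	for ($i = 0; $i < 10000; $i++) {
		$a = $i;
	}
	return microtime(true) - $st;
}

$runs = array();
for ($r = 0; $r < 10; $r++) {
	$runs[] = timeBlock();
}

array_shift($runs); // drop the first execution
array_pop($runs);   // drop the last execution

echo array_sum($runs) / count($runs); // average of the remaining runs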

As for the problem at hand: I now wish I had a 32-bit machine to try out a few tests… Unfortunately, I only have 64-bit systems… I’ll have to see if I can find the time to build a 32-bit virtual machine, but it may be a while. Sorry.

Thanks, I haven’t really used xdebug profiling apart from testing it once or twice so I don’t have much experience with it. Can you set it up so that only lines from A to B are profiled in a given script?

But if you have a 64-bit Windows machine, it would also be interesting to know how it behaves, since from what I have found it’s not certain whether the bitness has anything to do with the problem. Could you run the simple code from my first post here and see what accuracy you get?

The code from the OP using Win7

0.0104820728302
0.0029540061950684
0.0029098987579346
0.002979040145874
0.0029270648956299
0.0029220581054688
0.002918004989624
0.002997875213623
0.002924919128418
0.0029458999633789

Pretty good results, thanks. Is this Win7 64-bit?

Edit: Is your php + apache installation 64-bit as well?

The OS is 64-bit, but both PHP (5.3.10) and Apache (2.2.22) are 32-bit.

Oh, so I’m afraid we have another variable: your PHP version is pretty old, and according to the bug report the problem began after bug #64370 was fixed, which happened in version 5.3.24 of the 5.3.x series. My PHP version is 5.3.26, so you are most probably still immune to this problem. So we still don’t know whether a 64-bit OS has anything to do with it…

Not that I know of. I’ve only used it to test the full execution of a process though, as I then go back and review it for out of place items.

  1. High method call times (a method taking over X milliseconds/seconds to run)
  2. High method executions (a method being executed 100+ times)
  3. Several methods being called repeatedly (is there a way to refactor or pass along a result instead of calling it each time it is needed)

Once I get through that list, I then get nit-picky and start to look at method execution sorted by time taken, and I’ll look at the top X percent and see if they can be improved any.

Yeah, I can do that. I’ll run it from work too tomorrow, so I get two different sets (as my work setup is far different than my home setup).

o.O interesting… I’ll definitely post my PHP version and I’ll take apache out of the mix by running it through PHP directly.

First Post Duration is the original code from your initial post; Second Post Duration is the code using bcsub.

PHP Version: 5.4.4-14+deb7u7

Through Apache 2.2.22:

First Post Duration: 0.0034639835357666
First Post Duration: 0.0045969486236572
First Post Duration: 0.0042259693145752
First Post Duration: 0.0035078525543213
First Post Duration: 0.0015749931335449
First Post Duration: 0.0019831657409668
First Post Duration: 0.0016257762908936
First Post Duration: 0.0016179084777832
First Post Duration: 0.0016410350799561
First Post Duration: 0.0016181468963623
Second Post Duration: 0.001704
Second Post Duration: 0.001578
Second Post Duration: 0.001615
Second Post Duration: 0.001597
Second Post Duration: 0.001598
Second Post Duration: 0.001599
Second Post Duration: 0.001571
Second Post Duration: 0.001477
Second Post Duration: 0.001992
Second Post Duration: 0.001521

Through PHP directly

First Post Duration: 0.0038580894470215
First Post Duration: 0.0039410591125488
First Post Duration: 0.0042870044708252
First Post Duration: 0.004223108291626
First Post Duration: 0.0042579174041748
First Post Duration: 0.0025999546051025
First Post Duration: 0.0015888214111328
First Post Duration: 0.0016119480133057
First Post Duration: 0.0015580654144287
First Post Duration: 0.0016000270843506
Second Post Duration: 0.001609
Second Post Duration: 0.003847
Second Post Duration: 0.003861
Second Post Duration: 0.004273
Second Post Duration: 0.003091
Second Post Duration: 0.001775
Second Post Duration: 0.001759
Second Post Duration: 0.001782
Second Post Duration: 0.001933
Second Post Duration: 0.001934

I switched to 5.3.26 and got mostly 0

0.015600919723511
0
.....
0.015599966049194
0
.....
0.01560115814209
0
......

EDIT
similar with 5.3.28

0.015599966049194
0
.....
0.015600204467773
0
.....
0.015599966049194
0
.....

EDIT
Version 5.4.23 seems even worse - a lot more 0s, and when it’s not 0, exactly the same value:

0
.....
0.015599966049194
0
.....
0.015599966049194
0
.....
0.015599966049194
0
.....