A few of my PHP pages can sometimes take up to a few minutes to load. While the browser is loading one of these pages, it seems to stop loading any other new pages in new tabs, and I’m curious why. If I had to hazard a guess, it’s because the PHP engine is busy dealing with the large page load. Any ideas on how I can make PHP process other pages while the big one is loading? I’m pretty sure I’ve seen this kind of behavior work correctly with Webmin. For example, when I’m updating my VPS using the Webmin update system, which sometimes takes up to 10 minutes, I’m still able to open new tabs of the Webmin area while the update does its thing. Perhaps this is an Apache thing? I’m not sure exactly what it is I’m talking about here, but I know it’s an issue and I know I want to solve it!
First, check whether it is really PHP that is slow by timing the script itself:

<?php
// first line of the script
$_script_started = microtime(true);

// ... the entire script runs here,
// possibly spread across several included files ...

// last lines of the script
$_page_time_seconds = microtime(true) - $_script_started;
if (isset($_GET['debug_mode'])) {
    echo $_page_time_seconds;
}
// and access your page with -- ...?debug_mode
Check Firebug for the static resources:
a) open Firebug (or press F12 in Firefox)
b) go to the Network tab
c) hit CTRL (or Command on Mac) + F5 (a refresh that bypasses the cache)
d) wait until the page finishes loading and check the number of requests and the total load time (shown in the bottom-right corner)
Check your PHP and Apache logs for errors. Run your page with error_reporting(E_ALL) so you can see all errors.
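If you can edit the page, a quick way to surface everything while debugging is something like this (assuming it is safe to turn display_errors on in your environment):

<?php
// Show every error, warning and notice while debugging
// (only do this on a development copy, not a public page).
error_reporting(E_ALL);
ini_set('display_errors', '1');
ini_set('log_errors', '1'); // keep writing to the PHP/Apache error log as well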
You might look into “profiling”. You can use tools such as XDebug or XHProf. There are also some PHP editors which include profiling features built in. These will help you analyze your code to find any bottlenecks.
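For example, if the xhprof extension happens to be installed, a rough sketch of profiling a whole request could look like this (the dump path is just an example):

<?php
// Start collecting CPU and memory data at the very top of the page
// (requires the xhprof PECL extension).
xhprof_enable(XHPROF_FLAGS_CPU + XHPROF_FLAGS_MEMORY);

// ... the rest of the slow page runs here ...

// At the very end, stop profiling and dump the raw data for later inspection.
$profilerData = xhprof_disable();
file_put_contents('/tmp/xhprof_' . uniqid() . '.json', json_encode($profilerData));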
Another option is to put long running code in a background process so that it does not delay the current page load.
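As a rough sketch (the worker script name is made up), the page could hand the slow work off to a separate PHP process and return straight away:

<?php
// Launch the long-running job as a separate process on Linux.
// dedupe_worker.php is a hypothetical script that holds the slow code.
exec('nohup php /path/to/dedupe_worker.php > /dev/null 2>&1 &');

// This page can now finish immediately while the worker runs on its own.
echo 'Duplicate check started in the background.';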
How many cores and how much memory does your VPS have? How much memory is your script consuming? How much memory do you have available for it? Web server optimization is a field unto itself. I know little about it. As a rule, you want to minimize database queries and file access. File access is very slow, and that includes database access as the data is stored in files. If you do not have enough RAM, your script will have to use the swap file and that can result in thrashing.
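To answer the memory question, the script can report its own peak usage, e.g. alongside the timing snippet above:

<?php
// At the end of the script, report peak memory usage in debug mode.
if (isset($_GET['debug_mode'])) {
    echo 'Peak memory: ' . round(memory_get_peak_usage(true) / 1048576, 2) . ' MB';
    echo ' (memory_limit is ' . ini_get('memory_limit') . ')';
}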
There are multiple ways to do this and there are quite a few tutorials on Google if you search “php background process”.
But I think you should first determine what is causing it to take so long to load, and to do this I would reference vectorialpx’s post or use one of the debug tools I mention above.
Sometimes you can, sometimes it’s not OK to do so. However, if you have a problem you should solve it, not hide it: those background actions will still take a long time, and the underlying problem remains.
After you know that your problem is PHP, you can dig into memory checks. For now, the first step is to find out what is taking so long. It can be Apache (or the OS), PHP, or some external resource. Again, check your logs; it’s very important to have clean logs.
I’m positive it’s just because I’m using PHP to do something that is better off done with some kind of shell command native to Linux. I’m checking 3,000 records in a database against the same 3,000 records, looking for duplicates across two different fields (barcode and item title), and also doing an extra check to see whether titles are similar using similar_text(). However, PHP is what I know right now, and I’m pretty sure I’ve seen it possible for a long-loading script not to hang the viewing of other pages on the website, in the same session in a different tab. It sounds like this background process thing might be the way to go.
I’m using a Xen VPS and my host has informed me that I’m completely entitled to do whatever I want with the system, provided I don’t open up a spam mail relay or run a torrent tracker.
I have 1GB of RAM, 100GB of hard drive space, 12MB of swap and 2 cores on the system, plus 10,000GB of bandwidth, if you guys were curious.
I think you can try to optimize the query (maybe add some indexes) or create a cron job that doesn’t interfere with the website. This is just an idea; you know better what you need.
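If you try the cron route, an entry along these lines would run the check completely outside any web request (the paths are placeholders):

# Run the duplicate check every night at 3am, outside the web server entirely.
# /path/to/dedupe.php is a placeholder for the script holding the slow code.
0 3 * * * php /path/to/dedupe.php >> /var/log/dedupe.log 2>&1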
Without knowing the full story, it appears as though your 3,000 records could be checked for duplicate titles, etc. using database tools, and it should only take a couple of seconds at most. Please supply further details of the 3,000 records.
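For instance, assuming a MySQL table (the table and column names below are guesses), the duplicate barcodes could be pulled out with a single GROUP BY query instead of comparing rows in PHP:

<?php
// Let the database find duplicate barcodes; the same idea works for the title column.
// "items" and "barcode" are guessed names -- adjust to the real schema.
$pdo = new PDO('mysql:host=localhost;dbname=your_db', 'user', 'pass');
$sql = 'SELECT barcode, COUNT(*) AS copies
        FROM items
        GROUP BY barcode
        HAVING COUNT(*) > 1';
foreach ($pdo->query($sql) as $row) {
    echo $row['barcode'] . ' appears ' . $row['copies'] . " times\n";
}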
I’m wondering if it would be possible to copy the data over to a new table that has the fields in question defined as unique, and log the errors when they’re not.
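A rough sketch of that idea in MySQL terms (table and column names are guesses) would be to create a copy with a unique key and let the database reject the duplicates:

-- Staging copy of the table with the field in question declared unique.
CREATE TABLE items_dedup LIKE items;
ALTER TABLE items_dedup ADD UNIQUE KEY uniq_barcode (barcode);

-- INSERT IGNORE skips rows that violate the unique key;
-- comparing the row counts afterwards shows how many duplicates were dropped.
INSERT IGNORE INTO items_dedup SELECT * FROM items;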
The similar_text() function sounds like a resource hog no matter how it’s approached.