Resizing 1000 images in a directory at once - would the server be able to cope?


I’ve got 1000 images I need to resize, then make a thumbnail version of each, add the image name to a database, and I’m also thinking about producing a desaturated version.

Is this something the server could cope with, or should I put some kind of limit on the number of files processed?

By the way, I’m using shared hosting.



No problem, I’ll limit them to batches of about 50; it shouldn’t take too long to get them all done.

Cheers, the images are JPEGs of around 120–140 KB, and I’ll be using PHP’s GD functions.
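For images that size, the per-image work with GD might look something like this (directory names, target sizes and JPEG quality are invented for illustration; assumes the `resized/`, `thumbs/` and `grey/` directories already exist):

```php
<?php
// Sketch of the per-image work with GD: one resized copy, one thumbnail,
// and one desaturated copy. Sizes and paths are placeholder assumptions.
function processImage(string $src): void
{
    $img = imagecreatefromjpeg($src);
    $w = imagesx($img);
    $h = imagesy($img);

    // Main resized version, capped at 800px wide
    $newW = 800;
    $newH = (int) round($h * $newW / $w);
    $resized = imagecreatetruecolor($newW, $newH);
    imagecopyresampled($resized, $img, 0, 0, 0, 0, $newW, $newH, $w, $h);
    imagejpeg($resized, 'resized/' . basename($src), 85);

    // Thumbnail, 150px wide
    $tW = 150;
    $tH = (int) round($h * $tW / $w);
    $thumb = imagecreatetruecolor($tW, $tH);
    imagecopyresampled($thumb, $img, 0, 0, 0, 0, $tW, $tH, $w, $h);
    imagejpeg($thumb, 'thumbs/' . basename($src), 85);

    // Desaturated copy, reusing the already-resized version
    imagefilter($resized, IMG_FILTER_GRAYSCALE);
    imagejpeg($resized, 'grey/' . basename($src), 85);

    // Free the GD resources; this matters when looping over many files
    imagedestroy($img);
    imagedestroy($resized);
    imagedestroy($thumb);
}
```

Calling `imagedestroy()` inside the loop is important here, otherwise memory use climbs with every image processed.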

Can you not resize them locally, i.e. not on the server?

If you’re intent on having the web server do it, process them a few at a time. Use the absence of the image name in the database as an indicator of whether to process the image.
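One possible shape for that (the function name, table schema and paths are invented; SQLite is used here only so the sketch is self-contained — swap in whatever database you actually use):

```php
<?php
// Process at most $limit JPEGs whose filename is not yet in the images
// table, recording each one so the next run skips it. The "absent from
// the DB = not yet processed" rule is the batching indicator.
function runBatch(PDO $pdo, string $dir, int $limit = 50): int
{
    $pdo->exec('CREATE TABLE IF NOT EXISTS images (filename TEXT PRIMARY KEY)');
    $done = $pdo->query('SELECT filename FROM images')
                ->fetchAll(PDO::FETCH_COLUMN);
    $insert = $pdo->prepare('INSERT INTO images (filename) VALUES (?)');

    $count = 0;
    foreach (glob($dir . '/*.jpg') as $path) {
        $name = basename($path);
        if (in_array($name, $done, true)) {
            continue; // already handled on an earlier run
        }
        // resize / thumbnail / desaturate with GD here
        $insert->execute([$name]);
        if (++$count >= $limit) {
            break; // play nice with the shared host
        }
    }
    return $count; // how many were handled this run
}
```

Run it repeatedly (manually, from a cron, or from the JS/PHP loop suggested below in the thread) until it returns 0.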

There are a bucketload of things that may affect the server’s ability to cope; just try to play nice and apply some common sense. :wink:

Probably, but I’ll play safe and say that we need more info.

What are the images being resized (consider 20-megapixel RAW images versus 20 KB JPEGs)? Will this need to be done repeatedly, or is it a one-off? What were you hoping to use for the resizing (GD2, ImageMagick, something else)? Can you do the job locally and push the finished files to the shared server? It’s details like these that we need in order to give you a yay or nay.

If in doubt, build a JavaScript/PHP loop.

The PHP divides the task into groups of 50. When it completes a group, it sends a response back to the browser with an embedded JavaScript call to do the next batch. These calls echo back and forth until the task is done.
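The PHP half of that loop could be sketched like this (the function name, page name and query parameter are all invented; the GD work itself is elided):

```php
<?php
// Each request handles one slice of the file list; the returned HTML
// contains a JavaScript redirect that triggers the next slice, so the
// browser and server bounce the work back and forth until it's done.
function batchResponse(int $offset, array $files, int $size = 50): string
{
    $slice = array_slice($files, $offset, $size);
    foreach ($slice as $path) {
        // resize / thumbnail / desaturate with GD here
    }

    $doneSoFar = $offset + count($slice);
    if ($doneSoFar < count($files)) {
        // More batches remain: bounce back to the browser, which calls us again
        $next = $offset + $size;
        return "<p>Processed {$doneSoFar} of " . count($files) . "…</p>"
             . "<script>location.href='resize.php?offset={$next}';</script>";
    }
    return '<p>All done.</p>';
}

// In resize.php you would wire it up with the request's offset, e.g.:
// echo batchResponse((int) ($_GET['offset'] ?? 0), glob('uploads/*.jpg'));
```

Because each batch is a fresh HTTP request, no single request runs long enough to hit the shared host’s execution-time limit.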

If you are doing this as a cron job, you can do better by dividing the task in the first process and then forking off the actual builds to separate processes. This would be even faster, since the OS can put the scripts onto different cores of the server, as the task is parallel in nature. The only real bottleneck is disk access.
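A rough sketch of that fork approach, for a CLI/cron run (the worker count and paths are arbitrary assumptions; this needs the `pcntl` extension, which is only available on the command line, not under a web server):

```php
<?php
// Divide the file list into one chunk per worker process.
function splitWork(array $files, int $workers): array
{
    return $files
        ? array_chunk($files, (int) ceil(count($files) / $workers))
        : [];
}

$files = glob('uploads/*.jpg') ?: [];
$pids  = [];

foreach (splitWork($files, 4) as $chunk) {
    $pid = pcntl_fork();
    if ($pid === 0) {
        // Child: process only its own chunk, then exit.
        foreach ($chunk as $path) {
            // resize / thumbnail / desaturate with GD here
        }
        exit(0);
    }
    $pids[] = $pid; // parent keeps forking the remaining chunks
}

// Parent waits for every child before finishing the cron run.
foreach ($pids as $pid) {
    pcntl_waitpid($pid, $status);
}
```

With four workers the OS can schedule each child on its own core; past that point, as noted above, disk access becomes the bottleneck rather than CPU.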