Is it Possible to Make this "Get Image" Code Any Faster?

I’m writing a script that fetches a few thousand images from Google. I have all the URLs stored in an array. Here’s the code that gets them all and puts them into a “mosaic”:


function get_image($url) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // return the body instead of printing it
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 0); // 0 = wait indefinitely for the connection
    $fileContents = curl_exec($ch);
    curl_close($ch);
    if ($fileContents === false) {
        return false; // download failed; let the caller handle it
    }
    return imagecreatefromstring($fileContents);
}
$image_number = 0;
for ($row = 0; $row < $rows; $row++) {
    for ($col = 0; $col < $columns; $col++) {
        $copy_image = get_image($url_array[$image_number]);
        if (!$copy_image && $image_number > 0) {
            // Fall back to the previous URL if this image failed to load.
            $copy_image = get_image($url_array[$image_number - 1]);
        }
        if ($copy_image) {
            // Crop to a square (from the top-left) by using the shorter side for both source dimensions.
            $copy_image_width = imagesx($copy_image);
            $copy_image_height = imagesy($copy_image);
            if ($copy_image_width > $copy_image_height) {
                $copy_image_width = $copy_image_height;
            } else {
                $copy_image_height = $copy_image_width;
            }
            imagecopyresampled($final_image, $copy_image, $size * $col, $size * $row, 0, 0, $size, $size, $copy_image_width, $copy_image_height);
        }
        $image_number++;
    }
}

Basically, this code walks through my URL array, calls “get_image()” (which uses cURL to fetch each image from Google), crops/resizes each image to match the others, and pastes it onto my “canvas”, $final_image.

Fetching a few thousand images takes quite a while, and most of the script’s time is spent inside the “get_image()” calls.

Is there a way I can speed this process up?

The best way to speed it up would be to copy the images down to your server once and build the mosaic from those local files each time, rather than spending so much time going to an external site. Unless your URL array changes -completely- every time you run this script, reading local image files would speed things up considerably.
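
For illustration, a minimal caching sketch along those lines (the cache/ directory and the md5-based filenames are my own assumptions, not part of your code):

function get_image_cached($url) {
    $cache_file = 'cache/' . md5($url) . '.img';
    if (file_exists($cache_file)) {
        // Reuse the local copy instead of hitting the remote server again.
        return imagecreatefromstring(file_get_contents($cache_file));
    }
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    $fileContents = curl_exec($ch);
    curl_close($ch);
    if ($fileContents === false) {
        return false;
    }
    file_put_contents($cache_file, $fileContents); // keep a copy for future runs
    return imagecreatefromstring($fileContents);
}

That way each URL is only ever downloaded once, and every later run reads it straight from disk.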

If you want to make it faster, you’d have to utilise threading. Threading in PHP is a dodgy area, and I’ve yet to see a good version of it.

What I’d really recommend is doing this task in a language which supports threading (C++, C# and Java are the three that come to mind): download the images you need simultaneously, do the resizing simultaneously, and save everything to a folder when done.

You could then call that application from PHP via the command line.
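
A rough sketch of what that call might look like (the ./downloader binary and its arguments are hypothetical):

// Hand the URL list to a hypothetical external downloader, then build the mosaic from its output.
$urls_file = tempnam(sys_get_temp_dir(), 'urls');
file_put_contents($urls_file, implode("\n", $url_array));
exec('./downloader ' . escapeshellarg($urls_file) . ' images/', $output, $status);
if ($status !== 0) {
    die('Downloader failed');
}
// The mosaic loop can then read the finished files from images/ instead of fetching them remotely.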

I don’t think Google would be too happy about you using their services inside your application, so check out their T&Cs.

Thanks for the reply! While googling a bit about threading, I found the PHP curl_multi functions.

Do you know if these would give the same speed as the threading you’re talking about?
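
From the manual examples, here’s roughly what I’m looking at (an untested sketch; with a few thousand URLs I’d presumably feed them in in smaller batches rather than opening them all at once):

// Fetch a batch of URLs concurrently with curl_multi.
function get_images_multi($urls) {
    $mh = curl_multi_init();
    $handles = array();
    foreach ($urls as $i => $url) {
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        curl_multi_add_handle($mh, $ch);
        $handles[$i] = $ch;
    }
    // Run all the transfers at once and wait until they finish.
    $running = null;
    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh);
    } while ($running > 0);
    $images = array();
    foreach ($handles as $i => $ch) {
        $contents = curl_multi_getcontent($ch);
        $images[$i] = $contents ? imagecreatefromstring($contents) : false;
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $images;
}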

And, my URL array does change completely each time =/