I’m writing a script that fetches a few thousand images from Google. I have all the URLs stored in an array. Here’s my code that downloads them all and puts them into a “mosaic”:
function get_image($url) {
    // Fetch the raw image bytes with cURL and turn them into a GD image resource.
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 0); // 0 = wait indefinitely for the connection
    $fileContents = curl_exec($ch);
    curl_close($ch);
    if ($fileContents === false) {
        return false; // download failed
    }
    return imagecreatefromstring($fileContents);
}
$image_number = 0;
for ($row = 0; $row < $rows; $row++) {
    for ($col = 0; $col < $columns; $col++) {
        $copy_image = get_image($url_array[$image_number]);
        if (!$copy_image) {
            // Fall back to the previous URL if this one couldn't be loaded.
            $copy_image = get_image($url_array[$image_number - 1]);
        }
        // Crop to a square by using the shorter side for both source dimensions.
        $copy_image_width  = imagesx($copy_image);
        $copy_image_height = imagesy($copy_image);
        if ($copy_image_width > $copy_image_height) {
            $copy_image_width = $copy_image_height;
        } else {
            $copy_image_height = $copy_image_width;
        }
        // Resample the square crop into its cell on the mosaic canvas.
        imagecopyresampled($final_image, $copy_image, $size * $col, $size * $row, 0, 0, $size, $size, $copy_image_width, $copy_image_height);
        imagedestroy($copy_image); // free the tile before fetching the next one
        $image_number++;
    }
}
Basically, this code goes through my URL array, calls get_image() (which uses cURL to fetch each image from Google), then crops/resizes each image to match the others and pastes it onto my canvas, $final_image.
Fetching a few thousand images takes quite a while, and most of the script’s run time is spent inside the get_image() calls.
Is there a way I can speed this process up?
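One direction I’ve been looking at (just a sketch of what I have in mind, not something I’ve tested) is using PHP’s curl_multi_* functions to run several downloads concurrently instead of one at a time, and only then building the GD images from the results. Something along these lines, where the idea would be to call it on smaller batches of URLs rather than all few thousand at once:

function get_images_parallel($urls) {
    // Register one easy handle per URL on a multi handle.
    $mh = curl_multi_init();
    $handles = array();
    foreach ($urls as $i => $url) {
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($ch, CURLOPT_TIMEOUT, 30); // assumed per-transfer timeout
        curl_multi_add_handle($mh, $ch);
        $handles[$i] = $ch;
    }
    // Drive all transfers until they are finished.
    do {
        $status = curl_multi_exec($mh, $active);
        if ($active) {
            curl_multi_select($mh); // wait for network activity instead of busy-looping
        }
    } while ($active && $status == CURLM_OK);
    // Collect the results and convert them to GD images.
    $images = array();
    foreach ($handles as $i => $ch) {
        $contents = curl_multi_getcontent($ch);
        $images[$i] = $contents ? imagecreatefromstring($contents) : false;
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $images;
}

The plan would be to fetch, say, a few dozen URLs per batch with something like this and then run my existing crop/resample loop over the returned images, but I’m not sure whether that’s the right approach or if there’s a better way to overlap the downloads.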