The problem looks to be that you're downloading the remote content serially, i.e. one at a time: you have to wait for the first request to finish before starting the second, and so on. The key here is to make the requests in parallel: all of them (or in chunks) at the same time. That way the total time taken is (optimally) only the time of the single slowest request.
This can be done in various ways. You could spawn many instances of your script at the same time, each fetching from one URL only. There is also cURL's "multi" interface, which allows sending off and receiving many cURL requests simultaneously. It's a bit of a faff, but a good starting point is the curl_multi_exec() PHP manual page.
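For illustration, here's a minimal sketch of that pattern. The URL list, timeout, and the `fetch_all()` helper name are assumptions for this example, not part of your code:

```php
<?php
// Minimal sketch of the curl_multi pattern: fetch every URL in $urls
// in parallel and return the response bodies, keyed like the input.
function fetch_all(array $urls): array
{
    $mh = curl_multi_init();
    $handles = [];

    // One easy handle per URL, all attached to the same multi handle.
    foreach ($urls as $key => $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture the body instead of printing it
        curl_setopt($ch, CURLOPT_TIMEOUT, 30);          // assumed per-request timeout
        curl_multi_add_handle($mh, $ch);
        $handles[$key] = $ch;
    }

    // Drive all the transfers at once; curl_multi_select() blocks until
    // at least one handle has activity, so we don't busy-wait.
    do {
        $status = curl_multi_exec($mh, $active);
        if ($active) {
            curl_multi_select($mh);
        }
    } while ($active && $status === CURLM_OK);

    // Collect the results and tidy up.
    $results = [];
    foreach ($handles as $key => $ch) {
        $results[$key] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);

    return $results;
}

$bodies = fetch_all([
    'https://example.com/feed/1', // placeholder URLs
    'https://example.com/feed/2',
]);
```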
If you're going to be making ~150 requests pretty much simultaneously, you had better make sure that the content provider is really OK with it. That said, there's no reason why you can't artificially "slow" the requests (only send N requests at once, for example) and still be much faster than getting the URLs one at a time.
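With a helper like the `fetch_all()` sketched above, throttling to N concurrent requests is just a matter of chunking the list. The batch size of 10 here is an arbitrary example value:

```php
<?php
// Throttled variant: only $batchSize requests are in flight at once.
// Relies on the fetch_all() helper sketched above.
$batchSize = 10;
$results = [];
foreach (array_chunk($allUrls, $batchSize, true) as $batch) {
    $results += fetch_all($batch); // merge, preserving keys across batches
}
```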