Hundreds of Curl calls as part of same function?

I built an API that sends data to Twilio from a WordPress site. The purpose is to send SMS messages to registered users. The API relies on Twilio Studio, which only accepts a single phone number/recipient per request.

So, for it to work, I would have to foreach over an array of users (possibly hundreds or thousands?). Is making that many individual curl calls at once going to cause any issues?

Well, that is certainly limiting, but I can see why it would be implemented like that. Sending multiple curl requests shouldn't be too much of a problem as long as the server can handle the connections. I would also suggest looking at curl_multi_init() and its related curl_multi_exec() function to queue up and run multiple curl handles concurrently.
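A minimal sketch of that approach, assuming a helper that takes an array of URLs and runs them all through one multi handle (the function name and the ten-second timeout are my own choices, not anything Twilio-specific):

```php
// Minimal curl_multi sketch: run several requests concurrently.
// Returns an array of response bodies keyed the same as $urls.
function run_concurrent(array $urls): array
{
    $mh = curl_multi_init();
    $handles = [];

    foreach ($urls as $key => $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 10); // don't hang on slow endpoints
        curl_multi_add_handle($mh, $ch);
        $handles[$key] = $ch;
    }

    // Drive all handles until every transfer has finished.
    do {
        $status = curl_multi_exec($mh, $running);
        if ($running) {
            curl_multi_select($mh); // wait for activity instead of busy-looping
        }
    } while ($running && $status === CURLM_OK);

    $results = [];
    foreach ($handles as $key => $ch) {
        $results[$key] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);

    return $results;
}
```

The transfers all run in parallel, so total wall time is roughly that of the slowest request rather than the sum of all of them.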

Might be worth taking a look, but I would consider batching the calls and handling them batch by batch rather than just running a single loop for thousands of iterations.
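The batching idea can be sketched like this, with array_chunk doing the splitting; send_batch is a hypothetical callback that fires the actual curl requests for one batch:

```php
// Sketch: split the recipient list into fixed-size batches and process
// each batch separately, so neither your server nor the remote API's
// rate limits get hammered by one giant burst.
function process_in_batches(array $users, int $batchSize, callable $send_batch): int
{
    $batches = array_chunk($users, $batchSize);
    foreach ($batches as $batch) {
        $send_batch($batch); // e.g. a curl_multi run over this batch only
        // sleep(1);         // optional pause between batches for rate limits
    }
    return count($batches); // how many batches were processed
}
```

A batch size of a few dozen also makes failures easier to localize: you know which batch broke instead of hunting through thousands of iterations.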

Well, to be fair, a Twilio Studio Flow was designed to handle a single request at a time, not to blast a bunch of messages to multiple people (though in this use case the recipients would have opted in to a newsletter).

It is possible to send the messages via Twilio without Twilio Studio, but that apparently requires the Twilio-specific SDK as a Composer package, which users won't be able to install if they're using shared hosting.

I would recommend offloading that processing to a server other than the one your application is running on. Ideally use a queuing solution, which gives you more fault tolerance, resilience, parallel scalability, and recovery than making one large request in a single thread. Processing messages individually or in small batches will reduce load and make it easier to troubleshoot the cause of failures.
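An illustrative sketch of the queue shape only: in production this would be Redis, RabbitMQ, SQS, or similar running near a separate worker box, but a plain array stands in here so the enqueue/reserve flow is visible (the class and method names are mine):

```php
// Toy queue: the web app enqueues phone numbers quickly and returns;
// a separate worker process reserves small batches and does the slow
// curl work, so failures are isolated and retryable per batch.
class SmsQueue
{
    private array $jobs = [];

    public function enqueue(string $phoneNumber): void
    {
        $this->jobs[] = $phoneNumber;
    }

    // Pop up to $batchSize jobs; a worker would call this in a loop
    // until pending() reaches zero.
    public function reserve(int $batchSize): array
    {
        return array_splice($this->jobs, 0, $batchSize);
    }

    public function pending(): int
    {
        return count($this->jobs);
    }
}
```

The key property is that the web request finishes as soon as the jobs are enqueued; a crash mid-send loses at most one reserved batch, not the whole run.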

I used curl_multi_exec(…) to check the HTTP response codes of at least fifty URLs and was amazed at the virtually instant results… apart from invalid URLs, which each took about five seconds to resolve.
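Those slow failures can usually be capped with per-handle timeouts; a sketch, where the three-second values are arbitrary choices, not defaults:

```php
// Sketch: cap how long one handle may spend connecting or transferring,
// so dead/invalid hosts fail fast instead of stalling for ~5 seconds.
function make_handle_with_timeouts(string $url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 3); // max seconds to connect (incl. DNS)
    curl_setopt($ch, CURLOPT_TIMEOUT, 3);        // max seconds for the whole transfer
    curl_setopt($ch, CURLOPT_NOBODY, true);      // HEAD-style: we only want the status code
    return $ch;
}
```

Handles built this way can be added straight into a curl_multi loop, and after the run curl_getinfo($ch, CURLINFO_HTTP_CODE) yields each status code.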