Well, if you want to store that much data from frequent user interactions, you will increase the load on something either way. Still, an insert into the db will always be faster than sending the data by curl to a remote server: at the very least you can be sure (if everything is working properly) that your script finishes quickly. If there is network congestion, your simultaneous curl requests can stack up, leaving many concurrent PHP scripts active; you then risk exceeding server limits and, in the worst case, bringing your site down when you hit the maximum allowed number of concurrent PHP requests.
With the db you have some options for tuning it to minimize the overhead of your inserts: use MyISAM, tune InnoDB for faster but less safe inserts, or use the MEMORY engine. Or simply append your data to a file in any format you choose (for example, each insert being a separate line of JSON-encoded data) - this should perform very well.
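For the file approach, a minimal sketch could look like this (the log path and event fields are just placeholders, not anything specific to your setup):

```php
<?php
// Append one JSON-encoded event per line.
// FILE_APPEND avoids a read-modify-write cycle; LOCK_EX serializes
// concurrent writers so lines from parallel requests don't interleave.
function log_event(array $event): void
{
    $line = json_encode($event) . "\n";
    file_put_contents('/var/log/myapp/events.log', $line, FILE_APPEND | LOCK_EX);
}

log_event([
    'user_id' => 42,       // placeholder fields
    'action'  => 'click',
    'ts'      => time(),
]);
```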
With either of these methods you won't have any problem if the remote server is temporarily unresponsive for whatever reason: the cron job simply tries to send the data again a few minutes later, and that's it.
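A cron-driven sender could then work roughly like the sketch below (the file paths and the endpoint URL are assumptions). Renaming the log first lets new events keep accumulating in a fresh file while the batch is in flight, and keeping the batch file around on failure means the next run retries it automatically:

```php
<?php
// Hypothetical cron script: ship accumulated events, retry on failure.
$log   = '/var/log/myapp/events.log';     // assumed log path
$batch = '/var/log/myapp/events.sending'; // batch currently being shipped

// If a previous batch is still pending (an earlier send failed), retry it;
// otherwise rotate the current log so writers start on a fresh file.
if (!file_exists($batch)) {
    if (!file_exists($log) || !rename($log, $batch)) {
        exit; // nothing to send this run
    }
}

$ch = curl_init('https://stats.example.com/collect'); // assumed endpoint
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => file_get_contents($batch),
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_TIMEOUT        => 30,
]);
curl_exec($ch);
$ok = curl_getinfo($ch, CURLINFO_HTTP_CODE) === 200;
curl_close($ch);

if ($ok) {
    unlink($batch); // delivered; next run ships whatever accumulated since
}
// On failure the batch file stays in place and the next cron run retries it.
```

Scheduled via crontab, e.g. `*/5 * * * * php /path/to/send_events.php`, a temporary outage of the remote server just delays delivery by a run or two.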