Haven't gotten a response on this, so I came to my own conclusions in case they help someone else.
First, I timed inserting 1,000 records with individual INSERT queries against appending the same 1,000 records to a log file with file_put_contents() and the FILE_APPEND flag. Writing to the log file was MUCH faster: roughly 10 seconds for the queries versus about 1 second for the flat file.
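For anyone curious, the fast path is just a one-line append per record. This is a minimal sketch; the file path and record fields are placeholders:

```php
<?php
// Sketch: append each record as one tab-delimited line to a flat
// log file. The path and fields here are just examples.
$logFile = sys_get_temp_dir() . '/pending.log';
$record  = ['user42', 'page/view', time()];

// FILE_APPEND adds to the end of the file instead of overwriting it;
// LOCK_EX guards against interleaved writes from concurrent requests.
file_put_contents(
    $logFile,
    implode("\t", $record) . "\n",
    FILE_APPEND | LOCK_EX
);
```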
I'm combining the flat file with MySQL's LOAD DATA INFILE (which is extremely fast) to move the stored data into the database. I'll run this as a cron job, which gives the added benefit of scalability as the load increases (I can run it every minute for now, and increase the interval between runs if I need to).
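The core of the cron job is a single statement like the one below. The file path, table name, and column names are assumptions; adjust the FIELDS/LINES terminators to match however the log file is written:

```sql
-- Bulk-load the accumulated tab-delimited log in one pass.
-- Path, table, and columns are hypothetical placeholders.
LOAD DATA INFILE '/var/log/myapp/pending.log'
INTO TABLE access_log
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
(user_id, action, created_at);
```

One practical note: have the cron job rename the log file before loading it, so new requests append to a fresh file while the old one is being imported.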
My biggest concern at this point is malformed data being run through the LOAD DATA INFILE statement and causing it to fail. I'll have to add some error checking and alert-email functionality in case something goes wrong.
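One way to do that error checking is to pre-scan the log before handing it to LOAD DATA INFILE, quarantining rows that don't have the expected field count. A rough sketch, assuming three tab-delimited fields per row (the field count, sample data, and email address are all placeholders):

```php
<?php
// Hedged sketch: split pending log lines into well-formed and
// malformed rows so one bad line can't abort the whole load.
function splitLogLines(array $lines, int $expectedFields): array
{
    $good = $bad = [];
    foreach ($lines as $line) {
        if (count(explode("\t", $line)) === $expectedFields) {
            $good[] = $line;
        } else {
            $bad[] = $line; // quarantine for the alert email
        }
    }
    return [$good, $bad];
}

// Example input; in the cron job this would come from
// file('/path/to/pending.log', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES).
$lines = ["user42\tpage/view\t1700000000", "broken line"];
[$good, $bad] = splitLogLines($lines, 3);

if ($bad) {
    // Alert so malformed rows get investigated instead of silently dropped;
    // the address is a placeholder.
    mail('admin@example.com', 'Malformed log rows found', implode("\n", $bad));
}
// Only $good rows get written back out for LOAD DATA INFILE to consume.
```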
Any replies or comments on this would still be appreciated.