I have written a script to synchronise a given table across two or more databases. The script works fine, but performance becomes a serious issue when the table contains a large amount of data.
The script works by first gathering all the data from every database's copy of the table and merging it into one large PHP array. It then steps through that array and extracts the unique rows into a separate array. Finally, for each database, every row in the unique-data array is inserted individually.
This last step is what takes the most time. Potentially 4000+ individual INSERT statements are obviously not an efficient way to load data. I'm using PostgreSQL, and for this amount of data you would normally hop onto the console and use a COPY command, but that isn't an option here as I want this to be a PHP-only solution.
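To illustrate, the slow part is roughly equivalent to the loop below. This is only a simplified sketch: I'm assuming PDO here, and the connection details, table name (sync_table) and column names are placeholders, not my real schema.

```php
<?php
// Simplified sketch of the current approach (PDO assumed; the real script differs).
// One PDO connection per target database.
$databases = [
    new PDO('pgsql:host=localhost;dbname=db_one', 'user', 'pass'),
    new PDO('pgsql:host=localhost;dbname=db_two', 'user', 'pass'),
];

// $uniqueRows is the merged, de-duplicated data built earlier in the script.
$uniqueRows = [
    ['id' => 1, 'name' => 'foo'],
    ['id' => 2, 'name' => 'bar'],
    // ... potentially 4000+ rows
];

foreach ($databases as $db) {
    // One prepared statement, but still executed once per row --
    // this inner loop is where almost all of the time is spent.
    $stmt = $db->prepare('INSERT INTO sync_table (id, name) VALUES (:id, :name)');
    foreach ($uniqueRows as $row) {
        $stmt->execute([':id' => $row['id'], ':name' => $row['name']]);
    }
}
```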
Any thoughts on improving the speed in this situation?