  1. #1
    SitePoint Zealot

    Database Synchronisation

    I have written a script to synchronise a given table across two or more databases. The script works fine, but performance becomes a huge issue when the table contains a large amount of data.

    The script works by first gathering all the data for the table from every database and merging it into one large PHP array. It then steps through that array and extracts the unique rows into a separate array. Finally, for each database, every row in the unique-rows array is inserted.
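
    In outline, the gather-and-merge steps look something like this (a simplified sketch assuming PDO; the connection details, table name and column names are placeholders):

    Code:
    <?php
    // One PDO connection per database being synchronised (details are placeholders).
    $connections = array(
        new PDO('pgsql:host=db1;dbname=app', 'user', 'pass'),
        new PDO('pgsql:host=db2;dbname=app', 'user', 'pass'),
    );

    // Gather every row of the table from every database into one big array.
    $allRows = array();
    foreach ($connections as $db) {
        $stmt = $db->query('SELECT * FROM mytable');
        foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
            $allRows[] = $row;
        }
    }

    // Keep only the unique rows, using the serialised row as the array key.
    $uniqueRows = array();
    foreach ($allRows as $row) {
        $uniqueRows[serialize($row)] = $row;
    }
    $uniqueRows = array_values($uniqueRows);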

    This last step is what takes the most time. Issuing 4,000+ individual INSERT statements is obviously not an efficient way to load data. I'm using PostgreSQL, and for this amount of data you would normally hop onto the console and use a COPY command; however, that isn't an option here, as I want this to be a PHP-only solution.
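
    The insert loop is essentially this (again a sketch; $connections and $uniqueRows come from the code above, and the table/column names are made up):

    Code:
    <?php
    // Current approach: one INSERT executed per unique row, per database.
    foreach ($connections as $db) {
        $stmt = $db->prepare('INSERT INTO mytable (col1, col2, col3) VALUES (?, ?, ?)');
        foreach ($uniqueRows as $row) {
            $stmt->execute(array_values($row)); // 4000+ separate statements per database
        }
    }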

    Any thoughts on improving the speed in this situation?

  2. #2
    SitePoint Wizard kyberfabrikken
    I'm not sure whether it's valid syntax in Postgres, but in MySQL you can insert multiple rows with a single INSERT statement, using the following syntax:
    Code:
    INSERT INTO tablename VALUES (col_a1, col_a2, col_a3), (col_b1, col_b2, col_b3);
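
    In PHP you could build one such statement from the array of unique rows along these lines (a rough sketch; $db is assumed to be a PDO connection and the table/column names are placeholders):

    Code:
    <?php
    // Build a single multi-row INSERT from the deduplicated rows.
    $valueLists = array();
    foreach ($uniqueRows as $row) {
        $quoted = array();
        foreach ($row as $value) {
            $quoted[] = $db->quote($value); // escape each value for use in SQL
        }
        $valueLists[] = '(' . implode(', ', $quoted) . ')';
    }
    $sql = 'INSERT INTO mytable (col1, col2, col3) VALUES ' . implode(', ', $valueLists);
    $db->exec($sql); // one statement per database instead of thousands

    If the statement gets unwieldy, you can split the rows into chunks of a few hundred per INSERT.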

  3. #3
    SitePoint Zealot
    That feature has only just been added in the latest version of PostgreSQL (8.2). Time for an update!

    Thanks mate, that should help immensely.

