Good afternoon from unseasonably sunny Palo Alto, California, where I am stuck on a PHP programming problem on my website. My site shows, in a table, a series of records from a MySQL database table that I also own. Each record in the database table has two fields besides the requisite ID, so the website shows two fields per row, one row per record.
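
For concreteness, my table boils down to something like this (the table and column names here are made up for the example; my real ones are more descriptive):

```php
<?php
// Hypothetical names: "records", "field_one", "field_two" stand in for
// my real table and columns. The unique key over the two fields is there
// so the same record can't end up in the table twice.
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8mb4', 'user', 'pass');
$pdo->exec("
    CREATE TABLE IF NOT EXISTS records (
        id        INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
        field_one VARCHAR(255) NOT NULL,
        field_two VARCHAR(255) NOT NULL,
        UNIQUE KEY uniq_record (field_one, field_two)
    ) ENGINE=InnoDB
");
```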

My database gets its data from about ten different public-domain websites, each of which displays information in a manner similar to mine, but each in its own slightly different way. My site effectively aggregates the records of these ten other sites. Each of the ten sites adds a few records each day. Once a record is added to one of the ten, it's valid in perpetuity; that is, the data never gets stale. None of the ten websites seems to have an RSS feed or API that would enable me to update my database automatically, so I've had to do it manually, cutting and pasting information from the ten other websites into my database. Not fun.
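
To make the question less abstract, here is roughly the shape of the scraper I have been imagining for one of the ten sites, using cURL plus DOMDocument. The URL and the XPath query are placeholders, and I assume each of the ten sites would need its own query since they all lay things out slightly differently. Is this the right general direction?

```php
<?php
// Fetch the raw HTML of one source page. The user agent and timeout
// values are just guesses at sensible defaults.
function fetch_html(string $url): string {
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_FOLLOWLOCATION => true,
        CURLOPT_USERAGENT      => 'MyAggregatorBot/1.0',
        CURLOPT_TIMEOUT        => 30,
    ]);
    $html = curl_exec($ch);
    if ($html === false) {
        throw new RuntimeException('cURL error: ' . curl_error($ch));
    }
    curl_close($ch);
    return $html;
}

// Pull the two fields out of each row. The XPath below is a placeholder
// that assumes the source page lists its records as table rows with two
// cells each; the real query would differ per site.
function extract_records(string $html): array {
    $doc = new DOMDocument();
    // Suppress warnings from messy real-world markup.
    @$doc->loadHTML($html);
    $xpath = new DOMXPath($doc);

    $records = [];
    foreach ($xpath->query('//table[@id="records"]/tr') as $row) {
        $cells = $xpath->query('td', $row);
        if ($cells->length >= 2) {
            $records[] = [
                trim($cells->item(0)->textContent),
                trim($cells->item(1)->textContent),
            ];
        }
    }
    return $records;
}
```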

Please, how should I think about using PHP to streamline this process? I have done a lot of research on cURL, screen scraping, and other methods, but none of it has quite clicked. A basic cure would be to find a way to automatically dump all the records from each of the ten sites into my database. An advanced cure would be to do that, plus set up some sort of continuous updating. If anyone could point me in the right direction, I would most certainly appreciate it. Thank you!
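
P.S. In case it helps frame the question, here is how I picture the "dump into my database" half fitting together, assuming the unique key from the table sketch above so that re-running the script never inserts duplicates. For the "continuous updating" half I imagine just running the whole script from cron once a day, but I would welcome better ideas. Again, the URL list and the helper functions are part of the sketch, not working code.

```php
<?php
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8mb4', 'user', 'pass', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

// INSERT IGNORE skips rows that collide with the UNIQUE KEY, so records
// already captured on an earlier run are simply left alone.
$stmt = $pdo->prepare(
    'INSERT IGNORE INTO records (field_one, field_two) VALUES (?, ?)'
);

// Placeholder: in reality there would be one URL (and one XPath query
// inside extract_records) for each of the ten sites.
$source_urls = ['https://example.org/listing'];

foreach ($source_urls as $url) {
    foreach (extract_records(fetch_html($url)) as [$one, $two]) {
        $stmt->execute([$one, $two]);
    }
}
```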