Hi,
I am currently working on a website that will rely solely on XML files for its content (display only) - listings, filtering, etc. Each XML file corresponds to one record. These files are compressed into a single zip file per day on another FTP server, which I then download. This means thousands of XML files are being generated.
I'd like to ask about the best way to handle this data. Is it faster to write a script that imports the records into a database first, then check daily for a new zip file and insert its contents? Or should I read the data directly from the XML files?
With either method, I assume I would need to extract the zip files first?
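To make the first option concrete, here is a minimal sketch of the daily-import idea in Python, assuming SQLite and a hypothetical record layout like `<record><id>…</id><title>…</title></record>` (the actual XML schema would differ):

```python
import sqlite3
import zipfile
import xml.etree.ElementTree as ET

def import_zip(zip_path, db_path):
    """Read each XML record from the daily zip and upsert it into SQLite."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS records (id TEXT PRIMARY KEY, title TEXT)"
    )
    with zipfile.ZipFile(zip_path) as zf:
        for name in zf.namelist():
            if not name.endswith(".xml"):
                continue
            # Parse the entry in memory -- no need to extract it to disk first
            root = ET.fromstring(zf.read(name))
            conn.execute(
                "INSERT OR REPLACE INTO records (id, title) VALUES (?, ?)",
                (root.findtext("id"), root.findtext("title")),
            )
    conn.commit()
    conn.close()
```

Note that `ZipFile.read()` lets the script parse each XML entry straight from the archive in memory, so a full extraction to disk isn't strictly necessary for the import approach.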
Your input is very much appreciated.
Thanks!