I currently maintain six websites for clients that run small businesses. The sites share the same backend code. I'm looking for a way to gather information from these sites remotely and pass it back to me, so I don't have to visit each site manually to collect it. I can write a PHP script, place it on each site, and have it gather and serialize the information, but I'm looking for a clean way to call these scripts from my own site and log the data.
For example, the script could reside at http://clients-web-site.com/script/getinfo.php. Would it be best to use cURL to call that script from my site and then parse the data on my end?
Are there any other ways that would be better?
My vote goes to cURL. Password-protect your remote data endpoint, then use cURL locally; it will log in and retrieve the data.
+1 for cURL and password protection. I would also consider JSON for the data transport. Additionally, you could compress the JSON with gzip or bzip2 to save bandwidth.
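To illustrate the JSON-plus-compression idea, here is a sketch of what the remote getinfo.php might do. The field names and values are placeholders; a real script would query the site's database:

```php
<?php
// Hypothetical getinfo.php — a sketch, not the asker's actual backend code.
// Gathers some stats, JSON-encodes them, and gzip-compresses the payload.
function build_payload(array $stats): string
{
    $json = json_encode($stats);  // easy to parse on the collecting side
    return gzencode($json, 9);    // gzip at maximum compression
}

// Placeholder stats; replace with real queries against the site's data.
$stats = ['orders' => 12, 'users' => 340, 'version' => '1.4'];

header('Content-Type: application/json');
header('Content-Encoding: gzip');
echo build_payload($stats);
```

The collecting side can then gzdecode() the response body and json_decode() the result.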
+1 for cURL
I'd also put a .htaccess file in the directory the script lives in, allowing access from your IP only:
deny from all
allow from my.ip.add.res
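Note that deny/allow is the Apache 2.2 syntax. On Apache 2.4 and later, the equivalent restriction is:

Require ip my.ip.add.res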
Thanks for the tips. One more question: can I pass the password (encrypted, of course) to the remote script through cURL?
Yes. You can send HTTP Basic auth credentials like this:

$curl = curl_init('http://clients-web-site.com/script/getinfo.php');
curl_setopt($curl, CURLOPT_HTTPAUTH, CURLAUTH_BASIC);
curl_setopt($curl, CURLOPT_USERPWD, "username:password");
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
$data = curl_exec($curl);
curl_close($curl);

Keep in mind that Basic auth only base64-encodes the credentials rather than encrypting them, so serve the script over HTTPS if you need them kept private.
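Putting the pieces together, here is a hedged sketch of the collecting side: it polls each client site, decompresses and decodes the payload, and logs it. The URLs, credentials, and the assumption that the endpoint returns gzipped JSON are all placeholders:

```php
<?php
// Hypothetical collector — placeholder URLs and credentials.
// Polls each client site, gunzips and decodes the JSON, and logs the result.
function fetch_site_info(string $url, string $userpwd): ?array
{
    $curl = curl_init($url);
    curl_setopt($curl, CURLOPT_HTTPAUTH, CURLAUTH_BASIC);
    curl_setopt($curl, CURLOPT_USERPWD, $userpwd);
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, true); // return body instead of printing it
    curl_setopt($curl, CURLOPT_TIMEOUT, 5);           // don't hang on an unreachable site
    $body = curl_exec($curl);
    curl_close($curl);
    if ($body === false) {
        return null;                 // network or auth failure
    }
    $json = gzdecode($body);         // assumes the endpoint gzips its JSON payload
    return $json === false ? null : json_decode($json, true);
}

$sites = [
    'http://clients-web-site.com/script/getinfo.php',
    // ...the other five sites...
];

foreach ($sites as $url) {
    $info = fetch_site_info($url, 'username:password');
    if ($info !== null) {
        // Log however you like; error_log() is the simplest option.
        error_log($url . ': ' . json_encode($info));
    }
}
```

Run it from cron on your own server and you never have to visit the sites manually.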