Hi! I have a URL with an XML file. How can I copy the whole code to a file with PHP? I am using SimpleXML but I haven’t understood how to do this yet.
First, download and save the file to your server with cURL:

set_time_limit(0);

// Open a local file for writing, then let cURL stream the
// response body straight into it.
$fp = fopen('./a.xml', 'w+');
$ch = curl_init('http://www.yoursite.com/yourxml.xml'); // or any URL that returns the XML file
curl_setopt($ch, CURLOPT_TIMEOUT, 50);
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_exec($ch);
curl_close($ch);
fclose($fp);
Now you can parse the XML with the SimpleXML library.
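For example, once a.xml has been saved, parsing it could look like this. It's a minimal sketch: the sample document is written out first just so the snippet runs on its own, and the &lt;item&gt;/&lt;title&gt; element names are placeholders for whatever your feed actually contains.

```php
<?php
// Create a small sample file so the example is self-contained;
// in your case './a.xml' already exists from the cURL download.
$sample = '<?xml version="1.0"?>'
        . '<items><item><title>First</title></item>'
        . '<item><title>Second</title></item></items>';
file_put_contents('./a.xml', $sample);

// Load the saved file into a SimpleXMLElement tree.
$xml = simplexml_load_file('./a.xml');
if ($xml === false) {
    die('Failed to parse a.xml');
}

// <item> and <title> are placeholder element names --
// substitute the ones in your feed.
foreach ($xml->item as $item) {
    echo (string) $item->title, "\n";
}
```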
Thank you for your help! It works perfectly if I only use one URL, but I still have some problems trying to make it work with multiple URLs:
set_time_limit(0);
$fp = fopen($path, 'w+');
for ($i = 0; $i < count($array_url); $i++) {
    $url = 'http://whatever' . $array_url[$i] . '.xml';
    $xml = simplexml_load_file($url);
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_TIMEOUT, 50);
    curl_setopt($ch, CURLOPT_FILE, $fp);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_exec($ch);
}
curl_close($ch);
fclose($fp);
It doesn’t write everything to the file specified by $path.
Look into $mh = curl_multi_init();
Search for curl_multi_init in the PHP manual for more info.
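A sketch of what that could look like, assuming the same $array_url and URL pattern as in your loop (the values here are examples), and writing each response to its own local file:

```php
<?php
// Sketch: download several XML files in parallel with curl_multi.
$array_url = array('feed1', 'feed2'); // example values

$mh      = curl_multi_init();
$handles = array();
$files   = array();

foreach ($array_url as $i => $part) {
    $url = 'http://whatever' . $part . '.xml';
    $fp  = fopen('./file' . $i . '.xml', 'w+');
    $ch  = curl_init($url);
    curl_setopt($ch, CURLOPT_TIMEOUT, 50);
    curl_setopt($ch, CURLOPT_FILE, $fp);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
    $files[]   = $fp;
}

// Drive all transfers until every handle has finished.
do {
    $status = curl_multi_exec($mh, $active);
    if ($active) {
        curl_multi_select($mh); // wait for activity instead of busy-looping
    }
} while ($active && $status == CURLM_OK);

// Clean up every handle and file pointer.
foreach ($handles as $ch) {
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
foreach ($files as $fp) {
    fclose($fp);
}
```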
I also don't see why you can't make use of what you already have. Simply run the first cURL request, let it do what it needs to, close the connection, and then run another cURL request to a different URL.
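Concretely, the loop from the question would just move the handle cleanup inside each iteration and drop the redundant simplexml_load_file() call, roughly like this ($array_url and $path values are examples):

```php
<?php
set_time_limit(0);

$array_url = array('feed1', 'feed2'); // example values
$path      = './all.xml';             // example path

$fp = fopen($path, 'w+');
foreach ($array_url as $part) {
    $url = 'http://whatever' . $part . '.xml';
    $ch  = curl_init($url);
    curl_setopt($ch, CURLOPT_TIMEOUT, 50);
    curl_setopt($ch, CURLOPT_FILE, $fp);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_exec($ch);
    curl_close($ch); // close each handle before starting the next
}
fclose($fp);
```

Each response is appended to the same file because $fp stays open across iterations; only the cURL handles are created and closed per URL.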
Yes, I just thought that maybe there was a way to avoid closing and opening many connections, if I can gain some efficiency.
file_get_contents() and file_put_contents() are for you.
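That is, something as short as this (URL and path are placeholders):

```php
<?php
// Copy a remote XML file to a local one in a single call.
// Requires allow_url_fopen = 1 for http:// sources.
$url  = 'http://www.yoursite.com/yourxml.xml'; // placeholder
$path = './a.xml';

file_put_contents($path, file_get_contents($url));
```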
Do they also work with URLs? I thought they only worked with files on the filesystem…
If allow_url_fopen is set to 1 in your php.ini, you can use URLs with the file functions. (The default setting for this IS 1, so unless your host is picky about it, it should be safe to assume it works.)
This setting cannot be altered at runtime, i.e. you can't use ini_set() to override it.
Yes, they work with URLs too if allow_url_fopen is set to 1. Personally, though, I wouldn't recommend it: if the file being downloaded is a bit bigger, there's a high chance the script gets terminated before the whole file has been downloaded. I'm saying this from experience. With cURL, on the other hand, I've successfully downloaded files of up to 2–3 GB so far.
Thank you very much guys