Problem with SimpleXML - Advice please :-)

Hi Everyone,

I've just got started with SimpleXML and was finding everything nice and easy. I started learning it so that I could take betting data from a William Hill feed and store it in my database, so I copied the XML file to my local server and wrote a script to run through it, which was all working fine as well. I then thought it would be better if I opened the file over the net.

Now, with exactly the same script, I get an error. I've tried to explain the code below; if anyone can advise me where I've gone wrong I'd be grateful. I've been searching the net for hours trying to work out why it would be any different opening a file locally compared to one over the net.

This is my code to open up the file.

This is the local version

$xml = simplexml_load_file("oxipubserver.xml");

echo $xml->getName() . "<br />";

$willhill['version'] = $xml['version'];
$willhill['created'] = $xml['created'];
$willhill['requestTime'] = $xml['requestTime'];

// Get markets from feed
$t_count = 0;

foreach ($xml->response->class->type as $type) {

    // Down here I'm then parsing through the file.

The only change I then made was to load the file from the URL instead:

$xml = simplexml_load_file("http://pricefeeds.williamhill.com/oxipubserver?action=template&class=|UK%20Football|");

Now, instead of getting the full XML file, I only get this top bit:

SimpleXMLElement Object ( [@attributes] => Array ( [version] => 7.1 [created] => 2010-03-01 14:16:24 [lastMsgId] => 244516226 [requestTime] => 0.0029 ) [response] => SimpleXMLElement Object ( [@attributes] => Array ( [request] => getHierarchyByClass [code] => 001 [message] => success [debug] => [provider] => williamhill ) [disclaimer] => SimpleXMLElement Object ( ) ) )

I checked and switched back to the local file and I get the full file, all 2 MB of it. I can't think why this would be the case and am now nearly bald; I have no more hair left to tear out. If someone could just point me in the right direction.

Thanks

Maybe the remote file has changed?

Try file_get_contents() to get the document as a string, then simplexml_load_string(). You can save the string to a local file and then repeat the test via simplexml_load_file().
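Something like this, roughly (an untested sketch, using the feed URL from your post; the local file name is just an example):

// Fetch the feed as a plain string first
$url  = "http://pricefeeds.williamhill.com/oxipubserver?action=template&class=|UK%20Football|";
$data = file_get_contents($url);

// Parse the string directly
$xml = simplexml_load_string($data);

// Save the string to a local file so you can repeat the test via simplexml_load_file()
file_put_contents("oxipubserver-remote.xml", $data);
$xmlFromFile = simplexml_load_file("oxipubserver-remote.xml");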

Thanks for the advice - I gave it a go that way and it all worked fine, so I know it's only a problem when I'm using simplexml_load_file() to load the address pricefeeds.williamhill.com/oxipubserver?action=template&class=|UK%20Football|. If I do as you advise and get the contents first it works, or if I save the file to my server it works. I guess I can use this as a workaround, but would it add much time to the script execution?

Also, has anyone else ever run into this sort of problem before? The only thing I can think of is that, as the web address doesn't have an extension (e.g. .xml), it's not being treated like a file properly.

Sorry to pester people, but while this problem has been solved now, I was curious if anyone could explain why it wouldn't work in the first place, as nothing has really changed other than that I now treat the document as a string rather than loading a file. It's bugging the hell out of me as I can't stand not knowing why something doesn't work :slight_smile:

When I request http://pricefeeds.williamhill.com/oxipubserver?action=template&class=|UK%20Football| via a browser or PHP, I get:

<?xml version="1.0" encoding="UTF-8" ?>
<!--OXi dbPublish-->
<oxip version="7.1" created="2010-03-02 17:10:59" lastMsgId="" requestTime="0.0030">
  <response request="getHierarchyByClass" code="001" message="success" debug="" provider="williamhill">
    <disclaimer/>
  </response>
</oxip>

Maybe the feed is geo-aware…

This happened to me too with simplexml_load_file(), but I knew straight off why I thought that was the case: because of this php.ini setting;


;;;;;;;;;;;;;;;;;;
; Fopen wrappers ;
;;;;;;;;;;;;;;;;;;

; Whether to allow the treatment of URLs (like http:// or ftp://) as files.
allow_url_fopen = On

That's On on my dev server and Off on my live server, AND I run in safe mode.

So, if allow_url_fopen were Off you wouldn't be able to get the file with file_get_contents() either, so maybe there is some other, more granular setting in safe mode which is affecting your call to simplexml_load_file().
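If it helps, you can quickly check what your own server reports (just a sanity check; these are the standard php.ini directive names):

// Quick check of the settings mentioned above
var_dump(ini_get('allow_url_fopen')); // empty string or "0" means URLs can't be opened as files
var_dump(ini_get('safe_mode'));       // whether safe mode is on (for PHP versions that still have it)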

As you can gather, I am not exactly clarifying matters for you, but these extra variables (which may or may not have any bearing on the matter in hand) should keep you worrying now for much longer than you had anticipated.

HTH

And the punchline I should have added was:

If your application is dependent upon external data, and you are fetching that external data over the network, then I'd advise you to do the operation in two distinct passes:

  1. get the data and cache it locally (I just use cURL).
  2. do what you need to do with the data, but from your local cache

So many things can go wrong with step 1; what decision do you want step 2 to make if the data is not fresh?
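Roughly like this (a bare-bones sketch of the two passes; the cache path and timeout are just examples):

// 1. Get the data and cache it locally
$url       = "http://pricefeeds.williamhill.com/oxipubserver?action=template&class=|UK%20Football|";
$cacheFile = "cache/oxipubserver.xml";

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the response instead of echoing it
curl_setopt($ch, CURLOPT_TIMEOUT, 30);          // don't hang forever if the feed is slow
$data = curl_exec($ch);
curl_close($ch);

if ($data !== false) {
    file_put_contents($cacheFile, $data);
}

// 2. Do what you need to do with the data, but from the local cache
$xml = simplexml_load_file($cacheFile);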

The feeds don't appear to like being requested with the class name (|UK Football|), but using the class id numbers (1 for UK Football) seems OK, for me at least. Maybe that's the problem for you.
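For example, something along these lines (I'm assuming the id goes in the same class parameter as before; double-check the exact URL format against the feed documentation):

// Request by numeric class id instead of by name (1 = UK Football)
$xml = simplexml_load_file("http://pricefeeds.williamhill.com/oxipubserver?action=template&class=1");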

Since these feeds take a while to be generated, as Cups said, it would be worth your while retrieving them every so often in the background, saving a copy locally, and then using that copy whenever you need it.
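For instance, only re-fetch when the local copy is getting old (sketch only; the path, interval, and URL are just examples):

// Refresh the local copy only when it's stale
$feedUrl   = "http://pricefeeds.williamhill.com/oxipubserver?action=template&class=1";
$cacheFile = "cache/oxipubserver.xml";
$maxAge    = 300; // refresh at most every 5 minutes

if (!file_exists($cacheFile) || (time() - filemtime($cacheFile)) > $maxAge) {
    $data = file_get_contents($feedUrl);
    if ($data !== false) {
        file_put_contents($cacheFile, $data);
    }
}

$xml = simplexml_load_file($cacheFile);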

Thanks everybody. Sorry for the delay replying, I've been away. Some of these are excellent suggestions. Geolocation had never occurred to me, but it should have, as I had tried something similar with Betfair and, since my server is based in the US, it wouldn't work due to Betfair restricting access from there.

To crmalibu - thanks for checking as well. That was the same message I was getting, and after writing my post I was getting it constantly, but when I just checked now it was working fine.

To Cups and Salathe - thanks for the advice. I've been trying a couple of other ways to get the feed, and on the William Hill site they used the numeric class ids rather than the names, and so far that has worked a lot better. I think I will have to store the feed locally and then go from there. That doesn't bother me so much; it was more the unanswered question. Now that I know it doesn't work for other people as well, I can get on with it and hopefully start to rebuild my life :slight_smile: Thanks for all the help everyone. Hopefully one day I'll be able to return the favour. Watch out for those flying pigs. lol.