  1. #1
    SitePoint Wizard frank1's Avatar
    Join Date: Oct 2005
    Posts: 1,392

    Suggest us a way out: content syndication

    well,
    we have some forex data that is updated every day.
    We are ranking very well for it in Google, and now we want other sites to be able to include our forex data in their websites as well (along with a link to us).

    The forex data is stored in a MySQL database and fetched using PHP into tabular form.

    The easiest way we can think of is to have other sites include our forex_other.php file, which outputs the forex data in tabular form only:
    <?php include('http://site.com/file.php'); ?>

    But the problems with this approach (remote includes) are:
    • many hosts don't allow it, or disable such functions
    • we may run out of bandwidth
    • the user's site may be somewhat slow while fetching the data (opening their homepage blocks on the external include)
    • if our site is down, the user gets no data
    • users may not be convinced there is no security problem in including remote files


    So, considering these problems, is there still a way out, or a recommended/safe way of doing it like this?
    Publisher-wise it is the easiest, so it would be great to follow it if those problems can be solved.

    Plan B

    If those problems can't be solved that way, we have thought of RSS.
    But the problem with RSS is that our feed needs to be peculiar, in the sense that we have:
    date
    country name / rate pairs:
    country1 rate1
    country2 rate2
    ....
    link

    So we are planning to build the RSS like this:

    PHP Code:
    while ($row = mysql_fetch_array($result, MYSQL_ASSOC)) {
        $xml_output .= "\t<forex>\n";
        $xml_output .= "\t\t<countryname>{$row['country_name']}</countryname>\n";
        $xml_output .= "\t\t<rate>{$row['rate']}</rate>\n";
        $xml_output .= "\t</forex>\n";
    }
    and so on
    So will it validate as normal RSS? Are there any problems doing it this way, not using the standard title/link/description structure? (Of course with the <?xml header and all.)
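For what it's worth, RSS 2.0 permits extension elements as long as they live in a custom XML namespace; bare unknown elements like <countryname> at the item level are what trip up validators. A sketch of what a namespaced feed might look like (the forex: namespace URI is made up for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:forex="http://site.com/ns/forex">
  <channel>
    <title>Daily Forex Rates</title>
    <link>http://site.com/</link>
    <description>Exchange rates, updated daily</description>
    <item>
      <title>Today's rates</title>
      <link>http://site.com/rates</link>
      <forex:countryname>country1</forex:countryname>
      <forex:rate>rate1</forex:rate>
    </item>
  </channel>
</rss>
```

Generic aggregators will still show the title/link/description; only consumers that know your namespace will read the rates.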

    Our plan is to write the XML (RSS) file and use a script to automatically upload the feed to some third-party server, so that:
    • there is no load on our server
    • if our server is down, the feed is still accessible

    (and our brand and link stay in it)

    So are there any problems doing it this way?
    Can we use a script to automatically upload the RSS file to a third-party server?
    (Or what could be another way?)

    Providing an XML format seems much more vulnerable to screen scraping using curl and other PHP functions. How can we address that problem?

    And do we need a custom RSS reader? That is, do we need to write a custom reader to parse our feed (with country names, rates, and tables)?
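A feed with custom elements does need a few lines of custom parsing on the consumer's side; a generic RSS reader won't know what to do with the rates. A minimal sketch using SimpleXML, assuming the rates sit in a custom namespace (the namespace URI is a made-up placeholder):

```php
<?php
// Minimal parser for a feed carrying forex data in a custom
// namespace. The namespace URI is a hypothetical placeholder.
function parse_forex_feed($xml_string) {
    $rates = array();
    $feed = simplexml_load_string($xml_string);
    foreach ($feed->channel->item as $item) {
        // children() selects the elements belonging to the given namespace
        $forex = $item->children('http://site.com/ns/forex');
        $rates[(string) $forex->countryname] = (string) $forex->rate;
    }
    return $rates;
}

// Usage: fetch the feed over HTTP, then parse it
// $rates = parse_forex_feed(file_get_contents('http://site.com/forex.rss'));
```

The consumer can then loop over the returned array to build their own table, with your branding and link rendered around it.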

    Please suggest a way forward.

    thanks

  2. #2
    SitePoint Wizard bronze trophy
    Join Date: Jul 2008
    Posts: 5,757
    If the data updates once a day, then write it to a file once a day and let them access that file via URL. Whether you serve it off your own webserver or offload the hosting somewhere else is your choice.

    Give the user some code which can be used to fetch this external resource. Make the code cache the data locally on their own server so they have fast access to it. Use curl, or any of PHP's many filesystem functions, to retrieve the file from the remote server. include() should never be used on a URL; it's just silly.
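A rough sketch of that fetch-and-cache snippet, assuming a 24-hour cache lifetime (the URL and cache path are placeholders):

```php
<?php
// Serve a locally cached copy of the remote forex table,
// re-fetching at most once per day. URL/path are placeholders.
function get_forex_table($url, $cache_file, $max_age = 86400) {
    if (file_exists($cache_file) && (time() - filemtime($cache_file)) < $max_age) {
        return file_get_contents($cache_file);   // cache is fresh, use it
    }
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);       // don't hang the visitor's page
    $data = curl_exec($ch);
    curl_close($ch);
    if ($data !== false) {
        file_put_contents($cache_file, $data);   // refresh the cache
        return $data;
    }
    // Remote fetch failed: fall back to a stale cache if one exists.
    return file_exists($cache_file) ? file_get_contents($cache_file) : '';
}

// Usage on the publisher's page:
// echo get_forex_table('http://site.com/forex.html', '/tmp/forex_cache.html');
```

This also answers the availability worry: after the first successful fetch, the publisher keeps serving the last good copy even while the origin is down.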

    You cannot stop them from removing your branding. Nothing will work. Not even using JavaScript to fetch the data will stop them, although you could make it a real pain in the ****.

  3. #3
    SitePoint Wizard frank1's Avatar
    Join Date: Oct 2005
    Posts: 1,392
    Quote Originally Posted by crmalibu View Post
    OK, if we have to use curl and all, we may not even need XML or RSS.
    But what about the problems we mentioned?
    Even hosting it on our own server, what about bandwidth and availability?
    And what about the other problems we pointed out with the curl approach?

  4. #4
    SitePoint Wizard bronze trophy
    Join Date: Jul 2008
    Posts: 5,757
    I don't know the capability of your server, or whether financially you should offload the hosting of this data resource externally or not. That's your decision. Webservers are extremely efficient at serving static files from disk, which is what you want to do.

    If you want a free way to offload it, maybe use Google: http://code.google.com/appengine/. You could write a Python script to host on Google which accepts an HTTP POST from your server and updates the data it serves to people. Have your server send this POST request once a day.
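The daily push from your own server could stay in PHP even if the receiving side is Python; a simple curl POST would do (the endpoint URL and field name here are made up, the real handler on the other end would define them):

```php
<?php
// Push the day's forex data to an external host via HTTP POST.
// The endpoint URL and form field name are hypothetical placeholders.
function push_forex_data($endpoint, $payload) {
    $ch = curl_init($endpoint);
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, array('data' => $payload));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);
    curl_exec($ch);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    return $status == 200;  // treat HTTP 200 as success
}

// Usage, e.g. from the daily cron job:
// push_forex_data('http://yourapp.appspot.com/update', file_get_contents('forex.xml'));
```

The receiving handler should of course check a shared secret before accepting an update, so only your server can overwrite the data.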

    You need to write some code that you can give to people. This code tries to fetch the external resource, exhausting all methods. If it can't, well, then they can't. You could offer a JavaScript backup version, although that will be slow for their visitors.

  5. #5
    SitePoint Guru
    Join Date: Dec 2005
    Posts: 982
    You could probably use Amazon S3 for this, or some other type of content distribution network. The real key to performance here is to cache the forex data for as long as possible. If it changes every day via a cron job, perhaps you can rewrite that cron job to delete yesterday's cache, create a new cache, and upload it to the storage of your choice.
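The regeneration step in that cron job might look something like this (the table and column names are guesses, swap in the real schema; the upload-to-S3 step is left as a comment):

```php
<?php
// Daily cron job: rebuild the static forex file from the database.
// Table and column names below are hypothetical.
function build_forex_xml($rows) {
    $xml = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<forexdata>\n";
    foreach ($rows as $row) {
        // Escape values so rates/names can't break the XML
        $name = htmlspecialchars($row['country_name']);
        $rate = htmlspecialchars($row['rate']);
        $xml .= "\t<forex><countryname>$name</countryname><rate>$rate</rate></forex>\n";
    }
    return $xml . "</forexdata>\n";
}

// $result = mysql_query('SELECT country_name, rate FROM forex_rates');
// $rows = array();
// while ($r = mysql_fetch_assoc($result)) { $rows[] = $r; }
// file_put_contents('/var/www/forex.xml', build_forex_xml($rows));
// ...then delete yesterday's copy from S3/CDN and upload the new file.
```

Writing the file once a day means every publisher request is served from static storage and the database is never touched by outside traffic.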
    MySQL v5.1.58
    PHP v5.3.6

  6. #6
    SitePoint Wizard frank1's Avatar
    Join Date: Oct 2005
    Posts: 1,392
    Quote Originally Posted by crmalibu View Post
    thanks
    OK, can't uploading to Google (or any other server) be done using PHP? We don't know much about Python. And does a default Linux server config support Python (or is it available)?

    So the best idea for now seems to be to use a PHP file itself:
    write it on our server,
    make a curl script that lets users get that file, and instruct whoever wants to use it to copy-paste that code,
    and use a cache so the content is cached on their server and our server is not queried every time a page on their site is opened.

    OK, the problem I see: we are encouraging people to get content off our site using curl, so people can really manipulate that script.

    OK, is writing an XML file, maybe RSS, the solution to all of this?
    In that case we may need to write an RSS reader as well,
    and our RSS will be peculiar in structure, so will that make any difference anywhere?

    thanks

