  1. #1
    SitePoint Addict
    Join Date: Apr 2002 | Location: Miami | Posts: 214

    fwrite Maximum File Size

    I have a client who wants me to query a database and write out a 400,000-record XML file. Each record has about 10 child elements of maybe 200 characters each.

    Could I do this with fwrite? Has anyone ever written such large files using PHP?
    This is not a normal service we offer, so I am hesitant to even try. If you have any experience writing very large files in PHP, or any tricks, please let me know.

  2. #2
    AnthonySterling (Twitter: @AnthonySterling)
    Join Date: Apr 2008 | Location: North-East, UK | Posts: 6,111
    It shouldn't be a problem; however, you may be best off writing one XML record at a time and clearing the variables after each iteration to save memory.

    Or even breaking the SQL result set down into small chunks and working with those.
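    A minimal sketch of that chunked approach (not from the thread; the query, the chunk size, and the buildXmlRecord() helper that turns a row into an XML fragment are all assumed for illustration):

    $fp = fopen('export.xml', 'w');              // open the output file once, not per chunk
    fwrite($fp, "<?xml version=\"1.0\"?>\n<records>\n");

    $chunkSize = 1000;
    $offset    = 0;
    do {
        $result = mysql_query("SELECT * FROM records LIMIT $offset, $chunkSize");
        $rows   = mysql_num_rows($result);

        while ($row = mysql_fetch_assoc($result)) {
            fwrite($fp, buildXmlRecord($row));   // write one record at a time
            unset($row);                         // clear it before the next iteration
        }

        mysql_free_result($result);              // release the chunk's memory
        $offset += $chunkSize;
    } while ($rows == $chunkSize);

    fwrite($fp, "</records>\n");
    fclose($fp);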
    @AnthonySterling: I'm a PHP developer, a consultant for oopnorth.com and the organiser of @phpne, a PHP User Group covering the North-East of England.

  3. #3
    SitePoint Addict
    Join Date: Apr 2002 | Location: Miami | Posts: 214
    EDITED: Never mind, I found the unset() function - that works.
    -----------
    I tried a couple of different ways, but the script dies at around a 128 MB file size (180,000 records). I tried iterating through chunks as small as 10 records and as large as 1,000 records; in other words, my SQL would limit the result set to 10-1,000 records and I would keep looping through this until the script died.

    The script is not timing out. My first attempts opened and closed the text file for each chunk, and that really slowed things down ... the script would then time out after only 90,000 records.

    Any ideas to get this to work? The client was pressing me, so I broke the file down into 100,000-record chunks and gave him 7 files, but that was a pain. Thanks for your ideas.
    Last edited by whiterabbit; Nov 20, 2008 at 09:17.

  4. #4
    SitePoint Wizard
    Join Date: Jul 2008 | Posts: 5,757
    See:
    set_time_limit()
    mysql_unbuffered_query()
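
    For anyone reading later, a rough illustration of those two calls (a sketch only, not from the original post; $sql is assumed to hold the export query):

    // Any call to set_time_limit() restarts the execution-time counter,
    // so calling it periodically inside a long loop keeps the script alive.
    set_time_limit(30);

    // mysql_unbuffered_query() streams rows from MySQL as you fetch them,
    // instead of copying the whole result set into PHP memory first.
    // Caveats: mysql_num_rows() won't give a correct count until every row
    // is fetched, and you must fetch all rows before sending another query
    // on the same connection.
    $result = mysql_unbuffered_query($sql);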

  5. #5
    SitePoint Addict
    Join Date: Apr 2002 | Location: Miami | Posts: 214
    Funny, it started to time out all of a sudden after 30 seconds. This was not happening before. I used the set_time_limit() function you mentioned after each iteration of a chunk to give the script another 20 seconds.

    The script processed all records (653,243).

    Well, I can't spend any more time on this, but I thought Apache had a 300-second timeout. I called this script from the browser and it ran for 16 minutes?

    =================================


    so would I just select my 650,000 records from the database:

    $ok = mysql_unbuffered_query($sql, $conn);
    if ($ok) {
        while ($row = mysql_fetch_array($ok)) {
            // ...
        }
    }

    ... and not worry about unset($ok)?

    Right now I am grabbing 1,000-record chunks. I unset $row (and all other set variables) after each loop over the result set, and $ok after each iteration of a chunk.
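
    Putting the thread's suggestions together, a single unbuffered pass might look roughly like this (a sketch only; the file name, the 20-second reset, the 1,000-row interval, and the buildXmlRecord() helper are assumptions):

    set_time_limit(20);                            // initial limit; reset again inside the loop

    $fp = fopen('export.xml', 'w');
    fwrite($fp, "<?xml version=\"1.0\"?>\n<records>\n");

    $ok = mysql_unbuffered_query($sql, $conn);     // rows stream in one at a time,
                                                   // so nothing large accumulates in PHP memory
    if ($ok) {
        $i = 0;
        while ($row = mysql_fetch_array($ok)) {
            fwrite($fp, buildXmlRecord($row));     // write each record as it arrives
            unset($row);
            if (++$i % 1000 == 0) {
                set_time_limit(20);                // restart the timeout clock every 1,000 rows
            }
        }
    }

    fwrite($fp, "</records>\n");
    fclose($fp);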
    Last edited by whiterabbit; Nov 20, 2008 at 13:38.

