file_get_contents vs fread

What are the pros and cons of file_get_contents vs. the code below for reading a local file?

$fd = fopen("....", "r");
$data = "";
while (!feof($fd)) {
    $data .= fread($fd, 1024);
}
fclose($fd);

file_get_contents — Reads entire file into a string. This is the preferred way to read the contents of a file into a string. It will use memory mapping techniques if supported by your OS to enhance performance.

fopen() — Opens a file or URL and binds a named resource, specified by file name, to a stream.
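For comparison, the whole fopen/fread/fclose loop above collapses to a single call (a minimal sketch; the path is a hypothetical placeholder):

$data = file_get_contents("/path/to/local_file.txt"); // hypothetical path
if ($data === false) {
    // the read failed (missing file, permissions, ...)
}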

Instead of the while loop, isn’t it better to use the following?
fread($fd, filesize($local_file_name));

while (!feof($fd)) { $data .= fread($fd, 1024); }

This will read your file in chunks of 1 KB (1024 bytes).
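If you want bigger chunks, the chunk size is just the second argument to fread; a minimal sketch reading 1 MB at a time (assuming $fd was opened as in the question):

$chunkSize = 1024 * 1024; // 1 MB per iteration
$data = "";
while (!feof($fd)) {
    $data .= fread($fd, $chunkSize);
}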

For small files you can use the example from the manual at http://php.net/fread (as you said):

$filename = "/usr/local/something.txt";
$handle = fopen($filename, "r");
$contents = fread($handle, filesize($filename));
fclose($handle);

If I understood this discussion correctly:

  1. file_get_contents is good ONLY if the file is ASCII and less than 1 MB.
  2. fread($fd, filesize($local_file_name)) is good if the file is less than 1 MB.
  3. fread within a while loop is good for files of 1 MB or more.

Am I correct?

About file_get_contents: it’s binary-safe (see http://php.net/file_get_contents), so you can use it for any kind of content, not just ASCII.

In my opinion you should use fread only if you are working with streams and you have functionality built on that (the code already calls fopen and you already have the resource). In any other case you can use file_get_contents.
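For example, when the data comes from a resource you already hold, such as a process pipe, fread in a loop is the natural fit; a minimal sketch (the command is a hypothetical example):

// read everything a child process writes to its stdout
$pipe = popen("ls -l /tmp", "r"); // hypothetical command
$output = "";
while (!feof($pipe)) {
    $output .= fread($pipe, 1024);
}
pclose($pipe);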

About large files: file_get_contents also has offset and maxlen parameters, so you can use it to read in chunks too. Now, if you bump into “large files” issues you need to think twice. PHP was not created to work with large files, so if you can hand this job off to an external program or even to the system, that would be great.
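A minimal sketch of that chunked reading with offset and maxlen (the path is a hypothetical placeholder):

$filename = "/path/to/large_file.bin"; // hypothetical path
$chunkSize = 1024 * 1024;              // 1 MB per read
$size = filesize($filename);
for ($offset = 0; $offset < $size; $offset += $chunkSize) {
    // read at most $chunkSize bytes starting at $offset
    $chunk = file_get_contents($filename, false, null, $offset, $chunkSize);
    // ... process $chunk here ...
}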

LE: if you want to serve a file for download (send the headers and output it to the browser) you should use readfile (it’s the fastest method to output file contents). But use it only in this case, because it outputs the file directly and does not store it in a variable.
To store the contents you could use ob_start + readfile + ob_get_contents, but I think that combination will be slower than file_get_contents.
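A minimal sketch of the download case with readfile (hypothetical file name); the file is streamed straight to the browser, never held in a variable:

$file = "/path/to/report.pdf"; // hypothetical path
header("Content-Type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"" . basename($file) . "\"");
header("Content-Length: " . filesize($file));
readfile($file);
exit;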

Sorry, what is a “stream”?

Well, that’s a complex topic :slight_smile:

You can read here:
http://www.sitepoint.com/understanding-streams-in-php/
http://php.net/manual/en/intro.stream.php

In a few words, it’s like a connection through a port that stays open and that you can read from or write to.

// create a stream server listening on a local TCP port
$server = stream_socket_server("tcp://127.0.0.1:7891", $errno, $errorMessage);
// wait (block) until a client connects
$conn = stream_socket_accept($server);
// send something to the connected client, then close the connection
fwrite($conn, "Hello from the server\n");
fclose($conn);

and in another file

$client = stream_socket_client("tcp://127.0.0.1:7891", $errno, $errorMessage);
while (!feof($client) ) {
    echo fgets($client, 1024);
}

You can find some examples here: http://php.net/stream_socket_client
and also in other places on the internet

Which one is better for writing larger files, fwrite or file_put_contents?

file_put_contents should be lighter, so that’s what you need, but…

What is your use-case?
What do you need to achieve?
Maybe there is another way.

I am creating an attachment extractor: my script connects to a mail server, searches for emails with attachments, and saves the attachments as files. I am using fwrite and it works fine now, but I was wondering whether file_put_contents is better than fwrite for this purpose when the attachment is larger.
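For that case either function works; a minimal sketch of the two options (hypothetical variable and file names, assuming the attachment body is already in a string):

// file_put_contents does fopen + fwrite + fclose in one call
file_put_contents("/tmp/attachment.bin", $attachmentData);

// the fopen/fwrite equivalent, handy if the attachment arrives in chunks
// and you want to append as it comes in
$fh = fopen("/tmp/attachment.bin", "wb");
fwrite($fh, $attachmentData);
fclose($fh);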
