Understanding Streams in PHP

    Vito Tardia

    Streams are resources provided by PHP that we often use transparently, but which can also be very powerful tools. By learning how to harness their power, we can take our applications to a higher level.

    The PHP manual has a great description of streams:

    Streams were introduced with PHP 4.3.0 as a way of generalizing file, network, data compression, and other operations which share a common set of functions and uses. In its simplest definition, a stream is a resource object which exhibits streamable behavior. That is, it can be read from or written to in a linear fashion, and may be able to fseek() to an arbitrary location within the stream.
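    To see this linear read/seek behavior in action, here's a small sketch that uses a throwaway temp file (the path is generated with tempnam(), so nothing here depends on any particular filesystem layout):

```php
<?php
// Create a throwaway file so the example is self-contained
$path = tempnam(sys_get_temp_dir(), 'seek');
file_put_contents($path, "0123456789");

$fp = fopen($path, 'rb');  // opens a file:// stream for reading
fseek($fp, 4);             // jump to an arbitrary position
$chunk = fread($fp, 3);    // read three bytes from there: "456"
echo $chunk;
fclose($fp);
unlink($path);
```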

    Every stream has an implementation wrapper which contains the additional code necessary to handle the specific protocol or encoding. PHP provides some built-in wrappers, and we can easily create and register custom ones. We can even modify or enhance the behavior of wrappers using contexts and filters.

    Stream Basics

    A stream is referenced as <scheme>://<target>. <scheme> is the name of the wrapper, and <target> will vary depending on the wrapper’s syntax.

    The default wrapper is file://, which means we use a stream every time we access the filesystem. We can write readfile('/path/to/somefile.txt'), for example, or readfile('file:///path/to/somefile.txt') and obtain the same result. If we instead use readfile('http://google.com/'), we're telling PHP to use the HTTP stream wrapper.
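    As a quick sanity check of that equivalence (using a temporary file rather than a fixed path), both forms below read the same bytes through the same file:// wrapper:

```php
<?php
$path = tempnam(sys_get_temp_dir(), 'demo');
file_put_contents($path, "same content");

// A plain path and an explicit file:// URI go through the same wrapper
$plain    = file_get_contents($path);
$explicit = file_get_contents("file://$path");
var_dump($plain === $explicit);  // bool(true)
unlink($path);
```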

    As I said before, PHP provides some built-in wrappers, transports, and filters. To find out which are installed on our machine we can use:

    print_r(stream_get_transports());
    print_r(stream_get_wrappers());
    print_r(stream_get_filters());
    My installation outputs the following:

    Array
    (
        [0] => tcp
        [1] => udp
        [2] => unix
        [3] => udg
        [4] => ssl
        [5] => sslv3
        [6] => sslv2
        [7] => tls
    )
    Array
    (
        [0] => https
        [1] => ftps
        [2] => compress.zlib
        [3] => compress.bzip2
        [4] => php
        [5] => file
        [6] => glob
        [7] => data
        [8] => http
        [9] => ftp
        [10] => zip
        [11] => phar
    )
    Array
    (
        [0] => zlib.*
        [1] => bzip2.*
        [2] => convert.iconv.*
        [3] => string.rot13
        [4] => string.toupper
        [5] => string.tolower
        [6] => string.strip_tags
        [7] => convert.*
        [8] => consumed
        [9] => dechunk
        [10] => mcrypt.*
        [11] => mdecrypt.*
    )

    A nice set, don’t you think?

    In addition, we can write or use third-party streams for Amazon S3, MS Excel, Google Storage, Dropbox, and even Twitter.

    The php:// Wrapper

    PHP has its own wrapper to access the language’s I/O streams. There are the basic php://stdin, php://stdout, and php://stderr wrappers that map the default I/O resources, and we have php://input, a read-only stream that exposes the raw body of a POST request. This is handy when we’re dealing with remote services that put data payloads inside the body of a POST request.

    Let’s do a quick test using cURL:

    curl -d "Hello World" -d "foo=bar&name=John" http://localhost/dev/streams/php_input.php

    The result of a print_r($_POST) in the responding PHP script would be:

        [foo] => bar
        [name] => John

    Notice that the first data chunk isn’t accessible from the $_POST array, but if we use readfile('php://input') instead we get:

    Hello World&foo=bar&name=John

    PHP 5.1 introduced the php://memory and php://temp stream wrappers which are used to read and write temporary data. As the names imply, the data is stored respectively in memory or in a temporary file managed by the underlying system.
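    Here’s a minimal sketch of php://temp in action: write, rewind, read back. (php://temp keeps data in memory until it grows past a threshold, 2 MB by default, then transparently moves it to a temporary file; php://memory always stays in memory.)

```php
<?php
// php://temp behaves like a regular file handle but needs no path
$fp = fopen('php://temp', 'r+');
fwrite($fp, "Hello Streams");
rewind($fp);                          // seek back to the beginning
$contents = stream_get_contents($fp); // read everything we just wrote
fclose($fp);
echo $contents;  // Hello Streams
```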

    There’s also php://filter, a meta-wrapper designed to apply filters when opening a stream with functions like readfile() or file_get_contents()/stream_get_contents().

    // Write encoded data
    file_put_contents("php://filter/write=string.rot13/resource=file:///path/to/somefile.txt", "Hello World");
    // Read data and encode/decode with two cascading filters
    readfile("php://filter/read=string.toupper|string.rot13/resource=http://www.google.com");

    The first example uses a filter to encode data as it is written to disk, while the second applies two cascading filters while reading from a remote URL. The uses in our applications can range from the very basic to the very powerful.
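    We can verify the write filter locally, since string.rot13 is its own inverse. This sketch uses a temp file instead of the hypothetical path above:

```php
<?php
$path = tempnam(sys_get_temp_dir(), 'rot13');

// Data passes through string.rot13 before it reaches the disk
file_put_contents("php://filter/write=string.rot13/resource=$path", "Hello World");

$raw     = file_get_contents($path);                                           // encoded on disk
$decoded = file_get_contents("php://filter/read=string.rot13/resource=$path"); // decoded on read
echo $raw, "\n", $decoded, "\n";  // Uryyb Jbeyq, then Hello World
unlink($path);
```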

    Stream Contexts

    A context is a stream-specific set of parameters or options which can modify and enhance the behavior of our wrappers. A common use of contexts is modifying the HTTP wrapper, which lets us avoid the use of cURL for simple network operations.

    $opts = array(
        'http' => array(
            'method' => "POST",
            'header' => "Auth: SecretAuthToken\r\n" .
                "Content-type: application/x-www-form-urlencoded\r\n" .
                "Content-length: " . strlen("Hello World"),
            'content' => 'Hello World'
        )
    );
    $default = stream_context_get_default($opts);
    readfile('http://localhost/dev/streams/php_input.php');

    First we define our options array, an array of arrays with the format $array['wrapper']['option_name'] (the available context options vary depending on the specific wrapper). Then we call stream_context_get_default() which returns the default context and accepts an optional array of options to apply. The readfile() statement uses these settings to fetch the content.

    In the example, the content is sent inside the body of the request so the remote script will use php://input to read it. We can access the headers using apache_request_headers() and obtain:

        [Host] => localhost
        [Auth] => SecretAuthToken
        [Content-type] => application/x-www-form-urlencoded
        [Content-length] => 11

    We’ve modified the default context options, but we can create alternative contexts to be used separately as well.

    // $other_opts is an options array in the same format as $opts above
    $alternative = stream_context_create($other_opts);
    readfile('http://localhost/dev/streams/php_input.php', false, $alternative);
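    Since contexts are just containers for options, we can build one and inspect it without touching the network. A quick sketch (the header value here is an arbitrary example):

```php
<?php
// Build a context and read its options back
$other_opts = array(
    'http' => array(
        'method' => 'GET',
        'header' => "Accept: text/plain\r\n"  // arbitrary example header
    )
);
$ctx    = stream_context_create($other_opts);
$stored = stream_context_get_options($ctx);
echo $stored['http']['method'];  // GET
```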


    How can we harness the power of streams in the real world, and where can we go from here? As we’ve seen, streams share some or all of the filesystem-related functions, so the first use that comes to my mind is a series of virtual filesystem wrappers to use with PaaS providers like Heroku or AppFog that don’t provide a real filesystem. With little or no effort we can port our apps from standard hosting services to these cloud services and enjoy the benefits. Also, as I’ll show in a follow-up article, we can build custom wrappers and filters for our applications that implement custom file formats and encodings.