Reading and storing partial data from a file


I have a question about reading chunks of data from a file.

Suppose I have the following array saved in a file:

    [Japan] => Array
            [0] => 101
            [1] => 102
            [2] => 103

    [China] => Array
            [0] => 202
            [1] => 203
            [2] => 204

    [Chicago] => Array
            [0] => 303
            [1] => 304
            [2] => 305


Is there any way I can open the file but read and store only the “Chicago” array elements in a variable? The reason I am asking is that the file may contain a huge list of arrays, and reading the entire thing and dumping it into memory won’t be a good idea. So how can we accomplish this?

Thanks in advance

How is the data stored in the file? What format is it in? Is it a plain text file, a PHP file, XML, JSON, etc.?

The extension of the file is .json and the data storage format would be JSON as well…
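For reference, a JSON file holding the array shown above would presumably look something like this (the exact formatting is an assumption):

```json
{
    "Japan": [101, 102, 103],
    "China": [202, 203, 204],
    "Chicago": [303, 304, 305]
}
```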


I don’t know of a native way to incrementally parse JSON data. You pretty much have to load and decode the whole thing at once.

However, it looks like there are a few libraries kicking around that might do what you’re looking for. You’ll likely have to play around a bit.

You could mess around with using fseek(), but without reliably knowing where to seek to, it might be pointless. Either way, I wanted to mention it. There are a few comments in the PHP manual that may help.
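To illustrate, here’s a bare-bones fseek() sketch. The hard part, as noted, is knowing the offset in the first place; 23 is simply where the “Chicago” entry happens to start in this toy file.

```php
<?php
// Minimal fseek() demo: jump straight to a byte offset and read from there.
$path = tempnam(sys_get_temp_dir(), 'seek');
file_put_contents($path, '{"Japan":[101,102,103],"Chicago":[303,304,305]}');

$fp = fopen($path, 'rb');
fseek($fp, 23, SEEK_SET);   // 23 = start of the "Chicago" entry in this file
$chunk = fread($fp, 23);    // read just that slice, not the whole file
fclose($fp);
unlink($path);

echo $chunk, "\n"; // "Chicago":[303,304,305]
```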

Thanks for the reply, guys. I was thinking of maintaining a separate file that would play the role of indexing the data — a file pointer that knows where to seek to, just as “phpMyDirectory” suggested. For example, I’d query the index file to find exactly which position to seek to, and then open the .json file with the pointer at exactly that position.

Any idea how this could be made possible?
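A hypothetical sketch of that index-file idea: one full pass builds a small index of byte offsets, and every later lookup seeks straight to the slice it needs instead of decoding the whole file. (This assumes compact JSON, so that the re-encoded value matches the bytes on disk.)

```php
<?php
$data = '{"Japan":[101,102,103],"China":[202,203,204],"Chicago":[303,304,305]}';
$dataFile  = tempnam(sys_get_temp_dir(), 'data');
$indexFile = tempnam(sys_get_temp_dir(), 'idx');
file_put_contents($dataFile, $data);

// One-time indexing pass: record where each key's value starts and how long it is.
$index = [];
foreach (json_decode($data, true) as $key => $value) {
    $needle = '"' . $key . '":';
    $offset = strpos($data, $needle) + strlen($needle);
    $index[$key] = ['offset' => $offset, 'length' => strlen(json_encode($value))];
}
file_put_contents($indexFile, json_encode($index));

// Later lookup: consult the small index file, then seek and read one slice.
$entry = json_decode(file_get_contents($indexFile), true)['Chicago'];
$fp = fopen($dataFile, 'rb');
fseek($fp, $entry['offset']);
$chicago = json_decode(fread($fp, $entry['length']), true);
fclose($fp);
unlink($dataFile);
unlink($indexFile);

print_r($chicago); // Array ( [0] => 303 [1] => 304 [2] => 305 )
```

The index must of course be rebuilt whenever the data file changes, which is one reason a database tends to win here.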

Do you know of any way to do this for formats other than JSON, without loading the entire file? For example, is it possible with XML or plain text?

XMLReader will allow loading one element at a time, but you still need to scan sequentially. If you want random access to a large set of data, then use a database.
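A sketch of that with XMLReader: the file is scanned node by node, so memory use stays flat regardless of file size. The `<countries>/<country>/<v>` structure below is made up for the example.

```php
<?php
$path = tempnam(sys_get_temp_dir(), 'xml');
file_put_contents($path,
    '<countries>'
  . '<country name="Japan"><v>101</v><v>102</v><v>103</v></country>'
  . '<country name="Chicago"><v>303</v><v>304</v><v>305</v></country>'
  . '</countries>');

$reader = new XMLReader();
$reader->open($path);
$values = [];
while ($reader->read()) {
    if ($reader->nodeType === XMLReader::ELEMENT
        && $reader->name === 'country'
        && $reader->getAttribute('name') === 'Chicago') {
        // Materialise only this one element; everything else is skipped.
        $node = new SimpleXMLElement($reader->readOuterXml());
        foreach ($node->v as $v) {
            $values[] = (int) $v;
        }
        break; // stop scanning once we have what we need
    }
}
$reader->close();
unlink($path);

print_r($values); // Array ( [0] => 303 [1] => 304 [2] => 305 )
```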

This is a complicated question.

The reason for this is that PHP is extremely inefficient at file manipulation compared to many other languages.

With that said, the important factors here are how often you plan to execute this scan/parse, and on how large a data set.

In the end you have two options:

  1. Convert the data and insert it into a database.
  2. If the data set is manageable, keep accessing it file-based.

With both of these options you want to add another step: caching. This is even more important if you go with the second option.

So basically, if it exists in the cache, we don’t look it up again. Using this, you can access even larger data sets as files, since you only parse them from time to time instead of every time the data is required.
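A sketch of that cache-first lookup, using a plain file cache for simplicity (APCu, Redis, or Memcached would be the usual production choices; the function name is made up):

```php
<?php
function getCountry(string $key, string $dataFile, string $cacheDir): array
{
    $cacheFile = $cacheDir . '/' . md5($key) . '.cache';
    if (is_file($cacheFile)) {
        // Cache hit: the big data file is never touched.
        return json_decode(file_get_contents($cacheFile), true);
    }
    // Cache miss: pay for the full decode once, then store just this slice.
    $all = json_decode(file_get_contents($dataFile), true);
    file_put_contents($cacheFile, json_encode($all[$key]));
    return $all[$key];
}

// Usage with throwaway files:
$dataFile = tempnam(sys_get_temp_dir(), 'data');
file_put_contents($dataFile, '{"Japan":[101,102,103],"Chicago":[303,304,305]}');
$cacheDir = sys_get_temp_dir();
@unlink($cacheDir . '/' . md5('Chicago') . '.cache'); // start from a cold cache

$first  = getCountry('Chicago', $dataFile, $cacheDir); // miss: full decode
$second = getCountry('Chicago', $dataFile, $cacheDir); // hit: small cache file only
unlink($dataFile);

print_r($second); // Array ( [0] => 303 [1] => 304 [2] => 305 )
```

In a real setup you would also expire or invalidate the cache when the data file changes.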

There is nothing wrong with returning JSON, but the way you describe storing it now isn’t the best.

IMHO the ideal would be to have the page send an AJAX request to a script that queries a database and returns only the information of interest.
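A hypothetical endpoint sketch along those lines (say, `lookup.php?country=Chicago`): the page sends an AJAX request, and the script returns only the matching rows as JSON. SQLite in memory stands in for the real database here, and the table and column names are made up.

```php
<?php
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE readings (country TEXT, value INTEGER)');
$insert = $pdo->prepare('INSERT INTO readings VALUES (?, ?)');
foreach ([['Japan', 101], ['Japan', 102], ['Chicago', 303],
          ['Chicago', 304], ['Chicago', 305]] as $row) {
    $insert->execute($row);
}

$country = $_GET['country'] ?? 'Chicago';   // from the AJAX request
$query = $pdo->prepare('SELECT value FROM readings WHERE country = ?');
$query->execute([$country]);
$values = array_map('intval', $query->fetchAll(PDO::FETCH_COLUMN));

header('Content-Type: application/json');
echo json_encode([$country => $values]); // {"Chicago":[303,304,305]}
```

Only the requested rows ever leave the database, so memory use no longer scales with the size of the full data set.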