Jan 31, 2013, 06:02 #26
This has been an amazing read. I'm impressed by the magnitude of processing in this system; it makes my head spin when I try to imagine it.
From my short test I found that json_encode() is indeed faster than serialize(), but also that when the size of the array doubles, the time spent in json_encode() more than doubles, which suggests that the larger the input, the worse the performance becomes.
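For reference, a micro-benchmark along those lines could look like the sketch below. The sizes, iteration count, and array shape are arbitrary assumptions; absolute timings will vary by machine and PHP version, so only the relative trend as the input doubles is of interest.

```php
<?php
// Hedged sketch: time a serialization function over repeated calls.
// Returns total elapsed seconds for $iterations calls of $fn($data).
function benchmark(callable $fn, array $data, int $iterations = 100): float {
    $start = microtime(true);
    for ($i = 0; $i < $iterations; $i++) {
        $fn($data);
    }
    return microtime(true) - $start;
}

// Double the array size each round and compare the two built-ins.
foreach ([1000, 2000, 4000, 8000] as $size) {
    $data = array_fill(0, $size, ['id' => 1, 'name' => 'test', 'value' => 3.14]);
    printf(
        "size %5d  json_encode: %.4fs  serialize: %.4fs\n",
        $size,
        benchmark('json_encode', $data),
        benchmark('serialize', $data)
    );
}
```

If the json_encode() column grows faster than linearly with size while serialize() stays roughly linear, that matches the behaviour described above.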
I don't see how processing such large amounts of data can be done much more efficiently; the JSON parser has to do its work and there are no shortcuts. Two solutions come to my mind:
1) Write a PHP extension in C that does the serialization and unserialization. Your extension has the potential to be faster than the built-in functions because you most probably don't need support for all data types, so the code could be simplified as much as possible to suit your needs.
2) This is just speculation, because I don't have enough knowledge of PHP internals, but I think the best solution would be to get rid of the serializing/encoding/decoding steps altogether. If all the servers exchanging data run PHP scripts, then PHP must be storing those arrays in some internal (binary?) format in memory while the script is running. If you were somehow able to grab the internal representation of the array and pass it directly to another server, which could inject it into its own running script, then you would save a lot of unnecessary processing of the data back and forth. This would at least require a PHP extension, or even some digging into and patching of the PHP source to make your own custom PHP release - but if the project is of such a large scope, then maybe it would be worth investigating such options.
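As a cheap way to explore the direction of option 1 before committing to C, a stripped-down binary format can be prototyped in pure PHP with pack()/unpack(). This sketch handles only flat arrays of 32-bit integers - an illustrative assumption standing in for "drop support for all data types you don't need"; it is not a general replacement for serialize().

```php
<?php
// Hedged sketch: a minimal binary format for flat integer arrays.
// Layout: a 4-byte little-endian element count, then the raw values.
function int_array_to_binary(array $ints): string {
    // 'V' = unsigned 32-bit little-endian integer.
    return pack('V', count($ints)) . pack('V*', ...$ints);
}

function binary_to_int_array(string $blob): array {
    $count = unpack('Vcount', $blob)['count'];
    if ($count === 0) {
        return [];
    }
    // unpack() returns a 1-indexed map (n1, n2, ...); re-index from 0.
    return array_values(unpack("V{$count}n", substr($blob, 4)));
}

$data = [1, 2, 3, 42];
$blob = int_array_to_binary($data);
var_dump(binary_to_int_array($blob) === $data); // expect bool(true)
```

The point of the exercise is to see how much of json_encode()'s cost comes from generality (type tags, escaping, string formatting) that a purpose-built format can skip; a C extension doing the same thing would just move this logic below the userland boundary.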