I have a system set up a bit like a tree, where the trunk is the start and end point.
Request goes to the controller (1)
That controller starts up multiple sub-controllers (N)
Each sub-controller starts some workers (n)
The workers return their data to the sub-controller (which does its magic), which in turn returns the data to the main controller, so it can do its magic before returning the result to the user.

These scripts are spread across multiple servers (on the same gigabit network). There are usually about 900 scripts started for every request, and the data passed between scripts is usually under 1MB (multi-dimensional arrays making up objects).

Right now, the way I pass the data is by calling json_encode in the worker and json_decode in the parent.
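For reference, a minimal sketch of what the current hand-off looks like (the payload and variable names here are just illustrative):

```php
<?php
// Worker side: encode the result payload before sending it upstream.
$payload = ['rows' => [['id' => 1, 'score' => 0.9]], 'meta' => ['worker' => 'w1']];
$encoded = json_encode($payload);

// Parent (sub-controller) side: decode back into an associative array.
// The second argument (true) requests arrays instead of stdClass objects.
$decoded = json_decode($encoded, true);
```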
But this is (1) too slow (even though it's about 5x faster than serialize) and (2) takes way too much RAM: sometimes 500KB of values takes 60MB of RAM, and this is per worker/child.

Of a request that takes about 20 seconds, 10 to 15 seconds are usually spent in this json_encode/json_decode part alone.

So the question is:
- Is there a better way to transfer this data from one script to another? (I need to use all the data in each script, so I can't just pass an ID and fetch the data from a shared cache/DB.)
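To make answers comparable, here is a rough micro-benchmark sketch I could run against a representative payload. It times encode+decode and reports the blob size for json and serialize, and for the igbinary PECL extension if it happens to be installed (the payload shape is made up, not my real data):

```php
<?php
// Representative payload: a largeish multi-dimensional array.
$payload = array_fill(0, 1000, ['id' => 1, 'name' => 'x', 'vals' => [1.5, 2.5]]);

// Encoder/decoder pairs to compare.
$codecs = [
    'json'      => [fn($d) => json_encode($d), fn($s) => json_decode($s, true)],
    'serialize' => ['serialize', 'unserialize'],
];
// igbinary is an optional PECL extension, so guard for its presence.
if (extension_loaded('igbinary')) {
    $codecs['igbinary'] = ['igbinary_serialize', 'igbinary_unserialize'];
}

foreach ($codecs as $name => [$enc, $dec]) {
    $t = microtime(true);
    $blob = $enc($payload);
    $back = $dec($blob);
    printf("%-10s %8d bytes  %.4fs\n", $name, strlen($blob), microtime(true) - $t);
}
```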
