While this is the PHP forum, I would recommend looking for a language better suited to the task. Both Python and Perl, for example, work wonders at manipulating files.
In addition, I would recommend moving most of the actual generation of the data into stored procedures, if it is not already set up that way. The stored procedures generate the report into a work table created specifically for that report. Note that this is not a real temporary table, but an ordinary table you create for the report and delete once everything is completed.
Then all that whichever language you choose needs to do is compile the CSV files from that table and apply any extra magic if something needs to be altered or combined.
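If it helps, here is a minimal mysqli sketch of that flow. The procedure name generate_report, the work table report_tmp and the connection details are all made up for illustration, and error handling is left out:

```php
<?php
$db = new mysqli('localhost', 'user', 'pass', 'reports');

// Let the database do the heavy lifting of building the report rows.
// generate_report() is assumed to only INSERT into report_tmp, so it
// returns no result set of its own.
$db->query('CALL generate_report()');

// Stream the finished rows out to a CSV file.
$out = fopen('report.csv', 'w');
$result = $db->query('SELECT * FROM report_tmp');
while ($row = $result->fetch_assoc()) {
    fputcsv($out, $row);
}
$result->free();
fclose($out);

// The work table only exists for this one report, so clean it up.
$db->query('DROP TABLE report_tmp');
```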
Also remember that it is often much faster to run several smaller queries than one large one, so make sure you benchmark both approaches (the larger the data set you work with, the more true this tends to be).
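As an illustration of the "several smaller queries" idea, here is a hedged sketch that pages through a table in keyed batches; the orders table, the id column and the batch size are placeholders:

```php
<?php
$db = new mysqli('localhost', 'user', 'pass', 'reports');

// Seek by primary key instead of LIMIT/OFFSET, which gets slower
// as the offset grows.
$stmt = $db->prepare(
    'SELECT id, total FROM orders WHERE id > ? ORDER BY id LIMIT ?'
);

$lastId = 0;
$batch  = 10000;
// bind_param() binds by reference, so updating $lastId below is
// picked up on the next execute().
$stmt->bind_param('ii', $lastId, $batch);

do {
    $stmt->execute();
    $result = $stmt->get_result();
    $rows = 0;
    while ($row = $result->fetch_assoc()) {
        $lastId = $row['id'];
        $rows++;
        // ... process the row ...
    }
} while ($rows === $batch); // a short batch means we reached the end
```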
If you decide to stick with PHP, write this as a CLI script.
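For example, a simple guard at the top of the script keeps it from being run through the web server at all (and the CLI SAPI has no execution time limit by default, though making that explicit does no harm):

```php
<?php
// Refuse to run under anything but the command-line SAPI.
if (PHP_SAPI !== 'cli') {
    exit("This script must be run from the command line.\n");
}

// The CLI defaults to no time limit; state it explicitly anyway.
set_time_limit(0);

// ... report generation goes here ...
```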
Using unbuffered queries is a great option when you need to fetch a larger data set, since the client does not pre-fetch all the data but instead feeds it to you on the fly. The drawback, of course, is that you cannot do anything else with the connection until the result set is consumed. I would also not use the mysql_ functions; consider switching to mysqli or PDO if you stick with PHP.
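Here is a short sketch of both APIs in unbuffered mode; the table name and connection details are placeholders:

```php
<?php
// mysqli: MYSQLI_USE_RESULT streams rows instead of buffering them all.
$db = new mysqli('localhost', 'user', 'pass', 'reports');
$result = $db->query('SELECT * FROM big_table', MYSQLI_USE_RESULT);
while ($row = $result->fetch_assoc()) {
    // process $row -- the connection is busy until the result is freed
}
$result->free();

// PDO: turn off buffered queries on the MySQL driver.
$pdo = new PDO('mysql:host=localhost;dbname=reports', 'user', 'pass');
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);
foreach ($pdo->query('SELECT * FROM big_table') as $row) {
    // process $row
}
```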
You are actually working against multiple time limits: PHP has one, the web server has one, and the database server has one as well. In reality each of them has several different timeouts for different connection types. The main ones I would worry about here are the web server and database timeouts.
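If you want to see what you are up against, something like this shows PHP's own limit and the MySQL timeouts for your connection (the web server's timeout has to be checked in its own configuration, e.g. Apache's Timeout directive); connection details are placeholders:

```php
<?php
// PHP's own limit (0 means unlimited, which is the CLI default).
echo 'PHP max_execution_time: ', ini_get('max_execution_time'), "\n";

// MySQL's idle and read/write timeouts as seen by this connection.
$db = new mysqli('localhost', 'user', 'pass', 'reports');
$result = $db->query("SHOW VARIABLES LIKE '%timeout%'");
while ($row = $result->fetch_assoc()) {
    echo $row['Variable_name'], ': ', $row['Value'], "\n";
}
```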
On another note, a good question is whether you are able to change these timeouts at all, and if you are, whether it is a good idea to do so (e.g. is the server accessed by other people?). Increasing these timeouts also makes it much easier to mount a successful denial-of-service attack with limited resources, i.e. grabbing all of the connections and keeping them alive.