How to Roll Your Own JavaScript Compressor with PHP and the Closure Compiler

By Craig Buckler

In my previous post, I discussed the Closure Compiler’s REST API. In this article, we’ll develop a small PHP program that shows how the API can be used to compress JavaScript code whenever you need it.

Why write your own system?

You’ll find that several free tools handle this task; one of the first PHP JavaScript compressors was written by Ed Eliot. Sometimes, however, they require technologies you don’t use — such as Java — or may not cater for internal workflow processes, including:

  • distributing uncompressed JavaScript to developer PCs
  • integration with source control systems
  • automating your build, and so on
Important: This is not production-level code!

The code below implements a rudimentary JavaScript compressor in PHP to illustrate the basics. The code calls the Closure Compiler compression API every time a request is made, so it’s likely to be slower than multiple uncompressed JavaScript files. You should add your own functions to handle caching to a file or database.


Assume you normally include the following script tags at the bottom of your HTML:

<script src="script/file1.js"></script>
<script src="script/file2.js"></script>
<script src="script/file3.js"></script>

We’ll replace this block with a single script tag that references all three files.
The .js extension is removed and the file names are separated by ampersands:

<script src="script/?file1&file2&file3"></script>


We now require an index.php file within our script folder. The first code block transforms the GET parameters into an array (file1, file2, file3) and initializes the variables:

<?php
// fetch JavaScript files to compress
$jsfiles = array_keys($_GET);
$js = '';      // code to compress
$jscomp = '';  // compressed JS
$err = '';     // error string

We now loop through the JS files, read the content, and append it to the $js string. If a file cannot be found or read, its name is appended to the $err string:

// fetch JavaScript files
for ($i = 0, $j = count($jsfiles); $i < $j; $i++) {
  $fn = $jsfiles[$i] . '.js';
  $jscode = @file_get_contents($fn);
  if ($jscode !== false) {
    $js .= $jscode . "\n";
  }
  else {
    $err .= $fn . '; ';
  }
}

If any files are missing, we can generate a JavaScript alert to inform the developer:

if ($err != '') {
  // error: missing files
  $jscomp = "alert('The following JavaScript files could not be read:\n$err');";
}

If there are no errors and we have some JavaScript code, we can proceed with the compression. The $apiArgs array contains a list of Closure Compiler API options — you can add, remove, or modify these as necessary. The arguments are encoded and appended to the $args string:

else if ($js != '') {

  // REST API arguments
  $apiArgs = array(
    'compilation_level' => 'ADVANCED_OPTIMIZATIONS',
    'output_format' => 'text',
    'output_info' => 'compiled_code'
  );

  $args = 'js_code=' . urlencode($js);
  foreach ($apiArgs as $key => $value) {
    $args .= '&' . $key . '=' . urlencode($value);
  }

We can now call the Closure Compiler API using PHP’s cURL library. The compressed JavaScript is returned to the $jscomp string:

  // API call using cURL
  $call = curl_init();
  curl_setopt_array($call, array(
    CURLOPT_URL => 'https://closure-compiler.appspot.com/compile',
    CURLOPT_POST => 1,
    CURLOPT_POSTFIELDS => $args,
    CURLOPT_RETURNTRANSFER => 1,
    CURLOPT_HEADER => 0,
    CURLOPT_FOLLOWLOCATION => 0
  ));
  $jscomp = curl_exec($call);
  curl_close($call);

Finally, we return our compressed code to the browser with the appropriate MIME type:

}

// output content
header('Content-type: text/javascript');
echo $jscomp;
?>

Download the Code

Save yourself some typing and download the example code. It includes a small JavaScript library that shrinks to a third of its original size and is delivered in a single HTTP request rather than several.

Over to You …

You can now adapt this basic code to implement features such as:

  • Error handling — Your code should check for API call failures or compression problems reported by the Closure Compiler.
  • Caching — Once you have the compressed code you can save it to a file, so there’s no need to repeatedly call the API. You could compare creation dates to check whether a JavaScript file has changed since it was last cached.
  • Browser caching — HTTP expiry headers can be set so that compressed JavaScript files are cached by the browser indefinitely. You could add a "last-updated" argument to the script tag URL to ensure more recent code is always loaded.
  • JavaScript code reports — The Closure Compiler API can be used to highlight problems not reported by browser parsers; for example, unused variables or a comma after the final item in an array.
  • Build processes — Your system could distribute uncompressed JavaScript code to a developer and compressed code to everyone else.
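
As a sketch of the file-caching idea above — the function names and paths here are illustrative, not part of the article's code — you could compare the cache file's modification time against each source file and only call the API when a source is newer:

```php
<?php
// Hypothetical cache helpers: return cached JS if still fresh, else null.
function getCachedJs($cacheFile, array $sourceFiles) {
  if (!file_exists($cacheFile)) {
    return null;
  }
  $cacheTime = filemtime($cacheFile);
  foreach ($sourceFiles as $src) {
    // re-compress if any source file is newer than the cached output
    if (!file_exists($src) || filemtime($src) > $cacheTime) {
      return null;
    }
  }
  return file_get_contents($cacheFile);
}

function saveCachedJs($cacheFile, $jscomp) {
  file_put_contents($cacheFile, $jscomp);
}
```

Before echoing the cached code you could also send Expires or Cache-Control headers so browsers cache the result as well.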

I hope you find it useful. Will you use the Closure Compiler API to automate your JavaScript compression processes? Respond via the blog entry link below.

  • jaco.nel007

    Great article. Caching of the compiled files is a MUST. If not, the performance you gain from the compressed JS files will be lost to the overhead created by compressing them via the API.

    • Absolutely (see the note at the top).

      Whether compression occurs and how you cache the compressed JavaScript is up to you. If you cache to a file, you could compare its date against those of the original source files. If any are later, you know the files must be re-compressed.

  • Jonas Lejon

    Using some safe precautions like @file_get_contents(basename($fn)); should be safer
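
    • A minimal sketch of that kind of whitelist check (the function name and pattern are illustrative, not from the article):

```php
<?php
// Hypothetical guard: accept only simple file names — letters, digits,
// underscores, and hyphens — so path traversal like "../" is rejected.
function safeJsFilename($name) {
  return preg_match('/^[\w-]+$/', $name) === 1;
}
```

      Each key from $_GET could be passed through a check like this before the .js extension is appended and the file is read.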

  • Michael Bolin

    Unfortunately, this process will be very slow because you have to POST all of your source code to Google. If you are building a substantial web application (rather than a pet project), you will quickly run into the size limitations of the Closure Compiler API/AppEngine, in which case this will not work at all.

    plovr is designed to make it possible to use the Compiler locally with minimal setup and fast recompilation. It will work alongside whichever web framework you are using: PHP, Python/Django, etc.

  • Rudie

    Much more important than compressing is client-side caching. If you don’t send the right headers, on every request the – still gigantic? – script will be downloaded again = BIIIG waste of resources.
    And that’s the hard part. And there’s nothing about that here :(

    • As mentioned, it depends on how and when you want to handle the caching. It may be to a file, database, source control system, FTP’d etc. It’s up to you.

      As for it being hard, just how difficult is: file_put_contents('mycachefile.js', $jscomp);?

      You’ve also got to read that file or determine whether the JavaScript should be re-cached but, again, that’ll be dependent on your requirements. It’s as hard as you want to make it.

  • Schepp

    You can have all this already put together in a simple PHP library, with server-side and client-side caching, which even works without cURL, here:

  • Another option: Minify 2.1.4 can now use the CC API via this class. If the API is unreachable or over quota it falls back to JSMin or a function of your choice.
