Using GNU Make as a Front-end Development Build Tool


The popularity of CSS preprocessors like Sass and task runners like Grunt has made a build process an accepted part of front-end development. There’s no shortage of options or opinions in the task-runner/build-tool space, with the most popular being Grunt and Gulp.

I write a decent amount of JavaScript at work and in personal projects, but I avoid using either Grunt or Gulp unless I absolutely have to. I prefer GNU Make, for a number of reasons:

  • It unlocks the power of the Unix shell; STDIN and STDOUT are extremely versatile.
  • It’s already available in all environments I use.
  • It either has all the tools I need, or allows them to be accessed without unnecessary wrapping modules binding me to whatever version is available for my build tool of choice.

I’m not alone in suggesting you likely don’t need all those fancy build tools. What I do see a lack of, however, is practical introductions to Make as a front-end development build tool. While I’d encourage you to RTFM, I’m not going to point you at a 186-page manual for software written in the ’70s and say “off ya go!”.

So, let’s look at Make and see how it can do an efficient job of building our CSS and JavaScript assets.

Build Tools vs. Task Runners

When I had grown entirely sick of programming via declarative JSON and was re-implementing asset compilation in a Makefile, I brought with me the notion of naming tasks — names like ‘setup’, ‘css’, and ‘assets’. My Makefile targets looked something like this:

setup:
    bower install
    mkdir -p htdocs/assets/css
    mkdir -p htdocs/assets/js

I’d brought the task-runner mentality to Make. The correct use is to instead have targets paired with prerequisites, allowing Make to only build new files when the sources had changed.
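To make the distinction concrete, here’s a quick sketch contrasting the two styles (the file names are hypothetical). The phony task runs every time you invoke it; the file target runs only when its prerequisite is newer than the output:

```make
# task-runner style: 'css' is just a name, not a file, so this always runs
.PHONY: css
css:
	sassc -t compressed -o htdocs/css/app.css assets/css/app.scss

# build-tool style: runs only when the target is missing or older than the source
htdocs/css/app.css: assets/css/app.scss
	sassc -t compressed -o htdocs/css/app.css assets/css/app.scss
```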

Makefile Rule Basics

To use Make to its full potential, you need to properly frame your desired results as build targets, their prerequisites (dependencies), and the recipe to turn those dependencies into the intended output. For example:

htdocs/robots.txt: support-files/robots.txt
    cp support-files/robots.txt htdocs/robots.txt

An intentionally simplistic example, but it contains the core constructs of a Makefile rule. On the first line we have the target (htdocs/robots.txt). After the colon, we have the target’s prerequisites. When make parses this Makefile, it will read this rule and interpret it as “if the source has been changed since the target was generated, re-generate the target”. To regenerate the target, it executes the tab-indented lines under the target: source pair, called a recipe (Makefile recipes MUST be tab-indented, not space-indented).
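That timestamp comparison is the heart of Make. As a rough sketch, here’s the check it performs for the rule above, expressed in plain shell (using throwaway paths created here, so it’s safe to run anywhere):

```shell
# Mimic make's freshness check for one rule, using test(1)'s -nt operator.
mkdir -p demo/support-files demo/htdocs
echo "User-agent: *" > demo/support-files/robots.txt

build() {
  # rebuild only if the target is missing or older than its prerequisite
  if [ ! -e demo/htdocs/robots.txt ] || \
     [ demo/support-files/robots.txt -nt demo/htdocs/robots.txt ]; then
    cp demo/support-files/robots.txt demo/htdocs/robots.txt
    echo rebuilt
  else
    echo up-to-date
  fi
}

build   # first run: target missing, so it copies and prints "rebuilt"
build   # second run: target is fresh, so it prints "up-to-date"
```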

The simple example above seems a bit… well, terrible really. Quite a bit of typing just to copy a file. Make improves this by providing Automatic Variables to use within recipes. I encourage you to read all of the available Automatic Variables, but the ones I use most often are:

  • $@ – the target filename
  • $< – the filename of the first prerequisite
  • $? – space-delimited list of all prerequisites newer than the target
  • $^ – space-delimited list of all prerequisites

Using Automatic Variables, our previous example becomes:

htdocs/robots.txt: support-files/robots.txt
    cp $< $@

Say we then add a humans.txt to support-files, and want that published to htdocs. We don’t want to copy and paste the previous rule, but we can improve it by using stems in the target and prerequisite:

htdocs/%.txt: support-files/%.txt
    cp $< $@

Now we’re getting somewhere. The above will copy across any text files, but only when the source file changes. If you were to run it twice in a row, on the second run it would exit without copying any files.
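One thing worth noting: a pattern rule on its own builds nothing — something has to ask for a matching target. A common approach (a sketch; the variable names are my own) is to enumerate the outputs from the sources and hang them off a default target:

```make
TXT_SRC := $(wildcard support-files/*.txt)
TXT_OUT := $(patsubst support-files/%,htdocs/%,$(TXT_SRC))

# the first target in a Makefile is the default, so plain `make` builds them all
all: $(TXT_OUT)

htdocs/%.txt: support-files/%.txt
	cp $< $@
```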

Compiling Sass

Let’s take what we’ve covered and make it useful. Assume we have the following directory structure:

|_ assets
|  |__ css
|  |__ js
|_ htdocs
|  |__ css
|  |__ js

Our source files live in assets, and our compiled results are served from the htdocs directory by our HTTP server. Next is the Make rule for generating our CSS from the Sass source using SassC:

htdocs/css/%.css: assets/css/%.scss
    sassc -t compressed -o $@ $?

A single-command recipe, with a single prerequisite. It instructs Make: “If the source Sass file is newer than the build target file, run this sassc command with the target ($@) supplied as the output (-o option) and the prerequisite as the source”.

If you run this, you will see the command printed to the terminal, as well as any output from the sassc binary. That gets noisy fast, so we can suppress that command-echoing behavior by prepending the command with @. Let’s step it up a notch and make this recipe more interesting:

htdocs/css/%.css: assets/css/%.scss
    @echo Compiling $@
    @mkdir -p $(@D)
    @sassc -t compressed -o $@ $?
    @node_modules/.bin/autoprefixer $@

The above recipe performs these steps:

  • Echo the destination file name so we can see what’s being compiled
  • Make the directory for the target file if it doesn’t already exist (including subdirectories!). This introduces a new Automatic Variable – $(@D) – the directory of the target file.
  • Compile the CSS from the Sass source
  • Use autoprefixer to add any required vendor prefixes to our CSS so we don’t have to put them in our Sass.

We’ll use that same approach to minify our JavaScript. You could also transpile your CoffeeScript or ES6 JavaScript if you’re using those.

htdocs/js/%.js: assets/js/%.js
    @echo Processing $@
    @mkdir -p $(@D)
    @node_modules/.bin/jsmin --level 2 --output $@ $?

I’m using Node in both the JS and CSS build processes, because that’s where the best tools for front-end development are. But that doesn’t mean I need to wrap those Node modules in Grunt or Gulp wrappers and run them inside another Node program with its own syntax.

Multiple Prerequisites

Every Sass project I’ve worked on has had variables defined in a file that needs to be imported into other Sass files. So we should account for global dependencies in our Sass compilation example.

htdocs/css/%.css: assets/css/%.scss assets/css/_settings.scss
    @echo Compiling $@
    @mkdir -p $(@D)
    @sassc -t compressed -o $@ $<
    @node_modules/.bin/autoprefixer $@

The changes to note are that we’ve now added _settings.scss to the list of prerequisites, and we’ve changed the call to sassc to use the first prerequisite’s file name ($<) as its input instead of the entire prerequisite list. Now if either the Sass file with the matching stem (% wildcard) or the _settings.scss file changes, the target CSS file will be recompiled.

Because our stem in the first prerequisite can match sub-directories, we need to make sure that the sub-directory exists in our htdocs/css directory. We do this using mkdir and the $(@D) automatic variable, which contains the directory section of the path to the target. You don’t have to use $(@D); you’d get the same result calling out to the shell with $(shell dirname $@) — the automatic variable is just a convenience.
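If you want to see what $(@D) would expand to for a nested target, dirname(1) performs the same split in plain shell (the path here is just an example):

```shell
# $(@D) in a recipe is the directory part of the target path;
# dirname(1) does the same job at the command line
dirname htdocs/css/admin/app.css   # prints htdocs/css/admin
```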


There’s too much repetition for my liking. If we change the directory structure of our project, we’ll have to go and change every instance of those directory names. Let’s clean it up by replacing the various directories with variables:

SASS_SRC := assets/css
JS_SRC   := assets/js
CSS_DIR  := htdocs/css
JS_DIR   := htdocs/js
BIN      := node_modules/.bin

$(CSS_DIR)/%.css: $(SASS_SRC)/%.scss $(SASS_SRC)/_settings.scss
    @echo Compiling $@
    @mkdir -p $(@D)
    @sassc -t compressed -o $@ $<
    @$(BIN)/autoprefixer $@

$(JS_DIR)/%.js: $(JS_SRC)/%.js
    @echo Processing $@
    @mkdir -p $(@D)
    @$(BIN)/jsmin --level 2 --output $@ $?

The two new concepts here are the setting of variables using the := operator, and using variables in Make rules by wrapping them in parentheses and prefixing with a $. You don’t have to align the variable assignments like I have, that’s merely personal preference.
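One wrinkle worth knowing about := specifically: it’s a simple (immediate) assignment, whereas plain = is recursive and re-expands every time the variable is used. A quick sketch of the difference:

```make
# recursive: the shell command runs each time $(NOW) is referenced
NOW  = $(shell date +%s)

# simple: the shell command runs once, when the Makefile is parsed
ONCE := $(shell date +%s)
```

For plain path variables like the ones above the two behave identically; := just avoids surprises once variables reference other variables or shell calls.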

Bash at Your Fingertips

I’m wary of giving the impression that your Make rules must always call other installed programs to do the actual work. To demonstrate otherwise, let’s use core GNU utilities to save our users and our servers multiple HTTP requests when fetching the JavaScript libraries used by our web page.

JS_LIB_FILE := $(JS_DIR)/libs.js
JS_LIBS := bower_components/jquery/dist/jquery.min.js \
    bower_components/lodash/dist/lodash.min.js

$(JS_LIB_FILE): $(JS_LIBS)
    cat $^ > $@
The result: if any of those source libraries change (e.g. new versions become available from Bower), libs.js is rebuilt from all of them. Nice!

As a final example, here’s something more complicated, still using standard utilities — generating a manifest of all the filenames and the checksums of those files. We use a variation of this at Swiftly and other 99designs products to provide our server with the data it needs to write cache-busting asset links on a per-file basis (similar to Sprockets for Ruby).

DIST := htdocs/assets

# if any compiled asset changes, regenerate the manifest
$(DIST)/.manifest: $(shell find $(DIST) -type f \( -name '*.css' -or -name '*.js' \))
    find $(DIST) -type f -exec cksum {} \; | sed -e "s#$(DIST)/##" | cut -f1,3 -d" " > $@

The recipe is a little bit of a mind-bender at first, but remember you only have to run man find at the command line to learn all about find, man sed for sed, or man cut for cut. Each of the three stages can also be executed on its own to learn more about it, as the pipeline simply passes plain text between them via STDOUT and STDIN.
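To get a feel for the pipeline, you can run it against a couple of throwaway files (the paths here are invented for the demo, not the article’s real tree):

```shell
# build a tiny fake asset tree
mkdir -p dist/css dist/js
printf 'body{color:red}' > dist/css/app.css
printf 'var x=1;'        > dist/js/app.js

# cksum prints "<checksum> <byte count> <filename>" for each file;
# sed strips the dist/ prefix, and cut keeps fields 1 and 3
find dist -type f -exec cksum {} \; | sed -e "s#dist/##" | cut -f1,3 -d" "
```

Each output line pairs a checksum with a path relative to dist/, which is exactly the manifest format the rule above writes to $(DIST)/.manifest.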

Summing Up

I can appreciate that some developers already familiar with JavaScript would prefer a JavaScript build tool, insisting that they don’t want to have to learn “yet another thing”. As I’ve hopefully demonstrated here, for our purposes, Make is little more than a target/prerequisite/recipe syntax with some convenient variables populated for us and direct access to shell scripting to perform any processing we need.

Every web developer should be comfortable performing basic file tasks in the shell, so if you’ve been holding off, maybe this is a good chance to give yourself a little push in the right direction.

Andrew Krespanis

Andrew is a UI developer at 99designs.
