It’s quite hard to write maintainable JavaScript code – or rather, it is very easy to get it wrong. The problem is that JavaScript was not designed for scale: there is no built-in notion of modules or packages, undeclared variables go straight into the global scope (in fact, it’s very tempting to use the global scope all the time), and the lack of enforced structure encourages long, complex chunks of code.

Imagine a team of a dozen developers struggling with files stuffed with thousands of lines of spaghetti code and you can see where it all leads (perhaps – in the worst case – you don’t even need to use your imagination).

Despite its deficiencies, JavaScript is a great language, and fortunately – with a small dose of discipline and effort – we can overcome its limitations using appropriate libraries, frameworks and practices.


The first step towards maintainability is to take a modular approach. A module is a small unit of code responsible for handling one task within a given functional area. Modules should be isolated from one another, have a clearly defined purpose and communicate only through a well-defined public interface.

Modular organization provides obvious benefits over the global scope: since all internals of a module live in their own scope, there is less chance of accidental name clashes with other pieces of code. What’s more, it discourages accessing another module’s internal data directly, thus eliminating tight coupling. Besides that, code with a clear structure is far easier for other team members – especially new ones – to comprehend.

Plain JavaScript

In order to implement simple modularity we don’t really need any third-party library or framework – plain JavaScript with the Self-Executing Anonymous Function pattern (also known as an Immediately-Invoked Function Expression, or IIFE) should be enough:

var feature = (function(globalScope, jQuery /*, other dependencies… */) {
    // internals:
    var foo, bar;
    var method = function() { /* ... */ };
    var method2 = function() { /* ... */ };

    // public interface:
    return {
        publicFoo: foo,
        publicMethod: method
    };
})(window, $ /*, other dependencies… */);

In the example above the “feature” object exposes only two public properties – the rest is hidden. Note also that it’s good practice to list all necessary global objects (like “window” or jQuery) as explicit function parameters – this way our code has clearly defined inputs and outputs and no hidden dependencies, which would make it harder to test.
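To see the hiding in action, here is a minimal, dependency-free variant of the same pattern (the “counter” module and its names are invented purely for illustration):

```javascript
// A self-executing anonymous function returning only the public interface:
var counter = (function () {
    // internal state, invisible from the outside:
    var count = 0;
    var increment = function () { count += 1; return count; };

    // public interface:
    return {
        increment: increment
    };
})();

console.log(counter.increment()); // 1
console.log(counter.increment()); // 2
console.log(counter.count);       // undefined – the internal variable stays hidden
```

Only what the returned object exposes is reachable; everything else is trapped in the function’s scope.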

AMD style modules

There are of course many libraries we can use to organize our code into modules – one of them is RequireJS, which as an additional benefit offers automatic loading of dependencies.

In the example below we declare a module called “moduleA” which depends on modules B and C; their code will be loaded from the files “./moduleB.js” and “./moduleC.js”, and references to them will be passed as parameters to the provided function:

define("moduleA", ["moduleB", "moduleC"], function(B, C) {
    return { /* ... */ };
});

And if we only want to use some modules without actually creating a new one:

require(["moduleB", "moduleC"], function(B, C) { /* ... */ });

CommonJS modules

The previous example followed the Asynchronous Module Definition (AMD) format, but it’s worth mentioning another standard called CommonJS, which is popular in server-side applications (most notably Node.js). Here each module is contained in its own file and uses its module-local “exports” object to register its public properties:

File “feature.js”:

exports.xxx = ...
exports.yyy = ...

In order to use a module we have to request it explicitly:

var feature = require('./feature');

AngularJS modules

And finally an example for AngularJS, with a declaration of a module A depending on two other modules B and C:

angular.module('moduleA', ['moduleB', 'moduleC']);

An important note here is that AngularJS, despite having a similar style to RequireJS, doesn’t do any code loading: it provides only a dependency injection mechanism, and it is the application developer’s responsibility to ensure that all code is loaded before it is needed.
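In practice this usually means including the scripts by hand, in dependency order, before the application bootstraps (the file names below are illustrative):

```html
<script src="angular.js"></script>
<script src="moduleB.js"></script>
<script src="moduleC.js"></script>
<script src="moduleA.js"></script>
```

If “moduleA.js” were loaded before its dependencies, Angular would fail at bootstrap time with an unknown-module error.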

Loose coupling

So far we have seen how modules can call each other through references, but there is also a higher level of loose coupling, which doesn’t even require that modules know of each other.

In this approach communication consists of publishing and receiving messages – a model especially useful when we need to distribute data to many recipients, and one that allows us to avoid a star topology, i.e. one module keeping references to all of the others. Here are some examples of how we can implement this model using various frameworks:

jQuery events

// publish:
$.event.trigger("prefix.event_name", ["arg1", "arg2"]);

// receive:
$(document).on("prefix.event_name", function(event, arg1, arg2) { /* ... */ });

The downside here is that jQuery requires the publish/subscribe mechanism to be bound to some DOM element (“document” in this example). Fortunately other libraries are free from such limitations – below are the fairly self-explanatory equivalents for AmplifyJS and AngularJS:


// AmplifyJS:
amplify.publish("event", "arg1", "arg2");
amplify.subscribe("event", function(arg1, arg2) { /* ... */ });

// AngularJS (note that the handler receives the event object first):
$rootScope.$broadcast("event", "arg1", "arg2");
$scope.$on("event", function(event, arg1, arg2) { /* ... */ });
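The model itself is tiny – here is a dependency-free sketch of such a message hub, just to show what the libraries above do under the hood (the “hub” object and event names are invented for illustration, this is not any real library’s API):

```javascript
// A minimal stand-alone publish/subscribe hub:
var hub = (function () {
    var handlers = {};  // event name -> list of subscribed callbacks

    return {
        subscribe: function (event, fn) {
            (handlers[event] = handlers[event] || []).push(fn);
        },
        publish: function (event) {
            var args = Array.prototype.slice.call(arguments, 1);
            (handlers[event] || []).forEach(function (fn) {
                fn.apply(null, args);
            });
        }
    };
})();

// neither side needs a reference to the other – only to the hub:
hub.subscribe("user.loggedIn", function (name) {
    console.log("welcome, " + name);
});
hub.publish("user.loggedIn", "Alice"); // prints "welcome, Alice"
```

Publisher and subscriber are coupled only to the hub and to the agreed event name, not to each other.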


Asynchronous event handlers constitute another area that can very easily get messy: chained Ajax calls often result in nested inline callback functions going on for hundreds of lines, with the indentation level reaching the right edge of a wide-screen monitor. To increase the readability and maintainability of such code we can use promises, chaining them together while staying at the same level of indentation.

A promise is a placeholder for a real result that may arrive at some later point, and applying a transformation function to a promise yields another promise returning the transformed result. This way, when a promise is eventually filled with the result, the value flows down the chain with a transformation applied at each stage.
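That value-flows-down idea can be shown in a few lines using the standard Promise object built into modern JavaScript engines (the values here are illustrative only):

```javascript
Promise.resolve(2)                                 // a promise already filled with 2
    .then(function (n) { return n * 10; })         // -> a promise of 20
    .then(function (n) { return "result: " + n; }) // -> a promise of "result: 20"
    .then(function (s) { console.log(s); });       // prints "result: 20"
```

Each .then() wraps its return value in a new promise, which is what makes flat chaining possible.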

Let’s see some example code taking advantage of the Q.js library:

doFirstAjaxCall()       // assume this returns a promise
.then(function(response1) {
    // this will run only after receiving the response from the first call;
    // returning another promise here continues the chain:
    return doSecondAjaxCall(response1); // hypothetical follow-up call
})
.then(function(response2) { /* ... */ })
.fail(function(error) {
    /* any error breaks the chain and can be handled here */
})
.fin(function() {
    /* finally: clean up resources */
})
.done(); // -> ends the chain

But how do we get the first promise to start the chain at all? Although it’s possible to wrap an ordinary XMLHttpRequest in a promise ourselves, fortunately popular libraries/frameworks like jQuery or Angular provide their own implementations of promises, so instead of falling into callback hell like here:

$.get("/resource1", function(response1) {
    $.get("/resource2/" + response1, function(response2) {
        /* ... and deeper it goes */
    });
});

we can write it in the more promise-friendly (and eye-friendly) way:

$.get("/resource1")
.then(function(response1) {
    return $.get("/resource2/" + response1);
})
.then(function(response2) { /* ... */ });