I’ve been having a ton of fun lately with crypto, web workers and sub-optimal performance from Chrome.

Specifically, decrypting large files has been crashing Chrome or running painfully slowly, whereas the same file decrypts just fine in Firefox.

The answer for me was to use progressive ciphering, which is possible using AES in its CBC mode of operation.

The first thing we need to do is to create the ‘decryptor’. The key and initialization vector have been obtained already from our server:

var cryptor = CryptoJS.algo.AES.createDecryptor(CryptoJS.enc.Base64.parse(key), {iv: CryptoJS.enc.Base64.parse(iv)});

The format of the file contents obtained from our service is a Base64 string, so we need to first break that up into chunks.

var chunkCache = {};
var progressiveCipher = null;

var startChunk = 0;
var endChunk = 0;
var i = 0;
var chunkSize = 1024; //whatever chunk size you like, ideally a multiple of 4
while (startChunk < fileBase64String.length) {
    endChunk = startChunk + chunkSize;
    chunkCache[i] = fileBase64String.slice(startChunk, endChunk);
    startChunk = endChunk;
    i++;
}

We then need to loop through each chunk in the chunkCache and process it. Each chunk must be parsed from a string into a CryptoJS WordArray.

//a bit of a convoluted loop here, but we are ensuring each chunk is sorted
//in the correct order
//this means in future we could build the chunkCache up asynchronously,
//for example using a FileReader.

var mapper = function (key) {
    return parseInt(key, 10);
};

var sorter = function (a, b) {
    return a - b;
};

_.each(_.map(_.keys(chunkCache), mapper).sort(sorter),
    function (key) {
        var chunk = CryptoJS.enc.Base64.parse(chunkCache[key]);
        var block = cryptor.process(chunk);

        if (!progressiveCipher) {
            progressiveCipher = block;
        } else {
            progressiveCipher = progressiveCipher.concat(block);
        }
    });
Now that we have iterated through and processed all the chunks, we need to perform the all-important finalization.

var result = progressiveCipher.concat(cryptor.finalize());

And that is it! We are now ready to process the decrypted object in whatever way we see fit.
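As an aside, the numeric sort in the loop above matters: object keys come back as strings, and with more than ten chunks a plain string sort would put "10" before "2". Here is a standalone sketch of the reassembly pattern without CryptoJS or Underscore (the `reassemble` helper name is mine, for illustration only):

```javascript
// Reassemble an index-keyed chunk cache in the correct order.
// Chunks may have been added out of order (e.g. by async FileReader
// callbacks), yet numeric-key sorting puts them back together correctly.
function reassemble(chunkCache) {
    return Object.keys(chunkCache)
        .map(function (key) { return parseInt(key, 10); })  // "10" -> 10
        .sort(function (a, b) { return a - b; })            // numeric, not lexical
        .map(function (key) { return chunkCache[key]; })
        .join("");
}

var cache = {};
cache[2] = "ld!";
cache[0] = "Hello ";
cache[1] = "wor";
// reassemble(cache) === "Hello world!"
```

The same shape works whether the cache is filled synchronously, as above, or chunk-by-chunk from a FileReader.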

So, just a quick one here.

I have been working on some performance issues in our application and here are some of the things I did to clean things up a bit.

  • Kill ng-animate. With fire. Just do it™.

  • Make good use of $scope.$on('$destroy') to perform cleanup.

If you are making use of $rootScope.$broadcast or jQuery’s $('#el').on('click', handler), then unhook your listeners like this:

//assign your listener to a variable in your controller or directive:
var subscription = $scope.$on("event:MyEvent", handler);

$("#myButton").on('click', handler);

//then clean up when the scope goes away:
$scope.$on('$destroy', function () {
    subscription(); //calling it will kill it.
    $("#myButton").off('click', handler); //Die jQuery handler!!!
});

  • Remove closures where possible. This will avoid ‘capturing’ unwanted variables and also improves readability.
//turn this:

funcOne().then(function (result) {
    var pants = $scope.doCrazyThingWithGlobal(myGlobalThing);
    var omg = pants * 100;
    //maybe go crazy and hook an event or 2 up here lol
    return funcTwo(result) //funcTwo stands in for whatever produces the next promise
        .then(function (anotherResult) {
            return $scope.sigh(omg, pants)
                .then(function () {
                    //..and so on, each closure capturing everything above it
                });
        });
});

//into this (named functions with explicit parameters; names are illustrative):

funcOne()
    .then(doCrazyThing)
    .then(sigh);
//I've even done behavioural things like this:

var myWorkflow = [handler1, handler2, handler3];


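That handler-array idea can be made concrete with a small reduce over promises. This is a sketch: the handlers and the `runWorkflow` helper are illustrative names, and I am using native Promises here where Angular code would typically use $q:

```javascript
// Each handler receives the previous step's result; reduce chains
// them left-to-right into one promise pipeline.
function handler1(n) { return Promise.resolve(n + 1); }
function handler2(n) { return Promise.resolve(n * 2); }
function handler3(n) { return Promise.resolve(n - 3); }

var myWorkflow = [handler1, handler2, handler3];

function runWorkflow(steps, seed) {
    return steps.reduce(function (promise, step) {
        return promise.then(step);
    }, Promise.resolve(seed));
}

// runWorkflow(myWorkflow, 5) resolves to ((5 + 1) * 2) - 3 === 9
```

Each step is a small named function, so nothing is captured between steps except what is explicitly passed along.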
I’ll post more as I find them, in the meantime keep your JavaScript safe, happy and more productive.

Sick of incurring the wrath of your team mates for pushing test-breaking code? Those days are behind you with Git hooks.

My use case: Working on front-end code breaks our Selenium based acceptance tests when I move widgets around. We make use of Gradle to run build tasks so why not run the test task before I push to origin?

Start by creating a git-hooks directory in your project root if there isn’t one already (keeping the scripts outside the $PROJECT/.git folder means they can live in version control).

Then in your terminal, inside git-hooks: touch pre-push.sh && chmod a+x pre-push.sh. Finally, from $PROJECT/.git/hooks: ln -s ../../git-hooks/pre-push.sh pre-push.

You are ready to add logic to your hook. There are plenty of examples of different tasks you can run; from linting your JavaScript to removing comments or not pushing commits with ‘WIP’ in the message.
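Those examples all share the same shape: run a check, surface the failure, and propagate the exit status so git aborts the push. A minimal sketch of that skeleton (the `run_check` helper is my own naming, not a git convention):

```shell
#!/bin/sh
# Run any check command and propagate its exit status;
# a non-zero exit from a pre-push hook makes git abort the push.
run_check() {
    "$@"
    status=$?
    if [ $status -ne 0 ]; then
        echo "pre-push check failed: $*" >&2
    fi
    return $status
}

# e.g. run_check ./gradlew test
# e.g. run_check sh -c '! git log -1 --pretty=%s | grep -q WIP'
```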

Here is my logic for running Gradle’s test task:

  git stash -q --keep-index
  ./gradlew test
  RESULT=$?
  git stash pop -q
  exit $RESULT

Voila. Git won’t push until your tests pass! You can bypass this hook if you want by using the following flag: git push origin master --no-verify

Have fun.

JavaScript Encryption and Decryption

In my previous post on JavaScript crypto we looked at uploading a file which was to be encrypted server-side, then downloaded and decrypted in the browser. In this post we will tackle both encryption and decryption, and also take a look at some nuances of the HTML5 File API.


First of all we need to get a file off the file system. In our application we are making use of Fine Uploader which takes care of lots of minutiae like chunking for us. I won’t go into detail on setting up Fine Uploader as they have very good documentation.

When a file is selected (or drag-dropped) by the user I need to intercept it, encrypt it, then send it on its merry way. Here’s the interception code:

//create a key and initializationVector ..
//this can be done by CryptoJS or your own server-side technology
//it is easy to get a file using the HTML File API, here I'm using Fine Uploader
var file = fineuploader.getFile(someId);

var reader = new FileReader();
reader.onload = function (e) {
    var encrypted = CryptoJS.AES.encrypt(
        //convert to a word array via CryptoJS ('this' is the file reader;
        //creating a WordArray from an ArrayBuffer needs CryptoJS's typed array support)
        CryptoJS.lib.WordArray.create(this.result),
        //our server generated key happens to be in Base64.
        //We need to convert it to a word array
        CryptoJS.enc.Base64.parse(key),
        {
            // iv was created on our server with the key
            iv: CryptoJS.enc.Base64.parse(initializationVector)
        });

    var blob = new Blob([encrypted], {type: file.type});
    //now give back to Fine Uploader to continue upload to server.
    //or you could store it locally via FileWriter
};
reader.readAsArrayBuffer(file);


We really need to call FileReader.readAsArrayBuffer, as using any of the other string-based variants will mess around with encoding when encrypting.
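To make the byte layout explicit, here is a standalone sketch of the kind of conversion involved: packing raw bytes into the 32-bit big-endian words that CryptoJS WordArrays use internally (the helper name is mine; recent CryptoJS builds can do this for you from a typed array):

```javascript
// Pack an ArrayBuffer's bytes into CryptoJS-style 32-bit big-endian words.
// sigBytes records the real byte length, since the last word may be partial.
function arrayBufferToWords(buffer) {
    var bytes = new Uint8Array(buffer);
    var words = [];
    for (var i = 0; i < bytes.length; i++) {
        // each word holds 4 bytes, most significant byte first
        words[i >>> 2] = (words[i >>> 2] || 0) | (bytes[i] << (24 - (i % 4) * 8));
    }
    return { words: words, sigBytes: bytes.length };
}
```

A string read would have run those same bytes through a text decoder first, silently corrupting any byte sequence that isn't valid text, which is why the ArrayBuffer route matters.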

Then we just need to construct a new Blob with the encrypted result and upload it to the server (or cloud storage of choice).


When we download from the server we need to intercept the file, decrypt the contents and then hand it off to the browser as though it is a normal file download.

Using AngularJS’s $http service:

$http({
    url: "path/to/file",
    method: "GET"
}).then(function (response) {
    var decrypted = CryptoJS.AES.decrypt(
        response.data.contents, //however your service returns the ciphertext
        //again .. we retrieved these from our server.. you may have them elsewhere
        CryptoJS.enc.Base64.parse(key),
        {
            iv: CryptoJS.enc.Base64.parse(initializationVector)
        });

    //we need to jump through a hoop or two here
    //(toArrayBuffer is a small helper that unpacks a WordArray into an ArrayBuffer)
    var blob = new Blob([new Uint8Array(toArrayBuffer(decrypted))],
        {type: response.data.mimeType});
    var url = (window.webkitURL || window.URL).createObjectURL(blob);
    var a = $("<a>")
        .attr("id", id)
        .attr("download", filename)
        .attr("href", url)
        .text("Download ready")
        .appendTo("body");

    //revoke the object url only once the download has kicked off
    a.on("click", function () {
        setTimeout(function () {
            (window.webkitURL || window.URL).revokeObjectURL(url);
        }, 0);
    });
});

There are a few things going on here:

  1. Download the raw encrypted content via $http
  2. We need to decrypt using the same key and initialisation vector as when we encrypted
  3. We then create a new Blob where the contents are converted to a Uint8Array using CryptoJS helpers
  4. The mime type is something we determined on upload and is stored on our server.
  5. We need to generate a URL via createObjectURL
  6. Then we are relying on an anchor tag with the HTML5 download attribute to get a seamless download experience (you may want to use Modernizr here)
  7. Click it and watch the file pop into your (supported) browser’s download list.
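The Modernizr-style check mentioned in step 6 can also be done by hand, since the download attribute is directly detectable on an anchor element. A small sketch (the function name is mine; it takes the document so it can be exercised outside a browser):

```javascript
// Feature-detect the HTML5 download attribute on anchor elements.
function supportsDownloadAttribute(doc) {
    return "download" in doc.createElement("a");
}

// in the browser: supportsDownloadAttribute(document)
```

If the attribute is unsupported you would fall back to opening the object URL in a new tab rather than forcing a named download.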

That’s it .. any questions please leave a comment.

This is a brief companion piece to the previous post on writing a custom directive; it covers unit testing that directive and handling its isolated scope.

I already described how to set up unit tests in a previous post so we’ll just dive right in.

First we need to set up our tests using Jasmine’s beforeEach functions:

describe('sorter', function () {
    "use strict";
    var _scope, _compile, _element, _mediator;


    beforeEach(module('app')); //load the module that registers the directive ('app' is illustrative)

    beforeEach(inject(function ($rootScope, $compile, mediator) {
        _scope = $rootScope.$new();
        _compile = $compile;
        _scope.someSortFunction = function () {};
        _mediator = mediator;
        _element = _compile("<sorter sort-by=\"name\" sort=\"someSortFunction\" column-title=\"Name\"></sorter>")(_scope);
        _scope.$digest();
    }));


Here I am injecting the services I need, including the custom mediator service described in the last post. Notice how I set an arbitrary function, someSortFunction, on the scope, then reference it in the compile statement for the directive.

This shows the power of directives where you are able to isolate the scope then re-use it across applications or modules, configuring functionality appropriately.

Next we will write some tests that exercise the logic in the directive:

it("should call the configured 'sort' function on sortColumn", function () {
    var sut = _element.isolateScope();
    var stub = sinon.stub(sut, "sort");

    sut.sortColumn();

    expect(stub.called).toBe(true);
});

Here we need to work with the isolated scope as opposed to the _scope variable we set up in the initialiser functions. It is obtained by calling _element.isolateScope(), and this is our ‘system under test’.

We are also making use of Sinon to stub the sort function we configured earlier, so we can test that it is called by our proxy sortColumn function. This is especially useful for stubbing functions on custom services like the mediator we injected earlier.
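If you haven't used Sinon before, a stub is essentially a recording function swapped in for the real one. Here is a conceptual sketch of the idea (this is NOT Sinon's actual implementation or API, just an illustration of what `sinon.stub(obj, name)` gives you):

```javascript
// Replace obj[name] with a function that records its calls,
// and keep a restore() handle to put the original back.
function makeStub(obj, name) {
    var original = obj[name];
    var stub = function () {
        stub.called = true;
        stub.callCount++;
        stub.args.push([].slice.call(arguments));
    };
    stub.called = false;
    stub.callCount = 0;
    stub.args = [];
    stub.restore = function () { obj[name] = original; };
    obj[name] = stub;
    return stub;
}
```

This is why stubbing the isolate scope's sort works: the directive's sortColumn calls whatever currently sits at scope.sort, and the stub records that call for the assertion.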

That’s it!