After playing around with Protractor for a while I wanted to create some integration tests that bootstrap the entire server/database before each test, so that every test starts from a consistent state. The problem is that we have to wait for everything to be set up before running the tests. With Jasmine as the test framework, the obvious solution would be to use the async callback in beforeEach and do something like:

beforeEach((done) ->
    # Tell Jasmine the async setup is finished once the server is up
    server.start(() ->
        done()
    )
)

and server.start:

{ spawn } = require('child_process')

start = (done) ->
    # Launch the app in a child process and wait for its ready message
    server = spawn('coffee', [ 'server.coffee' ])
    server.stdout.on('data', (data) ->
        if data.toString() == 'App started\n'
            done()
    )

Simply put, start spawns a new process that runs our server with the command ‘coffee server.coffee’, listens to its stdout for the log output ‘App started\n’, and then invokes the done callback (this is of course purely an example).
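
For reference, the only contract the spawned process has to fulfil here is printing ‘App started’ on stdout once it is ready. A minimal server.coffee that satisfies it might look like this (the Express app and the port are assumptions for illustration, not from the original setup):

express = require('express')

app = express()

app.listen(3000, () ->
    # console.log appends the newline, so stdout receives 'App started\n'
    console.log('App started')
)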

The problem is that Protractor overrides the async callback in beforeEach and never waits for our server to actually start. The reason is that Protractor handles all the async calls of WebdriverJS for you by maintaining its own promise-based control flow, which overrides Jasmine's async callback. So instead of relying on the Jasmine callback, we need to make server.start return a promise and add that promise to the Protractor control flow.

First we need to install q (a promise library) in our project:

npm install q

Then we modify server.start to return a promise instead of taking a callback:

Q = require('q')
{ spawn } = require('child_process')

start = () ->
    deferred = Q.defer()
    server = spawn('coffee', [ 'server.coffee' ])
    server.stdout.on('data', (data) ->
        # Resolve the promise once the server reports that it is up
        if data.toString() == 'App started\n'
            deferred.resolve()
    )

    return deferred.promise
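
Since spawn gives us a handle to the child process, the same pattern works for teardown. Here is a sketch of a stop helper (the name and the module-level server variable are my assumptions; the original example only shows start):

# Assumes `server = null` is declared at module scope above `start`,
# so the handle from spawn is visible here as well
stop = () ->
    deferred = Q.defer()
    # Resolve once the child process has actually exited
    server.on('exit', () ->
        deferred.resolve()
    )
    server.kill()
    return deferred.promise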

Finally, we add that promise to the Protractor control flow in the beforeEach function:

beforeEach(() ->
    # Queue the server start on Protractor's control flow so the tests
    # wait for the returned promise to resolve before running
    protractor.promise.controlFlow().execute(() ->
        return server.start()
    )
)

Voila! Now Protractor waits for our server to actually start before running the tests. Using this approach we can create integration tests that bootstrap the server before each test and completely isolate tests from each other, enabling better test scenarios that don't depend on the order in which the tests are run.
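
Putting it all together with the stop helper sketched above, a complete spec could look like this (the helper path, URL, and page title are assumptions for illustration):

server = require('./server_helper')

describe('the app', () ->
    beforeEach(() ->
        protractor.promise.controlFlow().execute(() ->
            return server.start()
        )
    )

    afterEach(() ->
        # Shut the server down so the next test gets a fresh instance
        protractor.promise.controlFlow().execute(() ->
            return server.stop()
        )
    )

    it('shows the start page', () ->
        browser.get('http://localhost:3000/')
        expect(browser.getTitle()).toBe('My App')
    )
)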