scriptcs: Acceptance tests with XBehave.net

One of the challenges we face when merging pull requests in scriptcs is making sure all the features still work. Running our unit tests for every pull request definitely helps catch many mistakes, but others go unnoticed because they stem from integration issues (how the components interact, the file system, assembly versions).

Fortunately, scriptcs can be tested easily because it does not have external dependencies (other than NuGet) such as databases, web services, etc. That's why we decided to create some acceptance tests.

An initial approach

Running scriptcs --install takes the packages.config file in the current working directory and installs all the packages specified in it. The first attempt at testing this was something along these lines:
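
The sample below is a sketch: the package being installed, the temporary paths, and the way the scriptcs executable is launched are assumptions for illustration, not the original code.

    using System;
    using System.Diagnostics;
    using System.IO;
    using Xunit;

    public class InstallCommandTests : IDisposable
    {
        // Illustrative: a throwaway working directory for the test run.
        private readonly string _workingDirectory =
            Path.Combine(Path.GetTempPath(), Guid.NewGuid().ToString("N"));

        public InstallCommandTests()
        {
            // Create the working directory and drop a packages.config into it.
            Directory.CreateDirectory(_workingDirectory);
            File.WriteAllText(
                Path.Combine(_workingDirectory, "packages.config"),
                "<?xml version=\"1.0\" encoding=\"utf-8\"?>\n" +
                "<packages>\n" +
                "  <package id=\"ScriptCs.Nancy\" version=\"0.1.0\" targetFramework=\"net45\" />\n" +
                "</packages>");
        }

        [Fact]
        public void InstallCreatesThePackagesDirectoryWithTheRequestedPackages()
        {
            // Run scriptcs --install from the working directory.
            var scriptcs = Process.Start(new ProcessStartInfo
            {
                FileName = "scriptcs",
                Arguments = "--install",
                WorkingDirectory = _workingDirectory,
                UseShellExecute = false
            });
            scriptcs.WaitForExit();

            // The run should succeed and leave a packages directory behind,
            // with one folder per installed package.
            Assert.Equal(0, scriptcs.ExitCode);
            var packagesDirectory = Path.Combine(_workingDirectory, "packages");
            Assert.True(Directory.Exists(packagesDirectory));
            Assert.NotEmpty(Directory.GetDirectories(packagesDirectory));
        }

        public void Dispose()
        {
            // Clean up everything the test created.
            Directory.Delete(_workingDirectory, recursive: true);
        }
    }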

The above sample uses xUnit.net, takes advantage of the Dispose method to do the cleanup, and verifies that the correct packages are installed. Nevertheless, we can take this for another spin and see whether we can get a test that is even easier to read.

Enter XBehave

After I shared the above sample on Twitter for feedback, @glennblock brought XBehave to the table; it is being used in his (and others') Web API book.

According to its GitHub site, XBehave is "a BDD/TDD framework based on xUnit.net and inspired by Gherkin." The goal of this blog post is not to explain how to use XBehave, but if you are interested, the project site has a quickstart page and some useful docs to get you started with it.

The nice thing about it is that you can write a spec in plain text and then map it to actual code. For example, the spec for this feature could be something like this:

Given a current working directory
And a packages.config file located in the current directory
When the packages are installed
Then the program executes successfully
And a packages directory is created inside the working directory
And a directory for the package and each of its dependencies is created

The resulting test case is the following one:
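
As with the earlier sample, what is shown here is a sketch rather than the exact original test: the package, paths, and process invocation are illustrative, and the Given/When/Then/And string extensions with the chained Teardown reflect the xBehave.net API of that era (newer versions express steps with .x(...) instead).

    using System;
    using System.Diagnostics;
    using System.IO;
    using Xbehave;
    using Xunit;

    public class InstallCommandScenarios
    {
        [Scenario]
        public void InstallingPackages(string workingDirectory, Process scriptcs, string packagesDirectory)
        {
            "Given a current working directory"
                .Given(() =>
                {
                    workingDirectory = Path.Combine(Path.GetTempPath(), Guid.NewGuid().ToString("N"));
                    Directory.CreateDirectory(workingDirectory);
                })
                .Teardown(() => Directory.Delete(workingDirectory, recursive: true));

            "And a packages.config file located in the current directory"
                .And(() => File.WriteAllText(
                    Path.Combine(workingDirectory, "packages.config"),
                    "<?xml version=\"1.0\" encoding=\"utf-8\"?>\n" +
                    "<packages>\n" +
                    "  <package id=\"ScriptCs.Nancy\" version=\"0.1.0\" targetFramework=\"net45\" />\n" +
                    "</packages>"));

            "When the packages are installed"
                .When(() =>
                {
                    // Run scriptcs --install from the working directory.
                    scriptcs = Process.Start(new ProcessStartInfo
                    {
                        FileName = "scriptcs",
                        Arguments = "--install",
                        WorkingDirectory = workingDirectory,
                        UseShellExecute = false
                    });
                    scriptcs.WaitForExit();
                });

            "Then the program executes successfully"
                .Then(() => Assert.Equal(0, scriptcs.ExitCode));

            "And a packages directory is created inside the working directory"
                .And(() =>
                {
                    packagesDirectory = Path.Combine(workingDirectory, "packages");
                    Assert.True(Directory.Exists(packagesDirectory));
                });

            "And a directory for the package and each of its dependencies is created"
                .And(() => Assert.NotEmpty(Directory.GetDirectories(packagesDirectory)));
        }
    }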

Benefits

The things I like better about the XBehave test are:

  1. The Teardown method is closer to the code that creates the items that must be cleaned up.
  2. Each line of code sits under its related spec string. Before, the lines could have been anywhere, but having the strings there helped me realize where it made the most sense to put them.
  3. The scenario under test is easier to understand for everyone reading the test.