Massive builds are a fact of life. You start with a little bit of testing, fighting to keep your builds trim, taut and terrific. Eventually you need to add those pesky UI tests in four different browsers, more tests for three different operating systems, five different databases and so on. Coping with lots of testing and massive builds is just the reality for many development teams. If you want to test absolutely everything, you’ll eventually have to break up your build into many smaller chunks that can run in parallel. We introduced Jobs and Stages in Bamboo 2.7 to help with this.

stages.png

Instead of a Plan running all tests in serial, you now have a series of Stages that run multiple Jobs in parallel. This effectively cuts your build time to a third (or a tenth, depending on how many batches you break your build into).

More Jobs, More Problems?

Sadly, break-ups are rarely problem-free. If you break up your builds to run separately, how do you make sure they are all building and testing the same thing?

Strategies like sharing workspaces do not work with multiple remote agents, and using external repositories like Maven to share artifacts leads to concurrency issues. Bamboo 2.7 allows you to force all Jobs within a Plan to use the same source revision, but you still have to rebuild the WAR over and over in order to test anything. If that rebuild takes a long time, then suddenly the savings from breaking up your builds aren’t so great. It also means you don’t have a single artifact that you know has been through the full gamut of tests, which reduces the confidence you can have in releasing it continuously.

Enter Artifact Sharing…

To get around some of the above limitations, we decided to build Artifact Sharing as a feature in the upcoming Bamboo 3.0 release. Artifact Sharing allows you to use your Bamboo server as an artifact repository that passes specified artifacts between Jobs running on any agent. Using the Artifact Sharing feature is pretty simple, and here’s an example of how we use it in the Bamboo team.

Configuring your build script

We have a basic setup of a multi-module Maven 2 project which produces a WAR. Another separate module uses Cargo to start a Tomcat container with the WAR deployed inside it, and jWebUnit to run functional tests through Surefire.
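As a rough sketch, the parent POM ties the two pieces together. The module names below are hypothetical placeholders, not our actual ones:

```xml
<!-- Parent pom.xml sketch; module names are illustrative only -->
<modules>
  <module>webapp</module>            <!-- produces the WAR -->
  <module>functional-tests</module>  <!-- Cargo + jWebUnit tests, run through Surefire -->
</modules>
```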

First, you need to configure your build script appropriately. Using Cargo and Maven, your pom.xml may look like this. Cargo will usually fetch the artifact to deploy from a remote repository, so we need to configure it to look at a particular location on the file system instead. Look for a <deployer> snippet like:

Snippet 1.jpg
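For reference, a typical cargo-maven2-plugin deployer block looks roughly like the following sketch; the groupId and artifactId of the deployable here are hypothetical placeholders:

```xml
<!-- Sketch of a typical Cargo deployer block; deployable coordinates are placeholders -->
<plugin>
  <groupId>org.codehaus.cargo</groupId>
  <artifactId>cargo-maven2-plugin</artifactId>
  <configuration>
    <deployer>
      <deployables>
        <deployable>
          <groupId>com.example</groupId>
          <artifactId>example-webapp</artifactId>
          <type>war</type>
        </deployable>
      </deployables>
    </deployer>
  </configuration>
</plugin>
```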

and replace the groupId and artifactId with a location tag:

Snippet 2.jpg
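As a sketch of the result, the deployable now resolves from a file-system path rather than repository coordinates, using the bamboo.war.override property described below:

```xml
<!-- Sketch: the deployable now points at a file on disk -->
<deployable>
  <location>${bamboo.war.override}</location>
  <type>war</type>
</deployable>
```

Maven resolves ${bamboo.war.override} from a system property, so passing something like -Dbamboo.war.override=/path/to/app.war on the command line points Cargo at a locally available WAR.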

Now specify the bamboo.war.override property when running the testing module to tell Cargo where to find the WAR to deploy.

Configuring Bamboo

On the Bamboo side, simply configure a Stage with one Job that produces the WAR, followed by any number of Stages with multiple Jobs that consume the WAR and perform additional tests.

newstages.png

For the first Stage, create a Job that produces the WAR (a goal of mvn package will do). On the Artifacts tab, configure the Artifact Definition and share that artifact. Sharing an artifact means it’s available for subsequent Jobs to use and depend on.

sharing.png

The second “Functional Test” Stage is where you have Jobs that test the WAR. In the artifact definition, choose which artifacts the Job depends on: that is, which artifacts you want to copy down before the build starts and which directory to copy them to. In your builder configuration, pass bamboo.war.override as a system property to the builder, so your Maven goal may look like test -Dtestng.suites=batch-1.xml -Dbamboo.war.override=atlassian-bamboo-web-app-3.0-SNAPSHOT.war

artifacts-consume.png

When you kick off the Plan, the first Job will produce the WAR and upload it to Bamboo, making it available to Jobs in subsequent Stages. The next Stage will then kick off, and each of its Jobs will download the WAR from Bamboo using a secure token (or through an SSH tunnel from EC2). The previously configured Cargo plugin will pick up the WAR, start it in a Tomcat container and run functional tests against it. Since the Jobs running the tests are completely independent of one another, they can run on different agents and completely in parallel. Each subsequent Stage continues in the same way. When all Stages have completed successfully, the WAR from the original Stage is guaranteed to have been thoroughly tested.

This approach means we can test a single artifact without rebuilding it over and over. That saves time and ensures that the artifact tested in all Jobs is the exact same artifact your customers will be using. Each artifact is isolated to its own build, so you know exactly what you’re testing and have no chance of suffering concurrency issues. It also works on any agent topology, whether your agents are local, remote or on EC2, which means you can spread your build load across remote servers without worrying.

Continuous Deployment and More!

You can extend the same technique beyond distributed testing. Configure another Stage to SCP the artifact to a remote server for continuous releases, or add a Job scripted to automatically deploy the artifact to a QA system for continuous deployment. There are some real possibilities here.

If you’re interested in trying this out and helping us build a better solution for you, please drop us an email or check out our EAP!