Testing your TFS 2012 build workflow activities

I recently posted about setting up a solution for editing build workflows with TFS 2012. Today I'm going to jump straight to testing your custom activities, because good guides about writing them are already out there (even though they talk about TFS 2010, the logic is much the same). We'll look at how to test activities without actually starting a real build.

I'm currently writing an activity for launching Sonar during TFS builds (you can learn about Sonar with TFS here). Although the work isn't finished yet, I can already share a few tips I learned while testing activities.

Classic testing approach

The approach is a classic one: set up a test context, exercise the code under test, check the results (and clean up).

The tests rely on the ability, given by Windows Workflow Foundation, to host your own workflow engine. So we mainly need two things:

  • Create the right workflow context to emulate the behavior of a build (only what's needed for the test, of course)
  • Create an instance of the activity, passing necessary parameters and stubs to fulfill the test

During our test, the workflow engine will run the activity. The activity will use objects and values from its parameters and interact with the context. Then we can check what has actually happened. Ready?

Workflow activities testing tips

Creating the activity and executing it

You can instantiate a workflow activity with a classic new statement. Literal parameters can be passed in the constructor or assigned through an object initializer (literals here are value types and strings). All other objects must be passed in a Dictionary<String, Object> at invocation time.

// constants (literals)
var activity = new Sonar
{
    // this is a String (a literal)
    SonarRunnerPath = SonarRunnerPath,
    // this is a boolean (a literal as well)
    FailBuildOnError = FailBuildOnError,
    GeneratePropertiesIfMissing = GeneratePropertiesIfMissing,
    SonarPropertiesTemplatePath = TemplatePropertiesPath,
    FailBuildOnAlert = FailBuildOnAlert,
    // StringList is not a workflow literal
    // the following line will cause an exception at run time
    ProjectsToAnalyze = new StringList("dummy.sln")
};

Here all values are booleans or strings, except the StringList, which would cause an error at run time (so we must remove it from the initializer). Here's how to invoke the activity (actually a workflow composed of a single activity) and pass the StringList as an argument:

// object variables
var parameters = new Dictionary<string, object>
{
    { "ProjectsToAnalyze", new StringList("dummy.sln") }
};

// the workflow invoker; our workflow is composed of only one activity!
WorkflowInvoker invoker = new WorkflowInvoker(activity);
// executes the activity
invoker.Invoke(parameters);

Tracking build messages

You may want to check what your activity is logging, you know, when you call the TrackBuildMessage method or use the WriteBuildMessage (or WriteBuildWarning or WriteBuildError) activity. To do this you need to set up a recorder, or more exactly a TrackingParticipant. Here is a TrackingParticipant-derived class specialized in recording build messages:

A build message tracking class
namespace MyBuilds.BuildProcess.Tests
{
    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Text;
    using System.Activities.Tracking;
    using Microsoft.TeamFoundation.Build.Workflow.Tracking;
    using Microsoft.TeamFoundation.Build.Workflow.Activities;

    /// <summary>
    /// TrackingParticipant that records build messages logged during build workflow activities
    /// </summary>
    public class BuildMessageTrackingParticipant : TrackingParticipant
    {
        private StringBuilder _sb = new StringBuilder(4096);

        public override string ToString()
        {
            return _sb.ToString();
        }

        protected override void Track(TrackingRecord record, TimeSpan timeout)
        {
            var buildMessage = record as BuildInformationRecord<BuildMessage>;
            if (buildMessage != null && buildMessage.Value != null)
            {
                _sb.AppendLine(buildMessage.Value.Message);
            }

            var buildWarning = record as BuildInformationRecord<BuildWarning>;
            if (buildWarning != null && buildWarning.Value != null)
            {
                _sb.AppendLine(buildWarning.Value.Message);
            }

            var buildError = record as BuildInformationRecord<BuildError>;
            if (buildError != null && buildError.Value != null)
            {
                _sb.AppendLine(buildError.Value.Message);
            }
        }
    }
}

 
To use it, all you need is to instantiate it and pass the instance to the workflow invoker:
 
var workflowLogger = new BuildMessageTrackingParticipant();
invoker.Extensions.Add(workflowLogger);

 
After the test, you can get the build “log” by calling the .ToString() method on the workflowLogger instance.
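For example, a quick check on the recorded log could look like this (a sketch assuming MSTest; the asserted text is hypothetical and stands in for whatever your own activity actually logs):

```csharp
// after invoker.Invoke(parameters) has returned
string log = workflowLogger.ToString();

// "Starting Sonar analysis" is a made-up message for illustration;
// assert on the messages your own activity writes
Assert.IsTrue(log.Contains("Starting Sonar analysis"),
    "expected the activity to log its progress, but got: " + log);
```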
 

Setting up a custom IBuildDetail instance

 
During builds, activities regularly get the “build details”, an IBuildDetail instance that contains lots of useful contextual data. This instance comes from the workflow context, and activities fetch it with code like the following:
 
IBuildDetail build = this.ActivityContext.GetExtension<IBuildDetail>();

 
Thankfully, it is an interface, so it is very easy to stub. I like to use the Moq mocking framework because it is very easy to use (not the most powerful, but perfect for classic needs). So we need to create a stub out of the IBuildDetail interface, customize it for our needs, and inject it into the workflow “context”. I’ll actually assemble multiple stubs, because I also need to set up the name of the build definition for my activity (yes, the activity uses the current build definition name!):
 
// the build definition stub that holds its name
var buildDefinition = new Mock<IBuildDefinition>();
buildDefinition.Setup(d => d.Name).Returns("My Dummy Build");

// a build detail stub with the build number, build definition stub, and log location
var buildDetail = new Mock<IBuildDetail>();
buildDetail.Setup(b => b.BuildNumber).Returns("My Dummy Build_20130612.4");
buildDetail.Setup(b => b.BuildDefinition).Returns(buildDefinition.Object);
buildDetail.Setup(b => b.LogLocation).Returns(Path.Combine(TestContext.TestDeploymentDir, "build.log"));

// pass the stub to the invoker extensions
invoker.Extensions.Add(buildDetail.Object);

 
Now the activity “thinks” it is using real “build details” from the build, but during tests it is a fake object with just the values the test needs. This is a purely classic stubbing scenario, nothing more.
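Putting the pieces together, a complete test could look like the following sketch (assuming MSTest; it only reuses the stubs, parameters, and tracking participant shown in this post, and the final assertion is deliberately loose):

```csharp
[TestMethod]
public void Sonar_activity_runs_against_stubbed_build_details()
{
    // literal parameters go through the object initializer
    var activity = new Sonar { FailBuildOnError = false };

    var invoker = new WorkflowInvoker(activity);

    // record what the activity logs
    var workflowLogger = new BuildMessageTrackingParticipant();
    invoker.Extensions.Add(workflowLogger);

    // stubbed build definition and build details (Moq)
    var buildDefinition = new Mock<IBuildDefinition>();
    buildDefinition.Setup(d => d.Name).Returns("My Dummy Build");
    var buildDetail = new Mock<IBuildDetail>();
    buildDetail.Setup(b => b.BuildNumber).Returns("My Dummy Build_20130612.4");
    buildDetail.Setup(b => b.BuildDefinition).Returns(buildDefinition.Object);
    invoker.Extensions.Add(buildDetail.Object);

    // non-literal parameters go through the dictionary
    invoker.Invoke(new Dictionary<string, object>
    {
        { "ProjectsToAnalyze", new StringList("dummy.sln") }
    });

    // check that the activity logged something
    Assert.IsTrue(workflowLogger.ToString().Length > 0);
}
```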
 

Passing TFS complex objects

 
Unfortunately, not all of the objects and classes we need in build activities are interfaces or abstract classes, which would be easy to stub. For concrete classes such as Workspace, VersionControlServer, WorkItemStore, or WorkItemType, you have to use a more powerful stubbing framework such as Microsoft Fakes or Typemock.
Let’s use Fakes, since it is available in the Visual Studio Premium edition.
First, locate the assembly that the target type belongs to. The Workspace class lives in the Microsoft.TeamFoundation.VersionControl.Client assembly. Right-click it in the References of your test project and add a Fakes assembly:
 
image
 
Fakes processes all types and members of this assembly and generates a new reference assembly which contains “empty” objects, with all properties and members overridable, compatible with the original types. Types are prefixed with Shim or Stub, and method names include the types from their signatures. Here is an example that shows how to set up a Workspace “shim” so that, when its GetLocalItemForServerItem method is called, it returns a value of our choosing, LocalSolutionPath:
 
// our workspace shim
ShimWorkspace workspace = new ShimWorkspace()
{
    // we override String GetLocalItemForServerItem(String)
    // and have it return a value of our own for the test
    GetLocalItemForServerItemString = (s) => LocalSolutionPath
};

 
To pass an actual Workspace-compatible object to our activity as a parameter, use the shim’s .Instance property. Since it is not a workflow literal, we use the Dictionary like we did before:
 
// object variables
var parameters = new Dictionary<string, object>
{
    { "BuildWorkspace", workspace.Instance },
    { "ProjectsToAnalyze", new StringList("dummy.sln") }
};

 
 

OK, we have covered a few techniques that should allow you to test most activities. When I’m satisfied with the tests I’m currently writing, I’ll publish them in the Community TFS Build Extensions project. So keep an eye on them if you are interested in a full running piece of code; sorry to make you wait!

How to integrate Sonar with TFS (part 1)

Hi! Today, I’ll briefly introduce Sonar (recently renamed SonarQube) and share a few tips on how to deploy it on Windows, with the goal of integrating it with TFS just after.

Sonar in a nutshell

Sonar is mainly a Web portal that stores everything about your builds and helps you navigate all this data. Quality metrics are gathered by plugins that wrap various tools (which do not necessarily ship with Sonar) and stored in a central database. The Web portal is composed of customizable dashboards, made of customizable widgets, which can display the data in various forms, with the ability to easily compare against previous builds or see the progression over the last days or months. A drill-down logic starting from any metric (such as lines of code, violations, unit tests and coverage, etc.) allows you to pinpoint the projects, files, and lines of code at the origin of its value. Various plugins (some of them commercial) are available: they can group projects and aggregate their data, or show stats per developer, for example. You can define quality profiles and select the rules you want to apply to your projects (each rule is tied to a plugin), and create alerts that fire when certain conditions are met (too many violations, or coverage too low, for the simplest).

image

Shot taken from http://nemo.sonarqube.org

Why Sonar and TFS?

Because Sonar is a great complement to TFS. It is not always easy to get the exact report we want: you’ll find Reporting Services and Excel reports which have to be set up with date ranges and solution filters, so you may have spent quite some time configuring a SharePoint dashboard. You can’t easily set thresholds that fail your builds according to various metric conditions. I mean, all of this is possible because TFS is highly customizable, but it is not centralized in a single fully featured UI, and it requires various products or technologies. Builds do not compare to each other (only the duration, and that GUI is fixed). While Excel shines at connecting to the TFS warehouse or cube, you need to be an Excel dude to navigate, slice, aggregate, and compare data about build results. Third-party tools don’t store their data in the build reports in a structured way, so you won’t get their metrics directly in the cube. While all this is possible with TFS, it is not there as easily as we would want, and that is why Sonar is becoming so popular in the .NET world (and not especially with TFS).

Keep in mind that TFS is about so much more than Sonar. TFS links work items to code, giving you insight into the real semantics of your projects (the influence of bugs and requests, for example). Sonar focuses *only* on the quality of your code, instantly and over time.

We all know that Sonar is a Java application, so it is evil by essence (just kidding Winking smile). But it proves useful even in the .NET world: thanks to the hard work of a few pioneers who wrote Java plugins to launch our favorite everyday tools (FxCop, StyleCop, Gendarme) and test frameworks (with Gallio and various coverage technologies), there it is, waiting for us.

The plan to integrate Sonar

Integrating Sonar means that our TFS Builds will launch a Sonar analysis on our projects.

image

For simplicity’s sake, I have not represented TFS components such as build controllers, agents, etc. What is important here is that the TFS build calls something named the “Sonar runner”. The Sonar runner launches a JVM with a bootstrap that runs each plugin you have configured on your Sonar server. Each Sonar plugin then launches the appropriate native tools, gathers their results, and publishes them to the Sonar server, which stores the data in the Sonar database.
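For reference, the Sonar runner is driven by a small sonar-project.properties file sitting next to the solution (or by equivalent -D command-line properties). A minimal sketch for a .NET solution could look like the following; all values here are hypothetical, and the exact .NET-specific keys depend on your C# Ecosystem version:

```
# sonar-project.properties (hypothetical values)
sonar.projectKey=mycompany:myproject
sonar.projectName=My Project
sonar.projectVersion=1.0
sonar.language=cs
sonar.sources=.
# points the .NET plugins at the solution to analyze
sonar.dotnet.visualstudio.solution.file=MySolution.sln
```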

Installing Sonar

I’m not actually going to guide you through the whole installation. There is already pretty good documentation for this, and I’m not the first to talk about Sonar under Windows; see this install guide as well. What is sure is that you’ll need to install the SonarQube server, the Sonar runner, and the plugin suite named C# Ecosystem.

Nevertheless, I will give you a few tips and sample configuration blocks that should help. Naturally, I installed Sonar against a SQL Server 2008 R2 database Smile, so create an empty database and configure the server’s sonar.properties this way:

sonar.jdbc.username:     <sql server user>
sonar.jdbc.password:     <password>

sonar.jdbc.url: jdbc:jtds:sqlserver://myserver;SelectMethod=Cursor;instance=SONARINSTANCE;databaseName=Sonar

# Optional properties
sonar.jdbc.driverClassName: net.sourceforge.jtds.jdbc.Driver

You’ll need the jTDS JDBC driver to use SQL Server; it is included in the Sonar server distribution (cool!), in the extensions\jdbc-driver\mssql folder. I’m not used to creating SQL Server security accounts: since I always use integrated security, I find that managing passwords is a prehistoric and insecure practice, but I guess I have no choice here.

The LDAP plugin works well; it can also retrieve the groups your users belong to from Active Directory.

Here is the configuration I used with my AD (I spent a few hours making it work, so I hope it helps):

# LDAP configuration
sonar.security.realm: LDAP
sonar.security.savePassword: false
sonar.authenticator.createUsers: true

ldap.url: ldap://mydc1.mydomain.com:389
ldap.user.baseDn: ou=USERS,dc=mydomain,dc=com
ldap.user.request: (&(objectClass=user)(sAMAccountName={login})) 
ldap.user.realNameAttribute: cn
ldap.user.emailAttribute: mail
 
ldap.bindDn: CN=sonarsvcaccount,OU=SERVICES ACCOUNTS,DC=mydomain,DC=com  
ldap.bindPassword: sonarsvcpassword
ldap.group.baseDn: OU=GROUPS,dc=mydomain,dc=com
ldap.group.request: (&(objectClass=group)(member={dn}))

It is horrible and terrible, I know: I could not avoid putting the Sonar service account password in the configuration file, so protect this file!

Finally, set up Sonar as a Windows service, of course (running under the aforementioned Sonar service account).

That’s all for today folks! Next post I’ll talk about all the build and analysis stuff!