How to integrate Sonar with TFS (part 2)

In a recent post of mine, I quickly introduced Sonar and how we can integrate it with TFS, and gave a few tips for its installation. Now it is time to dive into the build process and add the logic we need to make TFS builds use Sonar conveniently.

Are you really sure?

Using Sonar as I’m proposing offloads some processing from TFS to Sonar, which means TFS loses control over certain things: unit tests are no longer launched by TFS. You *have* to use the Gallio test runner because other test report formats are not *yet* supported. The best bet is then to let Sonar launch the tests through Gallio, which may have side effects!

Code analysis can no longer be configured from your project settings, nor from the build settings (you don’t want to launch it twice), so deactivate it in your builds (pick the Never option) and let Sonar launch FxCop instead.

The tooling is not always up to date: MSTest with VS 2012 was not working when it came out, which means you can be stuck if you upgrade too early. We are in the open source world; we have no guarantee it will work seamlessly with other technologies, unless you contract for commercial support with SonarSource.

You’re still reading? OK, you’re a pioneer now; you’ll need a good Swiss army knife and some will to make things work in your environment. Hopefully you won’t regret it, because it will pay off.

Build server tooling

Some tools and scripts need to be deployed onto every build server. We aim at xcopy deployment, so I advise opting for a similar folder structure that you copy from server to server. I pompously named the parent folder “Sonar.Net”, and here are its contents:


I cheated a bit with some products: I took their installation from the “Program Files” folder on my local machine and copied it into this structure. It just worked for me.

I prefer having a private version of Java running the Sonar analysis. For this, you can just customize the sonar-runner.cmd file and update PATH and JAVA_HOME to point to the subfolder where you uncompressed your JDK (not the JRE). Be sure to use absolute paths here!
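As an illustration, the top of a customized sonar-runner.cmd could look like the following sketch; the JDK subfolder name and the D:\Sonar.Net root are assumptions matching the folder structure described above:

```bat
@echo off
rem use the private JDK shipped in our xcopy-deployed folder (paths are examples)
set JAVA_HOME=D:\Sonar.Net\jdk1.7.0
set PATH=%JAVA_HOME%\bin;%PATH%
rem ... the rest of the original sonar-runner.cmd follows unchanged
```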

The sonar properties

All the project parameters for the Sonar analysis live in a properties file. You should place this file next to the solution file you want to analyze. I see two options for managing those files:

  1. Create a file for each .NET solution and add it to source control
    • Sounds reasonable if you don’t have too many projects
  2. Generate them automatically!
    • But this requires some build customization

The good news is that I’ve written a build workflow Sonar activity that will generate the properties for you, or at least help you generate them. It is very simple: it takes a template file and replaces a few values for you. The path to the template file must be configured in the build process workflow.

Here is the template I’ve set up for use with TFS, a sample sonar-properties.template file:

# Project identification

# Info required for Sonar

# If multiple solutions in folder, use the following line to disambiguate


sonar.fxcop.installDirectory=D:/Sonar.Net/Microsoft Fxcop 10.0

sonar.gallio.filter=exclude Type:/\.Integration\./


As you can see, there are values that will be replaced at run time. You can see their description on the activity documentation page. The trick with TFS builds is that the output folder for projects is forced to a “Binaries” folder outside the scope of your sources! This template assumes it is running from a TFS build.
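To give an idea of the values elided above, a fuller template could look like the sketch below. The sonar.projectKey/Name/Version properties are standard Sonar analysis parameters and sonar.dotnet.visualstudio.solution.file comes from the C# plugins, but treat the exact values as placeholders to adapt to your setup:

```properties
# Project identification
sonar.projectKey=myCompany:MyProject
sonar.projectName=MyProject
sonar.projectVersion=1.0

# Info required for Sonar
sonar.language=cs
sonar.sourceEncoding=UTF-8

# If multiple solutions in folder, use the following line to disambiguate
# sonar.dotnet.visualstudio.solution.file=MySolution.sln

sonar.fxcop.installDirectory=D:/Sonar.Net/Microsoft Fxcop 10.0
sonar.gallio.filter=exclude Type:/\.Integration\./
```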

It should work locally

To test all these tools, you fortunately don’t have to run builds. First, create such a properties file based on your values for a project you want to test. Make sure you comment out the sonar.dotnet.assemblies and sonar.dotnet.test.assemblies properties, since the Sonar C# Ecosystem guesses them correctly when TFS is not overriding the output paths. Then, after having compiled the project, open a command prompt and move to your project folder. From there, invoke sonar-runner. It should work.
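A local test session could then look like this transcript; the source folder and the sonar-runner location are assumptions based on the folder structure described earlier:

```
cd /d C:\Sources\MyProject
D:\Sonar.Net\sonar-runner\bin\sonar-runner.cmd
```

If everything is wired correctly, the runner ends with a success message and the analysis shows up on your Sonar Web portal.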

Once this works for you, you are close to making it run in TFS builds: all we need now is to launch this command line from our builds, and optionally generate the properties dynamically.

Modifying your build template

For this, you need to get the latest Community TFS Build Extensions and deploy them into your build controller’s custom assemblies source control folder. See my guide here if you’re not at ease with setting up the solution for editing build templates. You may clone the DefaultTemplate.xaml and start editing it. Once you’re ready to inject the Sonar activity in your build template, locate the Run On Agent => Try Compile, Test, and Associate Changesets and Work Items activity; it contains the Sequence illustrated below. You should be able to drag the “Sonar” activity as indicated.


A nice and very simple idea is to add a boolean workflow Argument named “RunSonarAnalysis”.


Then encapsulate the Sonar activity into an If activity.


If you add the proper Metadata (locate the Metadata Argument, edit it, and add the RunSonarAnalysis argument in the list) for this parameter, you’ll be able to control the Sonar execution from your build definition! That is the start of a real integration.

Finally, edit the Sonar activity properties, and you’re all set!



Now you can decide to run Sonar from your build definitions!

You may add custom parameters (with Metadata) to your workflow and pass them directly to the Sonar activity. This would allow you to pass values such as “active” or “skip” to enable or disable the plugins of your choice, on a per-project basis.

It is not as complicated as it sounds to run Sonar from TFS builds. There are things that can be done better, so stay tuned for future improvements with this activity!

Testing your TFS 2012 build workflow activities

I recently posted about setting up a solution for editing build workflows (with TFS 2012), so today I’m going to write directly about testing your custom activities, because I think good guides about writing them are already out there, and there (even though they talk about TFS 2010, the logic is quite the same). We’ll have a look at how we can test activities without actually starting a real build.

I’m currently writing an activity for launching Sonar during TFS builds (you can learn about Sonar with TFS here). Although I haven’t finished writing the whole thing, I can still share a few tips I learned for testing activities.

Classic testing approach

The approach is very classic: set up a test context, exercise the test, check the results, and clean up.

The tests rely on the ability, given by Windows Workflow Foundation, to host one’s own workflow engine. So we’ll mainly need two things:

  • Create the right workflow context to emulate the behavior of a build (only what’s needed for the test, of course)
  • Create an instance of the activity, passing necessary parameters and stubs to fulfill the test

During our test, the workflow engine will run the activity. The activity will use the objects and values from its parameters and interact with the context. Then we can add checks about what has actually happened. Ready?

Workflow activities testing tips

Creating the activity and executing it

You can instantiate a workflow activity with a classic new statement. Literal parameters can be passed in the constructor or assigned through properties (literals here are value types and strings). All other objects must be passed in a Dictionary<String, Object> at invocation time.

    // constants (literals)
    var activity = new Sonar
    {
        // this is a String (a literal)
        SonarRunnerPath = SonarRunnerPath,
        // this is a boolean (a literal as well)
        FailBuildOnError = FailBuildOnError,
        GeneratePropertiesIfMissing = GeneratePropertiesIfMissing,
        SonarPropertiesTemplatePath = TemplatePropertiesPath,
        FailBuildOnAlert = FailBuildOnAlert,
        // StringList is not a workflow literal
        // the following line will cause an exception at run time
        ProjectsToAnalyze = new StringList("dummy.sln")
    };



Here all values are booleans or strings, except one StringList that will cause an error at run time (so we must remove it). Here’s how to invoke the activity (actually a workflow composed of one activity) and pass the StringList as an argument:

    // object variables
    var parameters = new Dictionary<string, object>
    {
        { "ProjectsToAnalyze", new StringList("dummy.sln") }
    };

    // the workflow invoker, our workflow is composed of only one activity!
    WorkflowInvoker invoker = new WorkflowInvoker(activity);
    // executes the activity
    invoker.Invoke(parameters);


Tracking build messages

You may want to check what your activity is logging, you know, when you call the TrackBuildMessage method or use the WriteBuildMessage (or Warning, or Error) activity. To do this you need to set up a recorder, or more exactly a TrackingParticipant. Here is a TrackingParticipant-derived class specialized in recording build messages:

A build message tracking class
    namespace MyBuilds.BuildProcess.Tests
    {
        using System;
        using System.Collections.Generic;
        using System.Linq;
        using System.Text;
        using System.Activities.Tracking;
        using Microsoft.TeamFoundation.Build.Workflow.Tracking;
        using Microsoft.TeamFoundation.Build.Workflow.Activities;

        /// <summary>
        /// BuildMessageTrackingParticipant that logs build messages during build workflow activities
        /// </summary>
        public class BuildMessageTrackingParticipant : TrackingParticipant
        {
            private StringBuilder _sb = new StringBuilder(4096);

            public override string ToString()
            {
                return _sb.ToString();
            }

            protected override void Track(TrackingRecord record, TimeSpan timeout)
            {
                var buildMessage = record as BuildInformationRecord<BuildMessage>;
                if (buildMessage != null && buildMessage.Value != null)
                {
                    _sb.AppendLine(buildMessage.Value.Message);
                }

                var buildWarning = record as BuildInformationRecord<BuildWarning>;
                if (buildWarning != null && buildWarning.Value != null)
                {
                    _sb.AppendLine(buildWarning.Value.Message);
                }

                var buildError = record as BuildInformationRecord<BuildError>;
                if (buildError != null && buildError.Value != null)
                {
                    _sb.AppendLine(buildError.Value.Message);
                }
            }
        }
    }

To use it, all you need is to instantiate it and pass the instance to the workflow invoker:
    var workflowLogger = new BuildMessageTrackingParticipant();
    invoker.Extensions.Add(workflowLogger);

After the test, you can get the build “log” by calling .ToString() on the workflowLogger instance.

Setting up a custom IBuildDetail instance

During builds, activities regularly get the “build details”, an IBuildDetail instance that contains lots of useful contextual data. This instance comes from the workflow context, and activities get it with code like the following:
    IBuildDetail build = this.ActivityContext.GetExtension<IBuildDetail>();

Thankfully, it is an interface, so it is very easy to stub. I like to use the Moq mocking framework because it is very easy (not the most powerful, but perfect for classic needs). Now we need to create a stub out of the IBuildDetail interface, customize it for our needs, and inject it into the workflow “context”. I’ll actually assemble multiple stubs together, because I also need to set up the name of the build definition for my activity (yes, the activity uses the current build definition name!):
    // the build definition stub that holds its name
    var buildDefinition = new Mock<IBuildDefinition>();
    buildDefinition.Setup(d => d.Name).Returns("My Dummy Build");

    // a build detail stub with the build number, build definition stub and log location
    var buildDetail = new Mock<IBuildDetail>();
    buildDetail.Setup(b => b.BuildNumber).Returns("My Dummy Build_20130612.4");
    buildDetail.Setup(b => b.BuildDefinition).Returns(buildDefinition.Object);
    buildDetail.Setup(b => b.LogLocation).Returns(Path.Combine(TestContext.TestDeploymentDir, "build.log"));

    // pass the stub to the invoker extensions
    invoker.Extensions.Add(buildDetail.Object);

Now the activity “thinks” it is using real “build details” from the build, but during tests we are using a fake object with just the values necessary for the test to pass. This is actually a pure classic stubbing scenario, nothing more.

Passing TFS complex objects

Unfortunately, not all of the objects and classes we need in build activities are interfaces or pure virtual classes, which are easy to stub. In the case of objects such as a Workspace, a VersionControlServer, a WorkItemStore, or a WorkItemType, you have to use more powerful stubbing frameworks such as Microsoft Fakes or Typemock.
Let’s use Fakes, since it is available in the Visual Studio Premium edition.
First, locate the assembly that our target type belongs to. The Workspace class belongs to the Microsoft.TeamFoundation.VersionControl.Client assembly. Right-click it in the References of your project and add a Fakes assembly:
Fakes processes all the types and members of this assembly and dynamically generates a new reference which contains “empty” objects, with all properties and members overridable, compatible with the original types. All types are prefixed with Shim or Stub, and method names include the types of their signatures. Here is an example that illustrates how to set up a Workspace “Shim”. When we call the GetLocalItemForServerItem method, it will return the value we want, that is LocalSolutionPath:
    // our workspace stub
    ShimWorkspace workspace = new ShimWorkspace()
    {
        // we override String GetLocalItemForServerItem()
        // and have it return a value of our own for the test
        GetLocalItemForServerItemString = (s) => LocalSolutionPath
    };

To pass the actual Workspace-compatible object to our activity as a parameter, use its .Instance property. Since it is not a workflow literal, let’s use the Dictionary like we did before:
    // object variables
    var parameters = new Dictionary<string, object>
    {
        { "BuildWorkspace", workspace.Instance },
        { "ProjectsToAnalyze", new StringList("dummy.sln") }
    };


OK, we have covered a few techniques that should allow you to test most activities. When I’m satisfied with the tests I’m currently writing, I’ll publish them in the Community TFS Build Extensions project. So keep an eye on them if you’re interested in a full running piece of code; sorry to make you wait!

How to integrate Sonar with TFS (part 1)

Hi! Today, I’ll briefly introduce Sonar (recently renamed SonarQube) and share a few tips on how to deploy it on Windows, with the goal of integrating it with TFS just after.

Sonar in a nutshell

Sonar is mainly a Web portal that stores everything about your builds and helps you navigate through all this data. Quality metrics are gathered by plugins around various tools (which may not ship with Sonar) into a central database.

The Web portal is composed of customizable dashboards, made out of customizable widgets, which can display data in various forms, with the ability to easily compare with previous builds or see the progression over the last days or months. A drill-down logic starting from any metric (such as lines of code, violations, unit tests and coverage, etc.) allows you to pinpoint the projects, files, and lines of code at the origin of their values.

Various plugins are available (some of them commercial): they can group projects and aggregate their data, or show stats per developer, for example. You can define quality profiles and select the rules you want to apply to your projects (each rule is tied to a plugin), and create alerts when certain conditions are met (too many violations, or coverage too low, to name the simplest).


Shot taken from

Why Sonar and TFS?

Because Sonar is a great complement to TFS. It is not always easy to get the exact report we want out of TFS: you’ll find Reporting Services and Excel reports which have to be set up with date ranges and solution filters, so you may have spent quite some time configuring a SharePoint dashboard. You can’t easily set thresholds that fail your builds according to various metric conditions. I mean, all of this is possible because TFS is highly customizable, but it is not centralized in a single fully featured UI, and it requires various products or technologies. Builds do not compare to each other in the reports (only build duration is charted, and the GUI is fixed). While Excel shines at connecting to the TFS warehouse or cube, you need to be an Excel dude to navigate, slice, aggregate and compare data about build results. Third-party tools don’t store their data in the build reports in a structured way, so you won’t get their metrics directly in the cube. While all this is possible with TFS, it is not there as easily as we would want, and that is why Sonar is becoming so popular in the .NET world (and not especially with TFS).

Keep in mind that TFS is about much more than what Sonar does. TFS links Work Items to code, giving you insight into the real semantics of your projects (the influence of bugs and requests, for example). Sonar focuses *only* on the quality of your code, instantly and over time.

We all know that Sonar is a Java application, so it is evil by essence (just kidding!), but it proves to be useful even in the .NET world. Thanks to the hard work of a few pioneers who wrote the Java plugins that launch our favorite everyday tools (FxCop, StyleCop, Gendarme) and test frameworks (with Gallio and various coverage technologies), there it is, waiting for us.

The plan to integrate Sonar

Integrating Sonar means that our TFS Builds will launch a Sonar analysis on our projects.


For simplicity’s sake, I haven’t represented TFS components such as build controllers, agents, etc. What is important here is that the TFS build calls something named the “Sonar runner”. The Sonar runner launches a JVM with a bootstrap that runs each plugin you have configured on your Sonar server. Each Sonar plugin then launches the appropriate native tools, gets their results and publishes them to the Sonar server. The data is stored in the Sonar database.

Installing Sonar

I’m not actually going to guide you through the whole installation. There is already pretty good documentation for this, and I’m not the first to talk about Sonar under Windows; see this install guide as well. What is sure is that you’ll need to install the SonarQube server, the Sonar runner, and then the plugin suite named C# Ecosystem.

Nevertheless, I will give you a few tips and sample configuration blocks that will help you. Naturally, I installed Sonar against a SQL Server 2008 R2 database, so create an empty database and configure the server this way:

sonar.jdbc.username:     <sql server user>
sonar.jdbc.password:     <password>

sonar.jdbc.url: jdbc:jtds:sqlserver://myserver;SelectMethod=Cursor;instance=SONARINSTANCE;databaseName=Sonar

# Optional properties
sonar.jdbc.driverClassName: net.sourceforge.jtds.jdbc.Driver

You’ll need a jTDS JDBC driver to use SQL Server; it is included in the Sonar server distribution (cool!), in the extensions\jdbc-driver\mssql folder. I’m not used to creating SQL Server security accounts: since I always use integrated security, I find managing passwords a prehistoric and insecure practice, but I guess I have no choice here.

The LDAP plugin works well; you can also retrieve the groups your users belong to in Active Directory.

Here is the configuration that I used with my AD (I spent a few hours making it work, so I hope it will help):

# LDAP configuration
sonar.security.realm: LDAP
sonar.security.savePassword: false
sonar.authenticator.createUsers: true

ldap.url: ldap://
ldap.user.baseDn: ou=USERS,dc=mydomain,dc=com
ldap.user.request: (&(objectClass=user)(sAMAccountName={login}))
ldap.user.realNameAttribute: cn
ldap.user.emailAttribute: mail
ldap.bindDn: CN=sonarsvcaccount,OU=SERVICES ACCOUNTS,DC=mydomain,DC=com
ldap.bindPassword: sonarsvcpassword
ldap.group.baseDn: OU=GROUPS,dc=mydomain,dc=com
ldap.group.request: (&(objectClass=group)(member={dn}))

It is horrible and terrible, I know: I could not avoid putting the Sonar service account password in the configuration file, so protect this file!

Finally, set up Sonar as a Windows service of course (running with the aforementioned Sonar service account).
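The server distribution ships with wrapper scripts for this. Assuming you unzipped the server under the D:\Sonar.Net folder used above (the exact subfolder name depends on the version you downloaded), installing and starting the service might look like:

```bat
cd /d D:\Sonar.Net\sonarqube\bin\windows-x86-64
InstallNTService.bat
StartNTService.bat
```

Then use services.msc to change the service logon to the Sonar service account.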

That’s all for today folks! Next post I’ll talk about all the build and analysis stuff!

Build your old VB6 projects with TFS 2012

or How to edit a TFS 2012 build process template

A few weeks ago, I blogged about using VB6 with TFS 2012. The build process I proposed relied on an MSBuild script that you had to check in to TFS. This script was responsible for calling VB6 on the command line and generating your VB executables. I just felt like something was left unexplored here, and I wanted to provide a slightly more sophisticated alternative for building your VB6 apps. So let’s customize the build workflow for the sake of our VB6 projects!

This post is also a tutorial for editing a TFS 2012 build process template!

The plan

We will modify the Default Template in order to make it compatible with VB6 projects. In the Solutions to Build process parameter, we want to be able to pass a list of .NET solutions (.sln files), but also .vbp files. We will rely on some external activities to invoke the VB6 compilation commands.

We’ll try to support the Outputs clean option of regular builds, because it is useful for continuous integration.

Set up your environment

We’ll be using the latest Community TFS Build Extensions; download them and place the appropriate binaries in the custom assemblies folder that you have configured on your controller. If you don’t know what I’m talking about, check out my “A small guide to edit your TFS 2012 build templates” and I hope you’ll be sorted.

Duplicate the Default Template and rename it to something like DefaultTemplateVb6. If you want to be able to edit both files at once, follow this procedure I wrote, which will avoid unwanted compile errors.

Build process changes

Open your build process template and go to Run On Agent => Try Compile, Test, and Associate Changesets and Work Items. Now look for the Try to Compile the Project TryCatch activity; double-click on its icon to “focus” the workflow editor on this subcontent – it’s just more comfortable.

We are here at the heart of the loop that calls MSBuild on every project to build. We will need to store the result of the last VB6 compilation. For this, we need a variable in the scope of the Compile the Project block. Add a vbReturnCode variable as follows:


We want to run MSBuild on regular projects, and launch VB6 on .vbp projects. So we’ll add an If activity in order to filter VB6 projects. Insert it just before the Run MSBuild for Project activity.

Add a Sequence in the Then block, and drag the existing MSBuild activity into the Else block. Actually, recreate the following piece of workflow:


Remember to add the custom activities in the toolbox in order to use them in the workflow (right-click and Choose Items…).


  • If “If Project Is VB6”
    • Condition: localProject.EndsWith(“.vbp”, StringComparison.InvariantCultureIgnoreCase) Or localProject.EndsWith(“.vbg”, StringComparison.InvariantCultureIgnoreCase)
    • Then
      • Sequence “Process VB6 Project”
        • VB6
          • see picture below for parameters
        • If “If VB Return Code” (see picture below)
          • Condition: vbReturnCode = 0
          • Then
            • SetBuildProperties “Set Compilation Status to Succeeded”
              • PropertiesToSet: CompilationStatus
              • CompilationStatus: BuildPhaseStatus.Succeeded
          • Else
            • SetBuildProperties “Set Compilation Status to Failed”
              • PropertiesToSet: CompilationStatus
              • CompilationStatus: BuildPhaseStatus.Failed
    • Else
      • move the existing MSBuild activity here


Now here is the content of the If block:


And the VB6 activity parameters:



Check in the whole thing and check that it compiles on a single solution.

Cleaning projects

In order to support the partial cleaning of builds (Clean Workspace set to Outputs), we have to filter the VB6 projects out of the process. VB6 executables are produced directly in the BinariesFolder, which is emptied automatically. The following actions will prevent nasty errors when cleaning outputs (because our .vbp files aren’t MSBuild files).

First, navigate to Run On Agent => Initialize Workspace, look for the “Clean Project” Sequence, then encapsulate the If File.Exists(Project) activity with the same filter as we used previously:



The full condition text is:

Not localBuildProjectItem.EndsWith(".vbp", StringComparison.InvariantCultureIgnoreCase) And Not localBuildProjectItem.EndsWith(".vbg", StringComparison.InvariantCultureIgnoreCase)


Finally, set the process parameter Solution Specific Build Outputs to True to avoid a big mess in the Drop folder. And voilà, you can now mix and build regular solutions and VB6 projects with the same build definition!


Above is what I obtain in the Drop folder, enjoy your builds!

Conflicts when editing multiple build process templates with TFS 2012

If you followed my small guide, or anyway if you have multiple build process templates in a single solution, you may have encountered this error when compiling with Visual Studio:


“obj\Debug\TfsBuild_Process_BeforeInitializeComponentHelper.txt” was specified more than once in the “Resources” parameter. Duplicate items are not supported by the “Resources” parameter.


This is actually easy to understand. When you add a XAML workflow file to your solution, Visual Studio sets its properties so it is part of the compilation:


Visual Studio generates code and resources out of the XAML file, and each XAML process template has its own class name and namespace (full name). The problem is that, generally, TFS build process template files have the same full class name, so there are collisions. Generally, this class name is TfsBuild.Process (hence the name of the TfsBuild_Process_BeforeInitializeComponentHelper.txt resource), because we often duplicate existing build templates.


I’ve seen colleagues change the Build Action setting to None:


But then you lose the checks performed by the compile process. There is something better to do: simply rename the class or namespace inside the XAML file:

  • Open the file with the XML editor (right-click => Open With… => XML (Text) Editor)
  • Look for the x:Class attribute on the first line


  • Rename the namespace to something meaningful to you (e.g. TfsBuild.Process => TfsBuildUpgrade.Process)
  • On the same line there is another mention of the namespace; rename it there too:
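For instance, the first line of a duplicated template could change as sketched below (attribute lists are shortened to “…”, and TfsBuildUpgrade is just the example name used above):

```xml
<!-- before: the class name collides with the other templates -->
<Activity x:Class="TfsBuild.Process" ... >

<!-- after: a unique namespace for this template -->
<Activity x:Class="TfsBuildUpgrade.Process" ... >
```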


Hopefully, it should now compile fine:

1>------ Rebuild All started: Project: BuildProcessEditing, Configuration: Debug Any CPU ------

1> BuildProcessEditing -> C:\Sources\MyBuildProcess\BuildProcessEditing\bin\Debug\BuildProcessEditing.dll

========== Rebuild All: 1 succeeded, 0 failed, 0 skipped ==========

And the build should still work. This way you can put all your build files in the same solution, which is more comfortable! I consider it good practice to fix every new build process template I create. Hope this helps!

Small guide to using VB6 with TFS 2012

I’ve been recently working for a client with lots of VB6 projects. The fun part is that we wanted to migrate from VSS to TFS 2012. Although VB6 is no longer supported by Microsoft, there is no reason why TFS would not work for VB6; you can host Java in TFS, right? If you have VB6 projects and want to plug them into TFS and have them built from a continuous integration perspective, then I hope this small guide will help you.

What to install

First, check that you have VB6 with SP6, and the mouse wheel fix as well. I won’t spend more time here since you’re already using it.

You’ll need to install the Visual Studio 2012 edition of your choice, with the latest Update (at this time it is Update 2). Then the famous TFS Power Tools, which add nice check-in policies (and more). Finally, you’ll need the MSSCCI Provider for Visual Studio 2012, 32-bit version or 64-bit version.

After installation, verify that inside VS 2012 => TOOLS => Options => Source Control, the Plug-in Selection value is set as follows:


If you still have Visual SourceSafe around

Warning: the MSSCCI provider has rerouted VB6 source control interactions to Team Foundation Server. To connect back to VSS, you need to perform some registry operations. Fortunately, small utilities will do that for you, by listing all the MSSCCI providers available on your machine and letting you choose which one is active. So you’ll be able to switch back and forth easily between VSS and TFS. This one worked for me => SCPSelector.exe.


SCPSelector in action


Unwanted prettify options for VB6

Visual Studio 2012 *doesn’t know* about VB6, it knows about VB.NET!

When you are merging files, you don’t want VS 2012 to make assumptions about your syntax, and even less to *modify your VB6 code*. Make sure you uncheck these options in the TOOLS => Options => Text Editor => Basic => VB Specific menu:


How to map the sources

Local workspaces are great, but the MSSCCI provider is not happy with them. You’ll have to use traditional server workspaces. Well, it’s not a big deal.

OK, so let’s create server workspaces and map our VB6 sources from TFS. Now, I want to develop with VB6, but when I open the project with VB6, I get asked to add my project to TFS, doh! Actually, I want my users and me to open any VB6 project as smoothly as possible. To achieve that, you have to edit the MSSCCPRJ.SCC files (or create them); they contain the necessary MSSCCI data to connect to the proper source control server. The bad news is that you can’t share those files! They are specific to your login and your workspace, so adding them to source control is useless!


VbTfsBinding will do the work for you

I wrote a small utility that will generate all those files for you. Copy it to the root of your workspace and it will generate an MSSCCPRJ.SCC file for every .vbp file in your workspace. Now you can open any VB6 project in your workspace without being annoyed by any configuration message box.

Here you are with the source code, and the executable (use it at your own risk, of course).

So here are a few tips for using VbTfsBinding:

  • Configure the config file with your TFS project collection URL
  • To share the tool, you can include it in TFS source control, in a subfolder at the root of your workspace (or branch), and add a .cmd file that changes the current directory and launches “VbTfsBinding.exe /force”
  • The /force flag will overwrite read-only files, in case someone checks in an MSSCCPRJ.SCC file
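As a sketch, such a .cmd launcher could look like this, assuming the tool is checked in under a subfolder named Tools at the root of the workspace (the folder name is an assumption):

```bat
@echo off
rem move to the workspace root (one level above this script's folder)
cd /d %~dp0..
rem regenerate the MSSCCPRJ.SCC files, overwriting read-only copies
Tools\VbTfsBinding.exe /force
```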

OK, now the basic source control features of TFS are usable directly from VB6, but I would advise always checking in from VS 2012. This allows you to make sure you don’t forget files in your changesets, you can review all your pending changes at a glance, and I feel more secure that way.

Building your VB6 executables

Now the fun part. Our goal is to call VB6 on as many VB6 projects as we want to build. The command line is:

Vb6.exe /m Projects.vbg /out Projects.vblog

Where Projects.vbg is a project group file which contains the list of projects we want to build.
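For reference, a .vbg group file is just a small text file; a minimal example (the project names are made up) looks like this:

```text
VBGROUP 5.0
StartupProject=MyApp.vbp
Project=MyLibrary.vbp
```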

Let’s follow the path of the Lazy: use a simple MsBuild .proj file to encapsulate the VB6 compilation logic, and rely on the default DefaultTemplate.xaml of TFS to do the rest.

First, prepare your VB6 group file and check it in next to your projects in the source controller (paths are relative). You can verify the compilation with VB6 on your machine.

Then, add the following MsBuild file next to the .vbg file, let’s call it Projects.proj:

<Project ToolsVersion="4.0" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">

  <PropertyGroup>
    <VBPath>C:\Program Files\Microsoft Visual Studio\VB98\VB6.exe</VBPath>
    <VBPath Condition="Exists('C:\Program Files (x86)\Microsoft Visual Studio\VB98\VB6.exe')">C:\Program Files (x86)\Microsoft Visual Studio\VB98\VB6.exe</VBPath>
  </PropertyGroup>

  <Target Name="Build">
    <Exec ContinueOnError="True" Command='"$(VBPath)" /m Projects.vbg /out Projects.vblog'>
      <Output TaskParameter="ExitCode" PropertyName="VBExitCode"/>
    </Exec>
    <Message Text="---- Start VB Output ------------------------------------------------------------" />
    <Exec ContinueOnError="True" Command="type Projects.vblog" />
    <Message Text="---- End of VB Output -----------------------------------------------------------" />
    <Error Condition="'$(VBExitCode)' != '0'" Text="Fatal error because of VB exit code." />
    <ItemGroup>
      <VBBinaries Include="**\*.exe" />
    </ItemGroup>
    <Copy Condition="'$(OutDir)' != ''" SourceFiles="@(VBBinaries)" DestinationFolder="$(OutDir)" />
  </Target>

  <Target Name="Clean">
    <ItemGroup>
      <VBBinaries Include="**\*.exe" />
    </ItemGroup>
    <Delete Files="Projects.vblog" />
    <Delete Files="@(VBBinaries)" />
  </Target>

</Project>

Now, create a new build definition based on the Default Template, make sure you have a Drop folder location, then set the process parameters as follows (adapting the .proj file path to yours):


The nice thing is that this build is actually pretty standard: it is a build workflow that calls MsBuild for compilation, so there is naturally an MsBuild log file in the build Summary:


You’ll find your VB6 applications in the Drop Folder:


It’s up to you to set this build as a continuous integration build, or to schedule it, you know, it’s just a TFS build 😉

Do you feel any better than with VSS? Enjoy!

A small guide to editing your TFS 2012 build templates

TFS 2010 and later versions use Workflows (WF) at the core of their mechanics. Workflows are intuitive to edit, and as powerful as any high-level scripting language. However, in the context of Team Foundation Build, they need a particular setup with Visual Studio if you want a smooth editing experience. This post will hopefully help you set up a cohesive environment for editing your build process templates. This is just a proposal, the way I prefer setting it up, so feel free to keep whatever you want here.

Team Project organization

I advise using a “test” Team Project and editing your templates there. Why?

  • Test your build in the test Team Project and, when it’s operational, copy the file(s) into your production Team Project. You don’t mess with a renamed or temporary template, nor produce unwanted changeset noise in the production Team Project. Very simple.
  • Because you may need to use another build controller, but we’ll discuss that later.

So we’ll need a Visual Studio solution and project in order to edit the Xaml template properly. Where shall we store that Solution?

Solution setup

Answer: not far from the build process templates: in a subfolder!


All you actually need is a regular class library project, targeting Framework 4.5 for TFS 2012 builds:


Notice there is no Xaml file at the project level; this is because they are added into the project “As Links”, directly from the parent folder:


You can distinguish linked files from others in your projects because of the tiny arrow on their file icon:


When I check out the Xaml template file, it is checked out from its original location, which is fine.

We want to use a project for multiple reasons:

  • Compiling will help us remove errors in the workflow file, and it checks the references
  • It is necessary if you have custom assemblies, and frankly, you *will* have some at some stage. If you don’t, go grab the latest Community TFS Build Extensions and you’ll get *very* helpful activities for your builds
  • We want “drag and drop” editing; if we have custom assemblies, this is the best way to do it

In order to compile nicely, you’ll need to add several references. This is where it gets less obvious, nearly tedious, but you’ll only need to do this once in your life, so that’s ok 😉

References setup


Some of them are not easy to locate, so here they are:

  • Microsoft.TeamFoundation.TestImpact.BuildIntegration : %ProgramFiles%\Microsoft Visual Studio 11.0\Common7\IDE\PrivateAssemblies\Microsoft.TeamFoundation.TestImpact.BuildIntegration.dll
  • Microsoft.TeamFoundation.TestImpact.Client : %windir%\assembly\GAC_MSIL\Microsoft.TeamFoundation.TestImpact.Client\\Microsoft.TeamFoundation.TestImpact.Client.dll (this is bad, we should *never* *ever* have to reference something in the GAC; I hope this will be solved in the next version)
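Once added, those references end up in the .csproj file. As a sketch (the exact path may vary with your VS edition and bitness), the first one looks something like:

```xml
<Reference Include="Microsoft.TeamFoundation.TestImpact.BuildIntegration">
  <SpecificVersion>False</SpecificVersion>
  <HintPath>C:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\IDE\PrivateAssemblies\Microsoft.TeamFoundation.TestImpact.BuildIntegration.dll</HintPath>
</Reference>
```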

Now you can edit the workflow and compile it.


The development cycle of builds (aka the everyday life of the build master) consists of:

  • Check out the workflow template file
  • Modify the workflow file
  • Check in the workflow file
  • Launch a build that you have set up with *this* workflow file

Adding custom assemblies

We are now set up for throwing in some custom assemblies. First, add the assemblies (and their associated .pdb files) to an “Assemblies” folder (create it if necessary) just under the BuildProcessTemplates folder. What is important here is that this folder is configured in the properties of your Team Project build controller (Builds -> Actions -> Manage Build Controllers…)


I’ve added my custom activities assemblies and their dependencies, and the Community TFS Build Extensions assemblies:


Simply add the assemblies that contain the activities you’ll need as regular references in your class library project. As a starting point you’ll want TfsBuildExtensions.Activities.dll. I also added my custom activities library (no need to reference the dependencies).

The need for a separate build controller

If you are serious about builds (if you have many production builds you can’t afford to interrupt with your development work), you’ll want a separate build controller. It is easy to deploy.

When you check in some assemblies in the custom assemblies folder, the build controller restarts and cancels the builds in progress. This is where a separate controller for testing purposes gets handy. And be mindful of your production build deployment timing.

NB: checking in the Xaml files does not restart the controller.

Toolbox setup

In order to drag and drop custom activities from the Toolbox, you can do the following:

  • Create a new Tab in the Toolbox
  • “Choose items” and browse up to the assembly that contains the activities you want
  • They now appear in the Toolbox, you can drag them into your build workflows

Without the project references to the corresponding assemblies, drag and drop would not work…

Finally, everything is set up, and you should be able to edit your builds the easy way. Enjoy!

Visual Studio 2012 makes life much easier for Code Analysis

It may look like it is a small detail, but for me it makes a big change: you can now launch Code Analysis at Solution level!


That means only *one* Code Analysis action before checking in, instead of having to remember which projects were touched in your solution and launching the analysis separately for each project (as we did in VS 2010)!

I’ll take the occasion to talk a bit about Code Analysis configuration.

Per project rule sets

First, let me remind you that rule sets are configurable in the properties of each project, and can vary by configuration (Debug, Release, etc.).


I won’t advise you here on how to organize *your* rules, whether it is best or not to have different rule sets for your projects or one rule set “to rule them all” (sorry, couldn’t help it). It just depends on what works best for you and your teams. Here’s just an example of what can be done:


Sharing rule sets

You can easily make the project point to a rule set file stored on a network share. This is something you really want if you have many projects and solutions in your company.

Another great way to share rule sets is the source controller itself; the path to the rule set is stored in a relative form in the project file:
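In the .csproj, this shows up as a CodeAnalysisRuleSet property holding a relative path, and it can differ per configuration; a sketch (the file names and paths are illustrative):

```xml
<PropertyGroup Condition="'$(Configuration)|$(Platform)' == 'Debug|AnyCPU'">
  <RunCodeAnalysis>true</RunCodeAnalysis>
  <CodeAnalysisRuleSet>..\..\Rules\Company.ruleset</CodeAnalysisRuleSet>
</PropertyGroup>
```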


If you have custom house rules, you can ship them along with your rule set files. You’ll have to edit the rule set file and add the following Xml node:
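That node is a RuleHintPaths entry; here is a sketch of a rule set file carrying it (the names and paths are illustrative):

```xml
<RuleSet Name="Company Rules" Description="House rules" ToolsVersion="11.0">
  <!-- Tells Code Analysis where to find the custom rules assembly -->
  <RuleHintPaths>
    <Path>C:\Shared\Rules\Company.CustomRules.dll</Path>
  </RuleHintPaths>
  <Rules AnalyzerId="Microsoft.Analyzers.ManagedCodeAnalysis" RuleNamespace="Microsoft.Rules.Managed">
    <Rule Id="CA1001" Action="Error" />
  </Rules>
</RuleSet>
```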


Sharing the rules via the source controller (rules are in the project structure) works great for isolated projects and distributed contexts. But if you have a big code base, you have to place your rule files somewhere at the top of your folder hierarchy, or add a mapping entry in all your workspaces. Moreover, it seems you may have trouble using custom dll rules because the RuleHintPaths are absolute and not relative.

The network approach looks easier, especially with custom rules, but you may encounter nasty file load problems. I’m still trying to solve that kind of problem for one of my clients: some computers just don’t manage to execute the rules (I’ll post here when I find the solution).

Code Analysis for the build

The build server will also run Code Analysis, so you have to make sure your rule sets are available to the build process (workspaces, network paths, etc.). Generally, they will be. This is the easy part; you have multiple options:


  • AsConfigured: will obey what you have set up in each project’s Code Analysis settings (see the Enable Code Analysis on build option in the screenshot above)
  • Always: will force Code Analysis for every project, even if the aforementioned option is not checked
  • Never: will force CA not to run…

It is simple and easy: there is no need to create a new project configuration named “Debug with CA”, check the “Enable Code Analysis on build” option in every project, then configure the build to use this configuration. No, we don’t need to do that!

I’d be curious to know how you share your custom rules in your company, feel free to drop a comment.

Batch copying build definitions in TFS 2010

[get the source code of this sample utility]

This post is part of my “merging team projects” series but can be considered independently. I’ll explain and publish a code template that will help you batch copy build definitions from one team project to another. This is all possible thanks to the TFS API.

More than just raw copying

I’m not the first to blog about this, and you can find various small pieces of code here, or here. Why am I bothering then? Because what I intend to do is more than just raw copying: we also need to transpose workspaces, change build template locations, edit build process parameters, edit build templates in the source controller and check them in…


Main features:

  • Regex selection of build names on the command line
    • I recommend using a batch file with every possible filter if there are many
  • 3 log files in append mode (support for batch file with many calls)
  • Workspace transformations (customizable in the code)
    • Easy to report paths that are non-standard according to your own rules
  • Build process parameters transformations (customizable in the code)
  • Copy, transform and check-in files in the source controller (customizable in the code)


Limitations:

  • Single TFS Server (cannot migrate to another server)
  • Need to customize the C# code to get exactly what you want


My starting point was Jim Lamb’s piece of code.

You’ll need to reference a few classic TFS assemblies, including a private one: Microsoft.TeamFoundation.Build.Workflow.dll, which you’ll find in C:\Program Files\Microsoft Visual Studio 10.0\Common7\IDE\PrivateAssemblies. Note that this reference will force you to target the full .NET Framework 4.0 instead of the client profile, but that doesn’t matter much for this kind of utility.


I’ve also included an easy-to-use class for processing command line parameters, C#/.NET Command Line Arguments Parser; many thanks to GriffonRL.

I wrote a small utility class to deal with workspaces, you’ll find it in this project.

Code Highlights

A few pieces of code that can be of interest:

// clone the build into the target team project
IBuildDefinition newDefinition = Utilities.CloneBuildDefinition(_bs, buildDef, targetTeamProject);

As I mentioned earlier, this basically calls Jim Lamb’s piece of code in order to get the build object properly duplicated.

// accessing the build process parameters stored in TFS in a serialized format
IDictionary<String, Object> processParameters = WorkflowHelpers.DeserializeProcessParameters(buildDef.ProcessParameters);
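With the parameters deserialized into a dictionary, a typical transformation edits an entry and serializes it back before saving the clone; a sketch (the parameter value shown is illustrative, adapt the transformations to your own conventions):

```csharp
// tweak a process parameter, then persist the cloned definition
processParameters["BuildNumberFormat"] = "$(BuildDefinitionName)_$(Date:yyyyMMdd)$(Rev:.r)";
newDefinition.ProcessParameters = WorkflowHelpers.SerializeProcessParameters(processParameters);
newDefinition.Save();
```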

Continue reading