Sonar build activity for TFS updated

I’ve just committed a few evolutions to the Sonar activity in the Community TFS Build Extensions project (see part1 and part2 for how to integrate Sonar with TFS). It makes generating the sonar properties file easier by adding new variables. I’ve also added a new way to add or override Sonar properties directly from the build definition.

New variables

The new variables are:

  • BINARIES_DIRECTORY_PATH: the binaries folder of the TFS build (as defined in the build process template)
  • BINARIES_DIRECTORY_PATH_SLASH: same path with slashes instead of backslashes (for Sonar)

Warning: you must now pass a new activity parameter, BinariesDirectory, from the build process template to your Sonar activity instance.

This will hopefully fix a problem with the assemblies path when the solution was not at the root of your workspace: the path is now absolute and should always work, whatever your workspace configuration:


Passing values from the build definition up to Sonar

A new SonarProperties activity property allows you to pass a list of key-value pairs that will be added to (or overridden in) the resulting file. It is now easy to pass custom build parameters directly to Sonar for a specific solution to build.


I find it easier to edit this kind of property list as a String array in the build definition (it can be edited as a text block), so I declare a String[] in the build process template arguments list. As the activity needs a StringList, you can convert the array using the following statement:

[If(SonarProperties Is Nothing, Nothing, New StringList(SonarProperties))]
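For example, to disable FxCop and force the Gallio runner for one particular solution, the array could contain entries like these (this assumes the activity expects key=value pairs; the plugin mode keys below are the usual C# ecosystem ones, so check them against your plugin versions):

```
sonar.fxcop.mode=skip
sonar.gallio.mode=active
```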

New unit tests

I’ve also added several unit tests for the Sonar activity. I used the techniques I talked about in my previous post about unit testing and builds. Feel free to have a look, I’d be glad to have your feedback.

Hope this helps! Smile

How to integrate Sonar with TFS (part 2)

In a recent post of mine, I quickly introduced Sonar and how we can integrate it with TFS, and I gave a few tips for its installation. Now it is time to dive into the build process and add the logic we need to make TFS builds use Sonar conveniently.

Are you really sure?

Using Sonar as I’m proposing will offload some processing from TFS to Sonar, which means TFS loses control over certain things: unit tests are no longer launched by TFS. You *have* to use the Gallio test runner because other test report formats are not *yet* supported. The best bet is then to let Sonar launch the tests through Gallio, which may have side effects!

Code analysis can no longer be configured from your project settings, nor from the build settings (I mean you don’t want to launch it twice), so deactivate it in your builds (pick the Never option) and let Sonar launch FxCop instead.

The tooling is not always up to date: MSTest with VS 2012 was not working when it came out, which means you can be stuck if you upgrade too early. We are in the open source world; we have no guarantee it will work seamlessly with other technologies, unless you contract for commercial support with SonarSource.

You’re still reading? Ok, you’re a pioneer now; you’ll need a good Swiss army knife and some will to make things work in your environment. Hopefully you won’t regret it, because it will pay off.

Build server tooling

Some tools and scripts need to be deployed onto every build server. We aim at xcopy deployment, so I advise opting for a folder structure that you can copy from server to server. I pompously named the parent folder “Sonar.Net”, and here are its contents:


I cheated a bit with some products: I took their installation from the “Program Files” folder on my local machine and copied it into this structure. It just worked for me.

I prefer having a private version of Java running the Sonar analysis. For this, you can just customize the sonar-runner.cmd file and update PATH and JAVA_HOME to point to the subfolder where you uncompressed your JDK (not a JRE). Make sure to use absolute paths here!
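For instance, the top of sonar-runner.cmd could contain something like this (D:\Sonar.Net\jdk is just an example path; point it at the folder where you actually uncompressed the JDK):

```
set JAVA_HOME=D:\Sonar.Net\jdk
set PATH=%JAVA_HOME%\bin;%PATH%
```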

The sonar properties

All your project parameters for the Sonar analysis live in a properties file. You should place this file next to the solution file you want to analyze. I see two options for managing those files:

  1. Create a file for each .NET solution and add it to source control
    • Sounds reasonable if you don’t have too many projects
  2. Generate them automatically!
    • But this requires some build customization

The good news is that I’ve written a build workflow Sonar activity that will generate the properties for you, or at least, help you generate them. It is very simple: it takes a template file and replaces a few values for you. The path to the template file must be configured in the build process workflow.

Here is the template I’ve set up for using with TFS, a sample sonar-properties.template file:

# Project identification

# Info required for Sonar

# If multiple solutions in folder, use the following line to disambiguate


sonar.fxcop.installDirectory=D:/Sonar.Net/Microsoft Fxcop 10.0

sonar.gallio.filter=exclude Type:/\.Integration\./


As you can see, some values will be replaced at run time; their descriptions are on the activity documentation page. The trick with TFS builds is that the output folder for projects is forced to a “Binaries” folder outside the scope of your sources! This template assumes it is running from a TFS build.

It should work locally

Fortunately, you don’t have to run builds to test all these tools. First, create such a properties file based on your values for a project you want to test. Make sure you comment out the sonar.dotnet.assemblies and sonar.dotnet.test.assemblies properties, since the Sonar C# Ecosystem guesses them correctly when TFS is not overriding the output paths. Then, in a command prompt, after having compiled the project, move to your project folder and invoke the sonar-runner from there. It should work.
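Concretely, the local test session boils down to two commands (the paths are examples; adapt them to your own workspace and tooling folders):

```
cd /d C:\Workspaces\MySolution
D:\Sonar.Net\sonar-runner\bin\sonar-runner.cmd
```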

Once this works for you, you are close to making it run in TFS builds, because all we need now is to launch this command line from our builds and, if needed, generate the properties dynamically.

Modifying your build template

For this, you need to get the latest Community TFS Build Extensions and deploy them into your build controller’s custom assemblies source control folder. See my guide here if you’re not at ease with setting up the solution for editing build templates. You may clone the DefaultTemplate.xaml and start editing it. Once you’re ready to inject the Sonar activity in your build template, locate the Run On Agent => Try Compile, Test, and Associate Changesets and Work Items activity; it contains the Sequence illustrated below. You should be able to drag the “Sonar” activity as indicated.


A nice and very simple idea is to add a boolean workflow Argument named “RunSonarAnalysis”.


Then encapsulate the Sonar activity into an If activity.


If you add the proper Metadata (locate the Metadata Argument, edit it, and add the RunSonarAnalysis argument in the list) for this parameter, you’ll be able to control the Sonar execution from your build definition! That is the start of a real integration.

Finally, edit the Sonar activity properties, and you’re all set!



Now you can decide to run Sonar from your build definitions!

You may add custom parameters (with Metadata) to your workflow and pass them directly to the Sonar activity. This would allow you to pass values such as “active” or “skip” to enable or disable the plugins of your choice, on a per-project basis.

It is not as complicated as it sounds to run Sonar from TFS builds. There are things that can be done better, so stay tuned for future improvements with this activity!

Testing your TFS 2012 build workflow activities

I recently posted about setting up a solution for editing build workflows (with TFS 2012). Today I’m going to write directly about testing your custom activities, because I think good guides about writing them are already out there, and there (even though they talk about TFS 2010, the logic is quite the same). We’ll have a look at how we can test activities without actually starting a real build.

I’m currently writing an activity for launching Sonar during TFS builds (you can learn about Sonar with TFS here). Although I’m not finished writing the whole thing, I can still share a few tips I learned for testing activities.

Classic testing approach

The approach is very classic: set up a test context, exercise the test, and check the results (and clean up).

The tests rely on the ability, given by Workflow Foundation, to host your own workflow engine. So we’ll mainly need two things:

  • Create the right workflow context to emulate the behavior of build (only what’s needed for the test of course)
  • Create an instance of the activity, passing necessary parameters and stubs to fulfill the test

During our test, the workflow engine will run the activity. The activity will use the objects and values from its parameters and interact with the context. Then we can add checks about what has actually happened. Ready?

Workflow activities testing tips

Creating the activity and executing it

You can instantiate a workflow activity with a classic new statement. Literal parameters can be passed in the constructor or assigned through properties (literals here are value types and Strings). All other objects must be passed in a Dictionary<String, Object> structure at invocation time.

// constants (literals)
var activity = new Sonar
{
    // this is a String (a literal)
    SonarRunnerPath = SonarRunnerPath,
    // this is a boolean (a literal as well)
    FailBuildOnError = FailBuildOnError,
    GeneratePropertiesIfMissing = GeneratePropertiesIfMissing,
    SonarPropertiesTemplatePath = TemplatePropertiesPath,
    FailBuildOnAlert = FailBuildOnAlert,
    // StringList is not a workflow literal
    // the following line will cause an exception at run time
    ProjectsToAnalyze = new StringList("dummy.sln")
};



Here, all values are booleans or strings, except one StringList that will cause an error at run time (so we must remove it). Here’s how to invoke the activity (actually a workflow composed of a single activity) and pass the StringList as an argument:

// object variables
var parameters = new Dictionary<string, object>
{
    { "ProjectsToAnalyze", new StringList("dummy.sln") }
};

// the workflow invoker, our workflow is composed of only one activity!
var invoker = new WorkflowInvoker(activity);
// executes the activity
invoker.Invoke(parameters);


Tracking build messages

You may want to check what your activity is logging, you know, when you call the TrackBuildMessage method or use the WriteBuildMessage (or Warning or Error) activity. To do this, you need to set up a recorder, or more exactly a TrackingParticipant. Here is a TrackingParticipant-derived class that is specialized in recording build messages:

A build message tracking class
namespace MyBuilds.BuildProcess.Tests
{
    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Text;
    using System.Activities.Tracking;
    using Microsoft.TeamFoundation.Build.Workflow.Tracking;
    using Microsoft.TeamFoundation.Build.Workflow.Activities;

    /// <summary>
    /// BuildMessageTrackingParticipant that logs build messages during build workflow activities
    /// </summary>
    public class BuildMessageTrackingParticipant : TrackingParticipant
    {
        private StringBuilder _sb = new StringBuilder(4096);

        public override string ToString()
        {
            return _sb.ToString();
        }

        protected override void Track(TrackingRecord record, TimeSpan timeout)
        {
            var buildMessage = record as BuildInformationRecord<BuildMessage>;
            if (buildMessage != null && buildMessage.Value != null)
            {
                _sb.AppendLine(buildMessage.Value.Message);
            }

            var buildWarning = record as BuildInformationRecord<BuildWarning>;
            if (buildWarning != null && buildWarning.Value != null)
            {
                _sb.AppendLine(buildWarning.Value.Message);
            }

            var buildError = record as BuildInformationRecord<BuildError>;
            if (buildError != null && buildError.Value != null)
            {
                _sb.AppendLine(buildError.Value.Message);
            }
        }
    }
}

To use it, all you need is to instantiate it and pass the instance to the workflow invoker:
var workflowLogger = new BuildMessageTrackingParticipant();
invoker.Extensions.Add(workflowLogger);

After the test, you can get the build “log” by calling the .ToString() method on the workflowLogger instance.

Setting up a custom IBuildDetail instance

During builds, activities regularly get the “build details”, an IBuildDetail instance that contains lots of useful contextual data. This instance comes from the workflow context, and activities get it using code that looks like the following:
IBuildDetail build = this.ActivityContext.GetExtension<IBuildDetail>();

Thankfully, it is an interface, so it is very easy to stub. I like to use the Moq mocking framework because it is very easy to use (not the most powerful, but perfect for classic needs). Now we need to create a stub out of the IBuildDetail interface, customize it for our needs, and inject it into the workflow “context”. I’ll actually assemble multiple stubs together, because I also need to set up the name of the build definition for my activity (yes, the activity uses the current build definition name!):
// the build definition stub that holds its name
var buildDefinition = new Mock<IBuildDefinition>();
buildDefinition.Setup(d => d.Name).Returns("My Dummy Build");

// a build detail stub with the build number, build definition stub and log location
var buildDetail = new Mock<IBuildDetail>();
buildDetail.Setup(b => b.BuildNumber).Returns("My Dummy Build_20130612.4");
buildDetail.Setup(b => b.BuildDefinition).Returns(buildDefinition.Object);
buildDetail.Setup(b => b.LogLocation).Returns(Path.Combine(TestContext.TestDeploymentDir, "build.log"));

// pass the stub to the invoker extensions
invoker.Extensions.Add(buildDetail.Object);

Now the activity “thinks” it is using real “build details” from the build, but during tests we are using a fake object with just the necessary values for the test to pass. So this is actually a purely classic stubbing scenario, no more.

Passing TFS complex objects

Unfortunately, not all of the objects and classes we need in build activities are interfaces or pure abstract classes, which would be easy to stub. For objects such as a Workspace, a VersionControlServer, a WorkItemStore, or a WorkItemType, you have to use more powerful stubbing frameworks such as Microsoft Fakes or Typemock.
Let’s use Fakes, since it is available in the Visual Studio Premium edition.
First, locate the assembly that our target type belongs to. The Workspace class belongs to the Microsoft.TeamFoundation.VersionControl.Client assembly. Right-click it in the References of your project and add a Fakes assembly:
Fakes processes all types and members of this assembly and dynamically generates a new reference which contains “empty” objects, with all overridable properties and members, compatible with the original types. All types are prefixed with Shim or Stub, and method names include the types of their signatures. Here is an example that illustrates how to set up a Workspace “Shim”. When we call the GetLocalItemForServerItem method, it will return the value we want, that is LocalSolutionPath:
// our workspace shim
ShimWorkspace workspace = new ShimWorkspace()
{
    // we override String GetLocalItemForServerItem()
    // and have it return a value of our own for the test
    GetLocalItemForServerItemString = (s) => LocalSolutionPath
};

To pass the actual Workspace-compatible object to our activity as a parameter, use its .Instance property. Since it is not a workflow literal, let’s use the Dictionary like we did before:
// object variables
var parameters = new Dictionary<string, object>
{
    { "BuildWorkspace", workspace.Instance },
    { "ProjectsToAnalyze", new StringList("dummy.sln") }
};


Ok, we’ve covered a few techniques that should allow you to test most activities now. When I’m satisfied with the tests I’m currently writing, I’ll publish them in the Community TFS Build Extensions project. Keep an eye on them if you’re interested in a full running piece of code; sorry to make you wait!

How to integrate Sonar with TFS (part 1)

Hi! Today, I’ll briefly introduce Sonar (recently renamed SonarQube) and give a few tips on how to deploy it on Windows, with the goal of integrating it with TFS just after.

Sonar in a nutshell

Sonar is mainly a Web portal that stores everything about your builds and helps you navigate through all this data. Quality metrics are gathered into a central database by plugins wrapping various tools (which may not ship with Sonar). The Web portal is composed of customizable dashboards, made out of customizable widgets, which can display data in various forms, with the ability to easily compare with previous builds, or see the progression over the last days or months. A drill-down logic starting from any metric (such as lines of code, violations, unit tests and coverage, etc.) allows you to pinpoint the projects, files, and lines of code at the origin of their values. Various plugins (some of them commercial) are available: they can group projects and aggregate their data, or show stats per developer, for example. You can define quality profiles and select the rules you want to apply to your projects (each rule is tied to a plugin), and create alerts when certain conditions are met (too many violations, or coverage too low, for the simplest).


Shot taken from

Why Sonar and TFS?

Because Sonar is a great complement to TFS. It is not always easy to get the exact report you want out of TFS: you’ll find Reporting Services and Excel reports which have to be set up with date ranges and solution filters, so you may have spent quite some time configuring a SharePoint dashboard. You can’t easily set thresholds that fail your builds according to various metric conditions. I mean, all of this is possible because TFS is highly customizable, but it is not centralized in a single full-featured UI, and it requires using various products or technologies. Builds cannot be compared to each other (only their durations, and that GUI is fixed). While Excel shines at connecting to the TFS warehouse or cube, you need to be an Excel dude to navigate, slice, aggregate, and compare data about build results. Third-party tools don’t store their data in the build reports in a structured way, so you won’t get their metrics directly in the cube. While all this is possible with TFS, really, it is not there as easily as we would want, and that is why Sonar is becoming so popular in the .NET world (and not just with TFS).

Keep in mind that TFS is about so much more than Sonar. TFS links Work Items to code, giving you insight into the real semantics of your projects (the influence of bugs and requests, for example). Sonar focuses *only* on the quality of your code, instantly and over time.

We all know that Sonar is a Java application, so it is evil by essence (just kidding Winking smile), but it proves useful even in the .NET world. Thanks to the hard work of a few pioneers who wrote the Java plugins that launch our favorite everyday tools (FxCop, StyleCop, Gendarme) and test frameworks (through Gallio and various coverage technologies), there it is, waiting for us.

The plan to integrate Sonar

Integrating Sonar means that our TFS Builds will launch a Sonar analysis on our projects.


For simplicity’s sake, I haven’t represented TFS components such as build controllers, agents, etc. What is important here is that the TFS build calls something named the “Sonar runner”. The Sonar runner launches a JVM with a bootstrap that runs each plugin you have configured in your Sonar server. Each Sonar plugin then launches the appropriate native tools, gets their results, and publishes them to the Sonar server. The data is stored in the Sonar database.

Installing Sonar

I’m not actually going to guide you through the whole installation. There is already pretty good documentation for this, and I’m not the first to talk about Sonar under Windows; see also this install guide. What is sure is that you’ll need to install the SonarQube server, the Sonar runner, and then the plugin suite named C# Ecosystem.

Nevertheless, I will give you a few tips and sample configuration blocks that will help you. Naturally, I installed Sonar against a SQL Server 2008 R2 database Smile, so create an empty database and configure the server this way:

sonar.jdbc.username:     <sql server user>
sonar.jdbc.password:     <password>

sonar.jdbc.url: jdbc:jtds:sqlserver://myserver;SelectMethod=Cursor;instance=SONARINSTANCE;databaseName=Sonar

# Optional properties
sonar.jdbc.driverClassName: net.sourceforge.jtds.jdbc.Driver

You’ll need a jTDS JDBC driver to use SQL Server; it is included in the Sonar server distribution (cool!), in the extensions\jdbc-driver\mssql folder. I’m not used to creating SQL Server security accounts, since I always go with integrated security (I find managing passwords a prehistoric and insecure practice), but I guess I have no choice.
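If you have to create that SQL account anyway, a minimal script could look like this (the database, login, and password are placeholders; note that Sonar expects a case-sensitive, accent-sensitive collation such as SQL_Latin1_General_CP1_CS_AS):

```sql
CREATE DATABASE Sonar COLLATE SQL_Latin1_General_CP1_CS_AS;
CREATE LOGIN sonar WITH PASSWORD = '<password>';
USE Sonar;
CREATE USER sonar FOR LOGIN sonar;
EXEC sp_addrolemember 'db_owner', 'sonar';
```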

The LDAP plugin works well; you can even get the groups your users belong to from Active Directory.

Here is the configuration that I used with my AD (I spent a few hours making it work, so I hope it will help):

# LDAP configuration
sonar.security.realm: LDAP
sonar.security.savePassword: false
sonar.authenticator.createUsers: true

ldap.url: ldap://
ldap.user.baseDn: ou=USERS,dc=mydomain,dc=com
ldap.user.request: (&(objectClass=user)(sAMAccountName={login})) 
ldap.user.realNameAttribute: cn
ldap.user.emailAttribute: mail
ldap.bindDn: CN=sonarsvcaccount,OU=SERVICES ACCOUNTS,DC=mydomain,DC=com  
ldap.bindPassword: sonarsvcpassword
ldap.group.baseDn: OU=GROUPS,dc=mydomain,dc=com
ldap.group.request: (&(objectClass=group)(member={dn}))

It is horrible and terrible, I know: I could not avoid putting the Sonar service account password in the configuration file, so protect this file!

Finally, set up Sonar as a service, of course (with the aforementioned Sonar service account).

That’s all for today folks! Next post I’ll talk about all the build and analysis stuff!

Build your old VB6 projects with TFS 2012

or How to edit a TFS 2012 build process template

A few weeks ago, I blogged about using VB6 with TFS 2012. The build process I proposed relied on an MSBuild script that you had to check in to TFS. This script was responsible for calling VB6 on the command line and generating your VB executables. I just felt like something was left unexplored there, and I wanted to provide a slightly more sophisticated alternative for building your VB6 apps. So let’s customize the build workflow for the sake of our VB6 projects!

This post is also a tutorial for editing a TFS 2012 build process template!

The plan

We will modify the Default Template to make it compatible with VB6 projects. In the Solutions to Build process parameter, we want to be able to pass a list of .NET solutions (.sln files), but also .vbp files. We will rely on some external activities to invoke VB6 compilation commands.

We’ll try to support the Outputs clean option of regular builds, because it is useful for continuous integration.

Set up your environment

We’ll be using the latest Community TFS Build Extensions; download them and place the appropriate binaries in the custom assemblies folder that you have configured on your controller. If you don’t know what I’m talking about, check out my “A small guide to edit your TFS 2012 build templates” and I hope you’ll be sorted.

Duplicate the Default Template and rename it to something like DefaultTemplateVb6. If you want to be able to edit both files at once, follow this procedure I wrote to avoid unwanted compile errors.

Build process changes

Open your build process template and go to Run On Agent => Try Compile, Test, and Associate Changesets and Work Items. Now look for the Try to Compile the Project TryCatch activity; double-click on its icon to “focus” the workflow editor on this subcontent – it’s just more comfortable.

We are here at the heart of the loop that calls MSBuild on every project to build. We will need to store the result of the last VB6 compilation, so we need a variable in the scope of the Compile the Project block. Add a vbReturnCode variable as follows:


We want to run MSBuild on regular projects and launch VB6 on .vbp projects, so we’ll add an If activity to filter out VB6 projects. Insert it just before the Run MSBuild for Project activity.

Add a Sequence in the Then block, and drag the existing MSBuild activity into the Else block. Actually, recreate the following piece of workflow:


Remember to add the custom activities in the toolbox in order to use them in the workflow (right-click and Choose Items…).


  • If “If Project Is VB6”
    • Condition: localProject.EndsWith(“.vbp”, StringComparison.InvariantCultureIgnoreCase) Or localProject.EndsWith(“.vbg”, StringComparison.InvariantCultureIgnoreCase)
    • Then
      • Sequence “Process VB6 Project”
        • VB6
          • see picture below for parameters
        • If “If VB Return Code” (see picture below)
          • Condition: vbReturnCode = 0
          • Then
            • SetBuildProperties “Set Compilation Status to Succeeded”
              • PropertiesToSet: CompilationStatus
              • CompilationStatus: BuildPhaseStatus.Succeeded
          • Else
            • SetBuildProperties “Set Compilation Status to Failed”
              • PropertiesToSet: CompilationStatus
              • CompilationStatus: BuildPhaseStatus.Failed
    • Else
      • move the existing MSBuild activity there


Now here is the content of the If block:


And the VB6 activity parameters:



Check in the whole thing and verify that it compiles on a single solution.

Cleaning projects

In order to support the partial clean of builds (Clean Workspace set to Outputs), we have to filter the VB6 projects out of the process. VB6 executables are produced directly in the BinariesFolder, which is emptied automatically. The following actions will prevent nasty errors when cleaning outputs (because our .vbp files aren’t MSBuild files).

First, navigate to Run On Agent => Initialize Workspace, look for the “Clean Project” Sequence, then encapsulate the If File.Exists(Project) activity with the same filter as we used previously:



The full condition text is:

Not localBuildProjectItem.EndsWith(".vbp", StringComparison.InvariantCultureIgnoreCase) And Not localBuildProjectItem.EndsWith(".vbg", StringComparison.InvariantCultureIgnoreCase)


Finally, set the process parameter Solution Specific Build Outputs to True to avoid a big mess in the drop folder. And voilà: you can now mix and build regular solutions and VB6 projects with the same build definition! Smile


Above is what I obtain in the Drop folder, enjoy your builds!

Conflicts when editing multiple build process templates with TFS 2012

If you followed my small guide, or anyway if you have multiple build process templates in a single solution, you may have encountered this error when compiling with Visual Studio:


“obj\Debug\TfsBuild_Process_BeforeInitializeComponentHelper.txt” was specified more than once in the “Resources” parameter. Duplicate items are not supported by the “Resources” parameter.


This is actually easy to understand. When you add a Xaml workflow file to your solution, Visual Studio sets its properties so that it is part of the compilation:


Visual Studio generates code and resources out of the Xaml file, and each Xaml process template has its own class name and namespace (full name). The problem is that, generally, TFS build process template files share the same full class name, so there are collisions. This class name is usually TfsBuild.Process (hence the name of the TfsBuild_Process_BeforeInitializeComponentHelper.txt resource), because we often duplicate existing build templates.


I’ve seen colleagues work around this by changing the Build Action, setting it to None:


But then you lose the checks performed by the compile process. There is something better to do: simply rename the class or namespace inside the Xaml file:

  • Open the file with the XML Editor (right-click => Open With… => XML (Text) Editor)
  • Look for the x:Class attribute on the first line


  • Rename the namespace to something meaningful to you (e.g. TfsBuild.Process => TfsBuildUpgrade.Process)
  • On the same line there is another mention of the namespace; rename it there too:


Hopefully, it should now compile fine:

1>------ Rebuild All started: Project: BuildProcessEditing, Configuration: Debug Any CPU ------

1> BuildProcessEditing -> C:\Sources\MyBuildProcess\BuildProcessEditing\bin\Debug\BuildProcessEditing.dll

========== Rebuild All: 1 succeeded, 0 failed, 0 skipped ==========

And the build should still work Winking smile. This way you can put all your build files in the same solution, which is more comfortable! I consider it good practice to fix every new build process template I create. Hope this helps!

Small guide to using VB6 with TFS 2012

I’ve recently been working for a client with lots of VB6 projects. The fun part is that we wanted to migrate from VSS to TFS 2012. Although VB6 is no longer supported by Microsoft, there is no reason why TFS would not work for VB6: you can host Java in TFS, right? If you have VB6 projects and want to plug them into TFS and have them built with continuous integration in mind, then I hope this small guide will help you.

What to install

First, check that you have VB6 with SP6, and the mouse wheel fix as well; I won’t spend more time here since you’re already using it.

You’ll need to install the Visual Studio 2012 edition of your choice, with the latest update (at this time, Update 2). Then the famous TFS Power Tools, which add nice check-in policies (and more). Finally, you’ll need the MSSCCI Provider for Visual Studio 2012, in its 32-bit or 64-bit version.

After installing, check that inside VS 2012 => TOOLS => Options => Source Control, the Plug-in Selection value is set as follows:


If you still have Visual SourceSafe around

Warning: the MSSCCI provider reroutes VB6 source control interactions to Team Foundation Server. To connect back to VSS, you need to perform some registry operations. Fortunately, small utilities will do that for you, by listing all the MSSCCI providers available on your machine and letting you choose which one is active. So you’ll be able to switch back and forth easily between VSS and TFS. This one worked for me => SCPSelector.exe.
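For the record, to my understanding these selector utilities simply rewrite the active provider value in the registry, along these lines (the VSS path is an example; each installed provider registers its own key):

```
[HKEY_LOCAL_MACHINE\SOFTWARE\SourceCodeControlProvider]
"ProviderRegKey"="Software\\Microsoft\\SourceSafe"
```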


SCPSelector in action


Unwanted prettify options for VB6

Visual Studio 2012 *doesn’t know* about VB6, it knows about VB.NET!

When you are merging files, you don’t want VS 2012 to make assumptions about your syntax, and even less to *modify your VB6 code*. Make sure you uncheck those options in the TOOLS => Options => Text Editor => Basic => VB Specific menu:


How to map the sources

Local workspaces are great, but the MSSCCI provider is not happy with them. You’ll have to use the traditional server workspaces. Well, it’s not a big deal.

Ok, so let’s create server workspaces and map our VB6 sources from TFS. Now, I want to develop with VB6, but when I open the project with VB6, I get asked to add my project to TFS, doh! Actually, I’d like me and my users to open up any VB6 project as smoothly as possible. To achieve that, you have to edit the MSSCCPRJ.SCC files (or create them); they contain the necessary MSSCCI data to connect to the proper source control. The bad news is that you can’t share those files! They are specific to your login and your workspace, so adding them to source control is useless! Sad smile


VbTfsBinding will do the work for you

I wrote a small utility that will generate all those files for you. Copy it to the root of your workspace and it will generate an MSSCCPRJ.SCC file for every .vbp file in your workspace. Now you can open any VB6 project in your workspace without being annoyed by any configuration message box.

Here you are with the source code, and the executable (use it at your own risk, of course).

So here’s a few tips for using VbTfsBinding:

  • Configure the config file with your TFS Project Collection URL
  • To share the tool, you can include it in your TFS source controller, in a subfolder at the root of your workspace (or branch), and add a .cmd file that changes the current directory and launches “VbTfsBinding.exe /force”
  • The /force flag will overwrite read-only files, in case someone checks in a MSSCCPRJ.SCC file
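To give an idea of what the tool does, here is a rough Python sketch of the same logic. The MSSCCPRJ.SCC body written below is a placeholder comment, not the real MSSCCI binding format that VbTfsBinding produces, and the function name and parameters are mine:

```python
import os
import stat

def generate_scc_files(root, collection_url, force=False):
    """For every folder under root containing .vbp files, write a sibling
    MSSCCPRJ.SCC (placeholder content -- the real tool writes the MSSCCI
    binding data for your login, workspace and project collection URL)."""
    written = []
    for dirpath, _dirnames, filenames in os.walk(root):
        vbps = [f for f in filenames if f.lower().endswith(".vbp")]
        if not vbps:
            continue
        target = os.path.join(dirpath, "MSSCCPRJ.SCC")
        if os.path.exists(target):
            if not force:
                continue  # leave existing files alone by default
            # /force behavior: clear the read-only flag before overwriting
            os.chmod(target, stat.S_IWRITE | stat.S_IREAD)
        with open(target, "w") as f:
            f.write("; MSSCCI binding placeholder for %s\n" % collection_url)
            for vbp in vbps:
                f.write("; project: %s\n" % vbp)
        written.append(target)
    return written
```

The real tool does the equivalent per workspace, which is why the generated files must not be checked in: they only make sense for the machine and login that generated them.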

Ok, now the basic source control features of TFS are usable directly from VB6, but I would advise always checking in from VS 2012. This allows you to make sure you don’t forget files in your changesets, you can review all your pending changes at a glance, and I feel more secure that way.

Building your VB6 executables

Now the fun part. Our goal is to call VB6 on as many VB6 projects as we want to build. The command line is:

Vb6.exe /m Projects.vbg /out Projects.vblog

Where Projects.vbg is a project group file containing the list of projects we want to build; /m tells VB6 to make the group and exit, and /out redirects the compilation messages to a log file.
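For reference, a VB6 group file is just a small text file; a minimal Projects.vbg looks roughly like this (project names and relative paths are illustrative, check against a group file saved by VB6 itself):

```
VBGROUP 5.0
StartupProject=App\Project1.vbp
Project=Lib\Project2.vbp
```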

Let’s follow the path of the Lazy: use a simple MsBuild .proj file to encapsulate the VB6 compilation logic, and rely on the default DefaultTemplate.xaml of TFS to do the rest.

First, prepare your VB6 group file and check it in next to your projects in source control (paths are relative). You can verify the compilation with VB6 on your machine.

Then, add the following MsBuild file next to the .vbg file; let’s call it Projects.proj:

<Project ToolsVersion="4.0" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">

  <PropertyGroup>
    <VBPath>C:\Program Files\Microsoft Visual Studio\VB98\VB6.exe</VBPath>
    <VBPath Condition="Exists('C:\Program Files (x86)\Microsoft Visual Studio\VB98\VB6.exe')">C:\Program Files (x86)\Microsoft Visual Studio\VB98\VB6.exe</VBPath>
  </PropertyGroup>

  <Target Name="Build">
    <!-- Compile the project group; VB6 writes its messages to Projects.vblog -->
    <Exec ContinueOnError="True" Command='"$(VBPath)" /m Projects.vbg /out Projects.vblog'>
      <Output TaskParameter="ExitCode" PropertyName="VBExitCode"/>
    </Exec>

    <!-- Echo the VB6 log into the MsBuild log so it shows up in the build summary -->
    <Message Text="---- Start VB Output ------------------------------------------------------------" />
    <Exec ContinueOnError="True" Command="type Projects.vblog" />
    <Message Text="---- End of VB Output -----------------------------------------------------------" />
    <Error Condition="'$(VBExitCode)' != '0'" Text="Fatal error because of VB exit code." />

    <!-- Copy the produced executables to the drop folder -->
    <ItemGroup>
      <VBBinaries Include="**\*.exe" />
    </ItemGroup>
    <Copy Condition="'$(OutDir)' != ''" SourceFiles="@(VBBinaries)" DestinationFolder="$(OutDir)" />
  </Target>

  <Target Name="Clean">
    <ItemGroup>
      <VBBinaries Include="**\*.exe" />
    </ItemGroup>
    <Delete Files="Projects.vblog" />
    <Delete Files="@(VBBinaries)" />
  </Target>

</Project>
Now, create a new build definition based on the Default Template, make sure you have a Drop folder location, then set the process parameters as follows (transposing the .proj file path to yours):


The nice thing is that this build is actually pretty standard: it is a build workflow that calls MsBuild for compilation, so there is naturally an MsBuild log file in the build Summary:


You’ll find your VB6 applications in the Drop Folder:


It’s up to you to set this build as a continuous integration build, or to schedule it, you know, it’s just a TFS build Winking smile 

Do you feel any better than with VSS? Enjoy! Smile

Merging feature branches for QA with TFS (part 2)

In my previous post, we had a look at the problem: how to merge sibling branches with TFS while minimizing the number of conflicts. Yes, the number of conflicts you’ll get depends on the “technique” you’ll be using:

  • TFS baseless
  • Shelvesets
  • Other tools?

All your base are belong to TFS

But first, let’s answer the real question: what exactly should be the base for our 3-way merge?

Let me recap the scenario:

  • branch dev A from Main
  • Main evolves
  • branch dev B from Main
  • Main evolves
  • Both dev A and dev B have evolved as well
  • As a best practice, we integrate latest Main content into dev A and dev B
  • Now we want to test dev A and dev B with a single test campaign, we want to merge them all together, but leave Main intact
  • branch QA from Main, from the version that has been merged into dev A and dev B



The base we need is the latest version of Main that has been merged into dev A and dev B. You must have merged the same version of Main into both branches, of course. The QA branch needs to be branched from that same version of Main as well. These conditions are common practice and should not be a problem.

Here the base is not the most recent common ancestor (or it depends on what you call an ancestor). It is easy to understand: I want to merge “the diff between latest Main and dev B” into dev A. And dev A’s evolutions must be compared to that latest merged version of Main as well.
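To make the role of the base concrete, here is a toy, line-aligned 3-way merge in Python. This is a simplified sketch for illustration only (real merge tools align lines first and handle insertions and deletions; the function and variable names are mine, not anything TFS or KDiff3 exposes):

```python
def three_way_merge(base, ours, theirs):
    """Toy 3-way merge over already line-aligned lists of lines."""
    merged = []
    for b, o, t in zip(base, ours, theirs):
        if o == t:        # both sides agree (or neither changed the line)
            merged.append(o)
        elif o == b:      # only "theirs" changed this line
            merged.append(t)
        elif t == b:      # only "ours" changed this line
            merged.append(o)
        else:             # both changed it differently: a real conflict
            merged.append("<<< CONFLICT: %s ||| %s >>>" % (o, t))
    return merged

# base = the version of Main merged into both dev branches
base  = ["line1", "line2", "line3"]
dev_a = ["line1-A", "line2", "line3"]   # dev A edited line 1
dev_b = ["line1", "line2", "line3-B"]   # dev B edited line 3
print(three_way_merge(base, dev_a, dev_b))  # ['line1-A', 'line2', 'line3-B']
```

If you picked an older version of Main as the base instead, lines that dev A and dev B merely inherited from a later Main merge would both look like local edits, and you would get spurious conflicts: that is exactly why the base must be the latest version of Main merged into both branches.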

External tools

It is not possible to choose your base when you merge with TFS (by the way, I’d be curious to know which VCS lets you choose a custom base when merging).

So let’s perform our merge “outside of TFS”. Is that bad? In the end, you won’t have any merge history between those branches, but do we really need that? What looks important to me is to keep the dev history on the dev branches for a little while, for reference, and to keep the QA branch’s future merge into Main easy.

3-way folder merge procedure

Use a local workspace that maps the QA branch. Also map dev A, dev B, and the Main branch in any workspace (at the version you merged into dev A and dev B, if ever Main has evolved further).

Merge dev A into QA with a baseless merge (easy when using VS 2012 and TFS 2012, remember the last post?). Take the Source version for every conflict (easy merge, isn’t it?); you can select all conflicts and choose that option at once.

Let’s now use KDiff3, and its wonderful folder 3-way merge feature:


KDiff3 is ugly, but it is the best merge tool I know at the moment. It is just quite clever and has nice features:


Note that you will also lose the renames during the process, which will break the history of the renamed files. You can perform the renames in the Source Control Explorer if you like (do this before resolving the merge, and rescan afterwards).

When finished, the local workspace (new with TFS 2012) is your friend: it will detect what has been changed, added and deleted in the Pending Changes window:


The final tradeoff is:

  • You get fewer conflicts than when using TFS (even with a baseless merge, as explained in the previous post)
  • You break the chain of history, partially
    • In my eyes dev history is not very important (I’d be glad to debate this); I mean, not as important as maintenance history!

If you have a large code base to merge, that should be worth it! Happy merging Smile