How to integrate Sonar with TFS (part 2)

In a recent post, I quickly introduced Sonar and how we can integrate it with TFS, and I gave a few tips for its installation. Now it is time to dive into the build process and add the logic we need to make TFS builds use Sonar conveniently.

Are you really sure?

Using Sonar as I’m proposing will offload some processing from TFS to Sonar, which means TFS loses control over certain things: unit tests are no longer launched by TFS. You *have* to use the Gallio test runner, because other test report formats are not *yet* supported. The best bet is then to let Sonar launch the tests through Gallio, which may have side effects!

Code analysis can no longer be configured from your project settings, nor from the build settings (you don’t want to launch it twice), so deactivate it in your builds (pick the Never option) and let Sonar launch FxCop instead.

The tooling is not always up to date: MSTest with VS 2012 was not working when it came out, which means you can be stuck if you upgrade too early. We are in the open source world; we have no guarantee everything will work seamlessly with other technologies, unless you contract for commercial support with SonarSource.

You’re still reading? OK, you’re a pioneer now; you’ll need a good Swiss army knife and some will to make things work in your environment. Hopefully you won’t regret it, because it will pay off.

Build server tooling

Some tools and scripts need to be deployed onto every build server. We aim for xcopy deployment, so I advise opting for a common folder structure that you copy from server to server. I pompously named the parent folder “Sonar.Net”, and here is its contents:

[screenshot: the contents of the Sonar.Net folder]

I cheated a bit with some products: I took their installation from the “Program Files” folder of my local machine and copied it into this structure. It just worked for me.

I prefer having a private version of Java run the Sonar analysis. For this, you can simply customize the sonar-runner.cmd file and update PATH and JAVA_HOME to point to the subfolder where you uncompressed your JDK (not a JRE). Be sure to use absolute paths here!
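For example, the top of a customized sonar-runner.cmd could look like the following (a sketch; the JDK folder name is an assumption, adapt it to your layout):

@echo off
rem use the private JDK deployed in the Sonar.Net folder
set JAVA_HOME=D:\Sonar.Net\jdk1.6.0_35
set PATH=%JAVA_HOME%\bin;%PATH%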

The sonar properties

All your project parameters for the sonar analysis are in a sonar.properties file. You should place this file next to the solution file you want to analyze. I see two options for managing those files:

  1. Create a sonar.properties file for each .NET solution and add it to source control
    • Sounds reasonable if you don’t have too many projects
  2. Generate them automatically!
    • But this requires some build customization

The good news is that I’ve written a Sonar build workflow activity that will generate the properties for you, or at least help you generate them. It is very simple: it takes a template file and replaces a few values for you. The path to the template file must be configured in the build process workflow.
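To give you an idea, the generation logic boils down to something like this (a simplified sketch, not the actual activity code; I’m guessing at the exact semantics of each token, using the names from the template below):

using System.Collections.Generic;
using System.IO;

static class SonarPropertiesGenerator
{
    // Expands the %TOKEN% placeholders of a template into a sonar.properties
    // file written next to the solution under analysis (sketch only).
    public static void Generate(string templatePath, string solutionFolder,
        string buildDefinition, string buildNumberSuffix, string solutionFile)
    {
        var tokens = new Dictionary<string, string>
        {
            { "%BUILD_DEFINITION%", buildDefinition },
            { "%BUILD_DEFINITION_UNDERSCORE%", buildDefinition.Replace(' ', '_') },
            { "%BUILD_NUMBER_DEFAULT_SUFFIX%", buildNumberSuffix },
            { "%SOLUTION_FILE%", solutionFile },
        };

        string content = File.ReadAllText(templatePath);
        foreach (var token in tokens)
        {
            content = content.Replace(token.Key, token.Value);
        }

        File.WriteAllText(Path.Combine(solutionFolder, "sonar.properties"), content);
    }
}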

Here is the template I’ve set up for use with TFS, a sample sonar-properties.template file:

# Project identification
sonar.projectKey=myCompany:%BUILD_DEFINITION_UNDERSCORE%
sonar.projectVersion=%BUILD_NUMBER_DEFAULT_SUFFIX%
sonar.projectName=%BUILD_DEFINITION%

# Info required for Sonar
sonar.sources=.
sonar.language=cs

# If multiple solutions in folder, use the following line to disambiguate
sonar.dotnet.visualstudio.solution.file=%SOLUTION_FILE%
sonar.dotnet.assemblies=$(SolutionDir)/../Binaries/$(AssemblyName).$(OutputType)
sonar.dotnet.test.assemblies=$(SolutionDir)/../Binaries/$(AssemblyName).$(OutputType)
sonar.dotnet.visualstudio.testProjectPattern=*.Tests

sonar.fxcop.mode=active
sonar.gendarme.mode=active
sonar.gallio.mode=active
sonar.ndeps.mode=active
sonar.squid.mode=active
sonar.stylecop.mode=active

sonar.fxcop.installDirectory=D:/Sonar.Net/Microsoft Fxcop 10.0
sonar.fxcop.assemblyDependencyDirectories=$(SolutionDir)/../Binaries

sonar.gallio.filter=exclude Type:/\.Integration\./
sonar.gallio.it.mode=skip
sonar.gallio.installDirectory=D:/Sonar.Net/Gallio
sonar.gallio.coverage.tool=OpenCover
sonar.gallio.runner=IsolatedProcess
sonar.gallio.timeoutMinutes=10
sonar.gallio.coverage.excludes=Microsoft.Practices.*

sonar.opencover.installDirectory=D:/Sonar.Net/OpenCover

As you can see, there are values that will be replaced at run time. You can see their description on the activity documentation page. The trick with TFS builds is that the output folder for projects is forced to a “Binaries” folder outside the scope of your sources! This template assumes it is running from a TFS build.

It should work locally

To test all these tools, fortunately, you don’t have to run builds. First, create such a properties file with your values for a project you want to test. Make sure you comment out the sonar.dotnet.assemblies and sonar.dotnet.test.assemblies properties, since the Sonar C# Ecosystem guesses them right when TFS is not overriding the output paths. Then, after having compiled the project, open a command prompt and move to your project folder. From there, invoke the sonar.net-runner.cmd file. It should work.
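For example, from a command prompt (the paths are made up, assuming the folder layout described above):

cd /d C:\Sources\MySolutionFolder
D:\Sonar.Net\sonar.net-runner.cmd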

Once this works for you, you are close to making it run in TFS builds: all we need now is to launch this command line from our builds, and possibly generate the properties dynamically.

Modifying your build template

For this, you need to get the latest Community TFS Build Extensions and deploy them into your build controller’s custom assemblies source control folder. See my guide here if you’re not at ease with setting up the solution for editing build templates. You may clone the DefaultTemplate.xaml and start editing it. Once you’re ready to inject the Sonar activity into your build template, locate the Run On Agent => Try Compile, Test, and Associate Changesets and Work Items activity; it contains the Sequence illustrated below. You should be able to drag the “Sonar” activity as indicated.

[screenshot: the Sequence where the Sonar activity should be dropped]

A nice and very simple idea is to add a boolean workflow Argument named “RunSonarAnalysis”.

[screenshot: the RunSonarAnalysis boolean workflow argument]

Then encapsulate the Sonar activity into an If activity.

[screenshot: the Sonar activity wrapped in an If activity]

If you add the proper Metadata for this parameter (locate the Metadata argument, edit it, and add the RunSonarAnalysis argument to the list), you’ll be able to control the Sonar execution from your build definition! That is the start of a real integration.

Finally, edit the Sonar activity properties, and you’re all set!

[screenshot: the Sonar activity properties]

 

Now you can decide to run Sonar from your build definitions!

You may add custom parameters to your workflow (with Metadata) and pass them directly to the Sonar activity. This would allow you to pass values such as “active” or “skip” to enable or disable the plugins of your choice, on a per-project basis.

Running Sonar from TFS builds is not as complicated as it sounds. There are things that can be done better, so stay tuned for future improvements to this activity!

Testing your TFS 2012 build workflow activities

I recently posted about setting up a solution for editing build workflows (with TFS 2012). Today I’m going to write directly about testing your custom activities, because I think good guides about writing them are already out there, and there (even though they talk about TFS 2010, the logic is much the same). We’ll have a look at how to test activities without actually starting a real build.

I’m currently writing an activity for launching Sonar during TFS builds (you can learn about Sonar with TFS here). Although I haven’t finished writing the whole thing, I can still share a few tips I learned while testing activities.

Classic testing approach

The approach is very classic: set up a test context, exercise the test, and check the results (and clean up).

The tests rely on the ability, given by Workflow Foundation, to host one’s own workflow engine. So we’ll mainly need two things:

  • Create the right workflow context to emulate the behavior of a build (only what’s needed for the test, of course)
  • Create an instance of the activity, passing necessary parameters and stubs to fulfill the test

During our test, the workflow engine will run the activity. The activity will use the objects and values from its parameters and interact with the context. Then we can add checks about what has actually happened. Ready?

Workflow activities testing tips

Creating the activity and executing it

You can instantiate a workflow activity with a classic new statement. Literal parameters can be passed in the constructor or assigned (literals here are value types and strings). All other objects must be passed in a Dictionary<string, object> at invocation time.

// constants (literals)
var activity = new Sonar
{
    // this is a String (a literal)
    SonarRunnerPath = SonarRunnerPath,
    // this is a boolean (a literal as well)
    FailBuildOnError = FailBuildOnError,
    GeneratePropertiesIfMissing = GeneratePropertiesIfMissing,
    SonarPropertiesTemplatePath = TemplatePropertiesPath,
    FailBuildOnAlert = FailBuildOnAlert,
    // StringList is not a workflow literal
    // the following line will cause an exception at run time
    ProjectsToAnalyze = new StringList("dummy.sln")
};

 

 

Here all values are booleans or strings, except one StringList that will cause an error at run time (so we must remove it). Here’s how to invoke the activity (actually a workflow composed of a single activity) and pass the StringList as an argument:

// object variables
var parameters = new Dictionary<string, object>
{
    { "ProjectsToAnalyze", new StringList("dummy.sln") }
};

// the workflow invoker; our workflow is composed of only one activity!
WorkflowInvoker invoker = new WorkflowInvoker(activity);
// executes the activity
invoker.Invoke(parameters);

 

Tracking build messages

You may want to check what your activity is logging, you know, when you call the TrackBuildMessage method or use the WriteBuildMessage (or Warning, or Error) activity. To do this you need to set up a recorder, or more exactly a TrackingParticipant. Here is a TrackingParticipant-derived class specialized in recording build messages:

A build message tracking class
namespace MyBuilds.BuildProcess.Tests
{
    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Text;
    using System.Activities.Tracking;
    using Microsoft.TeamFoundation.Build.Workflow.Tracking;
    using Microsoft.TeamFoundation.Build.Workflow.Activities;

    /// <summary>
    /// BuildMessageTrackingParticipant that logs build messages during build workflow activities
    /// </summary>
    public class BuildMessageTrackingParticipant : TrackingParticipant
    {
        private StringBuilder _sb = new StringBuilder(4096);

        public override string ToString()
        {
            return _sb.ToString();
        }

        protected override void Track(TrackingRecord record, TimeSpan timeout)
        {
            var buildMessage = record as BuildInformationRecord<BuildMessage>;
            if (buildMessage != null && buildMessage.Value != null)
            {
                _sb.AppendLine(buildMessage.Value.Message);
            }

            var buildWarning = record as BuildInformationRecord<BuildWarning>;
            if (buildWarning != null && buildWarning.Value != null)
            {
                _sb.AppendLine(buildWarning.Value.Message);
            }

            var buildError = record as BuildInformationRecord<BuildError>;
            if (buildError != null && buildError.Value != null)
            {
                _sb.AppendLine(buildError.Value.Message);
            }
        }
    }
}

 
To use it, all you need is to instantiate it and pass the instance to the workflow invoker:
 
var workflowLogger = new BuildMessageTrackingParticipant();
invoker.Extensions.Add(workflowLogger);

 
After the test, you can get the build “log” by calling the .ToString() method on the workflowLogger instance.
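For instance, with MSTest you could then assert on the recorded messages (the expected text here is just an assumption about what the activity logs):

// check that the activity logged what we expected
StringAssert.Contains(workflowLogger.ToString(), "sonar-runner");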
 

Setting up a custom IBuildDetail instance

 
During builds, activities regularly get the “build details”, an IBuildDetail instance that contains lots of useful contextual data. This instance comes from the workflow context, and activities get it by using code that looks like the following:
 
IBuildDetail build = this.ActivityContext.GetExtension<IBuildDetail>();

 
Thankfully, it is an interface, so it is very easy to stub. I like to use the Moq mocking framework because it is very easy (not the most powerful, but perfect for classic needs). Now we need to create a stub out of the IBuildDetail interface, customize it for our needs, and inject it into the workflow “context”. I’ll actually assemble multiple stubs together, because I also need to set up the name of the build definition for my activity (yes, the activity uses the current build definition name!):
 
// the build definition stub that holds its name
var buildDefinition = new Mock<IBuildDefinition>();
buildDefinition.Setup(d => d.Name).Returns("My Dummy Build");

// a build detail stub with the build number, build definition stub and log location
var buildDetail = new Mock<IBuildDetail>();
buildDetail.Setup(b => b.BuildNumber).Returns("My Dummy Build_20130612.4");
buildDetail.Setup(b => b.BuildDefinition).Returns(buildDefinition.Object);
buildDetail.Setup(b => b.LogLocation).Returns(Path.Combine(TestContext.TestDeploymentDir, "build.log"));

// pass the stub to the invoker extensions
invoker.Extensions.Add(buildDetail.Object);

 
Now the activity “thinks” it is using real “build details” from the build, but during tests we are using a fake object with just the necessary values for the test to pass. This is actually a pure classic stubbing scenario, no more.
 

Passing TFS complex objects

 
Unfortunately, not all of the objects and classes we need in build activities are interfaces or pure virtual classes (those are easy to stub). In the case of objects such as a Workspace, a VersionControlServer, a WorkItemStore, or a WorkItemType, you have to use a more powerful stubbing framework such as Microsoft Fakes or Typemock. Let’s use Fakes, since it is available in the Visual Studio Premium edition. First, locate the assembly that our target type belongs to: the Workspace class belongs to the Microsoft.TeamFoundation.VersionControl.Client assembly. Right-click it in the References of your project and add a Fakes assembly:
 
[screenshot: the Add Fakes Assembly command on the reference]
 
Fakes processes all the types and members of this assembly and dynamically generates a new reference which contains “empty” objects, with all properties and members overridable, compatible with the original types. All types are prefixed by Shim or Stub, and methods include the types of their signatures in their names. Here is an example that illustrates how to set up a Workspace “Shim” so that when we call the GetLocalItemForServerItem method, it returns the value we want, that is LocalSolutionPath:
 
// our workspace shim
ShimWorkspace workspace = new ShimWorkspace()
{
    // we override String GetLocalItemForServerItem(String)
    // and have it return a value of our own for the test
    GetLocalItemForServerItemString = (s) => LocalSolutionPath
};

 
To pass the actual Workspace-compatible object to our activity as a parameter, use its .Instance property. Since it is not a workflow literal, let’s use the Dictionary like we did before:
 
// object variables
var parameters = new Dictionary<string, object>
{
    { "BuildWorkspace", workspace.Instance },
    { "ProjectsToAnalyze", new StringList("dummy.sln") }
};

 
 

OK, we have covered a few techniques that should allow you to test most activities. When I’m satisfied with the tests I’m currently writing, I’ll publish them in the Community TFS Build Extensions project, so keep an eye on that project if you’re interested in a full running piece of code. Sorry to make you wait!

How to integrate Sonar with TFS (part 1)

Hi! Today, I’ll briefly introduce Sonar (recently renamed SonarQube) and give a few tips on how to deploy it on Windows, keeping in mind that we’ll integrate it with TFS just after.

Sonar in a nutshell

Sonar is mainly a Web portal that stores everything about your builds and helps you navigate all this data. Quality metrics are gathered into a central database by plugins wrapping various tools (which may not come with Sonar). The Web portal is composed of customizable dashboards, made of customizable widgets, which can display data in various forms, with the ability to easily compare with previous builds, or to see the progression over the last days or months. A drill-down logic starting from any metric (such as lines of code, violations, unit tests and coverage, etc.) allows you to pinpoint the projects, files, and lines of code that are at the origin of their values. Various plugins (some of them commercial) are available: they can group projects and aggregate their data, or show stats per developer, for example. You can define quality profiles and select the rules that you want to apply to your projects (each rule is tied to a plugin), and create alerts when those rules meet certain conditions (too many violations, or coverage too low, to name the simplest).

[screenshot: a SonarQube dashboard]

Shot taken from http://nemo.sonarqube.org

Why Sonar and TFS?

Because Sonar is a great complement to TFS. It is not always easy to get the exact report we want: you’ll find Reporting Services and Excel reports which have to be set up with date ranges and solution filters, so you may have spent quite some time configuring a SharePoint dashboard. You can’t easily set thresholds that fail your builds according to various metric conditions. I mean, while all of this is possible, because TFS is highly customizable, it is not centralized in a single fully featured UI, and it requires using various products or technologies. Builds do not compare to each other (only the duration, and the GUI is fixed). While Excel shines at connecting to the TFS warehouse or cube, you need to be an Excel dude to navigate, slice, aggregate and compare data about build results. Third-party tools don’t store their data into the build reports in a structured way, so you won’t get their metrics directly in the cube. Really, while all this is possible with TFS, it is not there as easily as we would want, and that is why Sonar is becoming so popular in the .NET world (and not especially with TFS).

Keep in mind that TFS is about so much more than Sonar. TFS links work items to code, giving you insight into the real semantics of your projects (the influence of bugs and requests, for example). Sonar focuses *only* on the quality of your code, instantly and over time.

So, we all know that Sonar is a Java application, and therefore evil by essence (just kidding!), but it proves to be useful even in the .NET world. Thanks to the hard work of a few pioneers who wrote the Java plugins that launch our favorite everyday tools (FxCop, StyleCop, Gendarme) and test frameworks (with Gallio and various coverage technologies), there it is, waiting for us.

The plan to integrate Sonar

Integrating Sonar means that our TFS Builds will launch a Sonar analysis on our projects.

[diagram: a TFS build calling the Sonar runner, which feeds the Sonar server and database]

For simplicity’s sake, I have not represented TFS components such as build controllers, agents, etc. What is important here is that the TFS build calls something named the “Sonar runner”. The Sonar runner launches a JVM with a bootstrap that runs each plugin you have configured in your Sonar server. Each Sonar plugin then launches the appropriate native tools, gets their results and publishes them into the Sonar server. The data is stored in the Sonar database.

Installing Sonar

I’m not actually going to guide you through the whole installation. There is already pretty good documentation for this, and I’m not the first to talk about Sonar under Windows (see this install guide as well). What is sure is that you’ll need to install the SonarQube server, the Sonar runner, and then the plugin suite named C# Ecosystem.

Nevertheless, I will give you a few tips and sample configuration blocks that will help you. Naturally, I installed Sonar against a SQL Server 2008 R2 database, so create an empty database and configure the server’s sonar.properties this way:

sonar.jdbc.username:     <sql server user>
sonar.jdbc.password:     <password>

sonar.jdbc.url: jdbc:jtds:sqlserver://myserver;SelectMethod=Cursor;instance=SONARINSTANCE;databaseName=Sonar

# Optional properties
sonar.jdbc.driverClassName: net.sourceforge.jtds.jdbc.Driver

You’ll need a jTDS JDBC driver to use SQL Server; it is included in the Sonar server distribution (cool!), in the extensions\jdbc-driver\mssql folder. I’m not used to creating SQL Server security accounts: since I always go integrated security, I find that managing passwords is a prehistoric and unsecure practice, but I guess I have no choice.

The LDAP plugin works well; you can also get the groups your users belong to in Active Directory.

Here is the configuration that I used with my AD (I spent a few hours making it work, so I hope it will help):

# LDAP configuration
sonar.security.realm: LDAP
sonar.security.savePassword: false
sonar.authenticator.createUsers: true

ldap.url: ldap://mydc1.mydomain.com:389
ldap.user.baseDn: ou=USERS,dc=mydomain,dc=com
ldap.user.request: (&(objectClass=user)(sAMAccountName={login})) 
ldap.user.realNameAttribute: cn
ldap.user.emailAttribute: mail
 
ldap.bindDn: CN=sonarsvcaccount,OU=SERVICES ACCOUNTS,DC=mydomain,DC=com  
ldap.bindPassword: sonarsvcpassword
ldap.group.baseDn: OU=GROUPS,dc=mydomain,dc=com
ldap.group.request: (&(objectClass=group)(member={dn}))

It is horrible and terrible, I know: I could not avoid putting the Sonar service account password in the configuration file, so protect this file!

Finally, set up Sonar as a Windows service, of course (running under the aforementioned Sonar service account).
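If I remember correctly, the distribution ships the Java Service Wrapper scripts for this, so the installation amounts to running something like the following (the exact path depends on your Sonar version and platform, check your distribution):

D:\Sonar\sonar-3.4\bin\windows-x86-64\InstallNTService.bat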

That’s all for today folks! Next post I’ll talk about all the build and analysis stuff!

Merging feature branches for QA with TFS (part 2)

In my previous post, we had a look at the problem: how to merge sibling branches with TFS while minimizing the number of conflicts. Yes, the number of conflicts you’ll get depends on the “technique” you use:

  • TFS baseless
  • Shelvesets
  • Other tools?

All your base are belong to TFS

But first, let’s answer the real question: what exactly should be the base of our 3-way merge?

Let me recap the scenario:

  • branch dev A from Main
  • Main evolves
  • branch dev B from Main
  • Main evolves
  • Both dev A and dev B have evolved as well
  • As a best practice, we integrate latest Main content into dev A and dev B
  • Now we want to test dev A and dev B with a single test campaign, we want to merge them all together, but leave Main intact
  • branch QA from Main, from the version that has been merged into dev A and dev B

     

    [diagram: the branching scenario with Main, dev A, dev B and the QA branch]

    The base we need is the latest version of Main that has been merged into dev A and dev B. You must have merged the same version of Main into both branches, of course, and the QA branch needs to be branched from that same version of Main as well. These conditions are common practice and should not be a problem.

    Here the base is not the most recent common ancestor (or it depends on what you call an ancestor). It is easy to understand: I want to merge “the diff between latest Main and dev B” into dev A. And dev A’s evolutions must be compared against that latest merged version of Main as well.

    External tools

    It is not possible to choose your base when you merge with TFS (by the way, I’d be curious to know which VCS lets you choose a custom base when merging).

    So let’s perform our merge “outside of TFS”. Is that bad? In the end, you won’t have any merge history between those branches, but do you really need it? What looks important to me is to keep the dev history on the dev branches for a little while, for reference, and to keep the QA branch’s future merge into Main easy.

    3-way folder merge procedure

    Use a local workspace that maps the QA branch. Also map dev A, dev B, and the Main branch in any workspace (the version you merged into dev A and dev B, if ever Main has evolved further).

    Merge dev A into QA with a baseless merge (easy when using VS 2012 and TFS 2012, remember the last post?). Take the Source version for every conflict (easy merge, isn’t it?); you can select all the conflicts and choose that option at once.
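    From the command line, this step could look like the following sketch (the branch paths are made up; /auto:TakeTheirs resolves every conflict by taking the source version):

    tf merge /baseless /recursive $/Project/DevA $/Project/QA
    tf resolve $/Project/QA /recursive /auto:TakeTheirs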

    Let’s now use KDiff3 and its wonderful 3-way folder merge feature:

    [screenshot: KDiff3 3-way folder merge configuration]

    KDiff3 is ugly, but it is the best merge tool I know at the moment. It is quite clever and has nice features:

    [screenshot: KDiff3 merge features]

    Note that you will also lose the renames during the process, which will break the history of the renamed files. You can perform the renames in the Source Control Explorer if you like (do this before resolving the merge, and rescan afterwards).

    When finished, the local workspace (new with TFS 2012) is your friend: it will detect what has been changed, added and deleted in the Pending Changes window:

    [screenshot: detected changes in the Pending Changes window]

    The final tradeoff is:

    • You get fewer conflicts than when using TFS (even with a baseless merge, as explained in the previous post)
    • You partially break the chain of history
      • In my eyes dev history is not very important (I’d be glad to debate this); I mean, not as important as maintenance history!

    If you have a large code base to merge, it should be worth it! Happy merging!

    Merging feature branches for QA with TFS (part 1)

    or: how to merge sibling branches with as few conflicts as possible

    In this post series I’ll propose a simple procedure to merge feature branches together without merging into Main. This is particularly useful when you want to test the features altogether (or by groups of features) without bringing all the content onto the Main branch yet. If you have a large code base, you also want to avoid conflicts as much as possible.

    [diagram: merging feature branches together for QA]

    Team Foundation Version Control (TFVC) is a great version control system as long as you respect the branching and merging guide produced by the ALM Rangers. You can handle pretty complex situations and have branching trees such as this:

    [diagram: a complex branching tree]

    © Visual Studio ALM Rangers

    All is well under control as long as you merge between branches that have a parent-child relationship. Should you need to merge between any pair of branches, you can always:

    • Use a TFS “baseless” merge
    • Use a Shelveset (code stored server-side in a separate storage, which you can unshelve onto anything)

    TFS baseless merges (really baseless?)

    To achieve a baseless merge you have to type a command like:

    tf merge /baseless $/Project/SourceBranch $/Project/DestBranch /recursive

    More on the merge command here. That will create a merge relationship between two branches with no obvious connection (read: parent-child). Once the merge relationship is established, subsequent merges are managed by the TFS UI (Visual Studio) and the history engine. Good.

    But still, this kind of merge is not very satisfactory, because TFS is not very good at picking the most recent common ancestor “in a clever way”. Don’t laugh too fast: Git is not very clever either at picking the best base.

    Here is the best scenario you can get with TFS when merging sibling branches: the base is the origin of the branch you are merging from. When merging with the command line, you should use the /version parameter to select only the wanted changesets; don’t take the changeset that created the branch, or the whole branch will be a conflict. Take only the changesets you are interested in, as explained by Buck Hodges here. In this precise case (schema below), I selected all the changesets but the one that created the branch dev B, in order to merge its contents into dev A.

    [diagram: merging the selected changesets of dev B into dev A]
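    For example, if changeset 100 is the one that created dev B, a command picking everything after it could look like this (made-up changeset numbers):

    tf merge /recursive /version:C101~T $/Project/DevB $/Project/DevA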

    I’m lying, the UI can do it!

    OK, sorry, but since this feature is new in TFS 2012, don’t blame me too much if I put a bit of TFS history in my article!

    So with TFS 2012, it is now very easy to perform baseless merges in the UI. You can manually edit the target branch in the merge wizard, then choose to pick the changesets you want:

    [screenshot: the merge wizard with a manually edited target branch]

    Then select the changesets you want to merge (without the branch creation):

    [screenshot: selecting the changesets to merge]

    Made with Brian Keller’s VS 2012 VM

    Using Shelvesets

    The trick is to use the TFS Power Tools to unshelve your content to a different branch. The command will “translate” server paths to another location (which you must have mapped in your workspace):

    tfpt unshelve /migrate /source:$/Project/SourceBranch /target:$/Project/TargetBranch

    Shelvesets are very handy, but don’t expect a top-notch merging experience with regard to the conflicts that are generated. Before the baseless merge improvements of TFS 2010 SP1, they were king for moving fixes from branch to branch. Now that baseless merges have improved, I’m not so sure; I go “baseless” more and more.

     

    In the next post I’ll talk about a solution. There is no magic: we’ll use external tools to achieve our goal. Till then, just think about it: what base is the best in such a case?

    Stay tuned!

    Visual Studio 2012 Express editions feature matrix

    Last September, Microsoft released Visual Studio Express 2012 for Windows Desktop (see various blog posts about it here and here), which allows many development scenarios and fills the gap left by the two other Express editions of VS 2012 out there, Visual Studio Express 2012 for Web and Visual Studio Express 2012 for Windows 8. Later on, at the //build/ 2012 conference, Visual Studio Express 2012 for Windows Phone was announced and launched (embedded in the Windows Phone 8 SDK).

    The official Visual Studio pages show a nice edition comparison for the Pro, Premium, Test and Ultimate editions. But what about the Express editions? What are their features, and how do they compare?

    Visual Studio 2012 Express editions comparison table

    The table below compares the four editions: Visual Studio 2012 Express for Web, Visual Studio 2012 Express for Windows Desktop, Visual Studio 2012 Express for Windows 8, and Visual Studio 2012 Express for Windows Phone.

    OS support
    • Web: Windows 7 SP1 (x86 and x64), Windows 8 (x86 and x64), Windows Server 2008 R2 SP1 (x64), Windows Server 2012 (x64)
    • Windows Desktop: Windows 7 SP1 (x86 and x64), Windows 8 (x86 and x64), Windows Server 2008 R2 SP1 (x64), Windows Server 2012 (x64)
    • Windows 8: Windows 8 only
    • Windows Phone: Windows 8 (x64) only

    Supported architectures
    • Web: 32-bit (x86) and 64-bit (x64)
    • Windows Desktop: 32-bit (x86) and 64-bit (x64)
    • Windows 8: 32-bit (x86) and 64-bit (x64)
    • Windows Phone: 64-bit (x64)

    Languages and tools
    • Web: C#, VB.NET, F# with extension package
    • Windows Desktop: C#, VB.NET, C++, XAML
    • Windows 8: JavaScript, C#, VB.NET, C++
    • Windows Phone: C#, VB.NET, XAML, C++

    Toolsets
    • Web: SQL Data Tools, Web Developer Tools
    • Windows Desktop: SQL Data Tools
    • Windows 8: Code Analysis, Spell Checker
    • Windows Phone: Code Analysis, Spell Checker, Windows Phone SDK 8.0, XNA Game Studio 4.0, Advertising SDK for Windows Phone

    Target environment restrictions
    • Web: .NET 4.0 and 4.5 (just install Framework 3.5 in order to target .NET 2.0, 3.0 and 3.5)
    • Windows Desktop: .NET 4.0 and 4.5 (just install Framework 3.5 in order to target .NET 2.0, 3.0 and 3.5)
    • Windows 8: Windows App Store only
    • Windows Phone: Windows Phone 7.1 and 8, emulator for Windows Phone 7.1 and 8 (with limitations)

    Project types
    • Web: class libraries, ASP.NET Web Forms, MVC 3 & 4, Dynamic Data, server controls & AJAX server controls and extenders, Silverlight applications, libraries & navigation applications, Test Project
    • Windows Desktop: C# & VB: Windows Forms Application, WPF Application, Console Application, Class Library, Unit Test Project; C++: Class Library, CLR Console Application, Managed Test Project, Native Unit Test Project, Win32 Console Application, Win32 Project
    • Windows 8: JavaScript app, Class Library (for Store), Windows Runtime Component (for Store), Test Library (for Store apps)
    • Windows Phone: C# & VB.NET: Windows Phone App, Databound App, Class Library, Panorama App, Pivot App, XAML and Direct3D App, XAML and XNA App, HTML5 App, Audio & Tasks Background Agents; XNA (C# & VB.NET): Windows Phone Game, Game Library, Content Pipeline & Projects; C++ (Windows Phone): Direct3D with XAML, Direct3D App (Native), Runtime Component, Empty Dll, Empty Static Library

    Additional elements / features
    • Web: all ASP.NET and Web files, Resources, Datasets, Web Services (client and server), Page Inspector, WCF Configuration Editor, T4 & custom templates
    • Windows Desktop: Framework components support (Windows Forms, WPF, User Controls (WPF and Forms), Resources, Datasets, Web Services (client and server)), Call Hierarchy, T4 & custom templates
    • Windows 8: all Windows Store files (XAML, js), Resources, XSLT, Windows Store management (STORE menu), Call Hierarchy, installed with Blend for Visual Studio 2012
    • Windows Phone: Store Test Kit, Device Window and Format menu, Device Simulation Dashboard (network, lock screen, reminders), Call Hierarchy, installed with Blend for Visual Studio 2012

    XML / XSLT features
    • All editions: XML editing with dynamic XSD validation and Intellisense

    Packaging / Publishing
    • Web: Publish Web Application Wizard, Web Deployment packaging capabilities, SQL deployment capabilities
    • Windows Desktop: Publish Wizard
    • Windows 8: Windows app packaging, upload to the Windows Store
    • Windows Phone: deploy to device, Store Test Kit (automated and manual tests to make sure your app meets the minimum requirements for the Store)

    Debugging
    • Web: debug ASP.NET, Edit & Continue, ASP.NET Development Server, local IIS, IIS Express
    • Windows Desktop: attaching to an external process is enabled
    • Windows 8: attaching to an external process is enabled; run apps on Local Machine, Remote Machine, Simulator
    • Windows Phone: attaching to an external process is enabled

    Common features and goodness (all editions)
    • Solutions can contain multiple projects (no previous Express editions constraints)
    • Object Explorer, Task Lists, Class View, Snippets Manager, Web Services (Add Service References)
    • NuGet (Library Manager) full support
    • Help Viewer with local or online content
    • Import / export Visual Studio settings

    Performance and code analysis features
    • Web: no
    • Windows Desktop: very basic, no Analyze menu, no rule set configuration
    • Windows 8: Code Analysis (very basic), Performance Analyzer (rich reporting and comparison, summary, call tree, modules, etc.)
    • Windows Phone: Code Analysis (very basic), Performance Analysis, Windows Phone App Analysis (monitoring & profiling)

    Databases
    • Web: SQL Server Data Tools, Database Explorer, Entity Framework support
    • Windows Desktop: SQL Server Data Tools, Database Explorer, Entity Framework support
    • Windows 8: no
    • Windows Phone: no

    UML features
    • All editions: none

    Optional features and capabilities
    • Web: support for the F# language by installing the F# Tools, Windows Azure development enabled by installing the Azure SDK
    • Windows Desktop: no
    • Windows 8: comes with Blend for Visual Studio 2012
    • Windows Phone: comes with Blend for Visual Studio 2012

    Extensibility and plugins
    • All editions: the Extension Manager is there, but there is no support for add-ins nor extensions

    TFS connectivity
    • All editions: yes, Team Explorer for TFS 2012 included

    Conditions
    • Web: requires online registration
    • Windows Desktop: requires online registration
    • Windows 8: requires online registration (works for Blend too), requires a Windows 8 developer license
    • Windows Phone: requires online registration, Windows Phone developer license for Store publishing

    Unit testing
    • Web, Windows Desktop and Windows 8: yes, full support, but limited to MSTest (you can’t install other framework adapters)
    • Windows Phone: no

    Main constraints summary
    • Web: no C++, no attach to process, no console or Windows native projects, no Forms nor XAML, no Call Hierarchy window
    • Windows 8: no T4 support
    • Windows Phone: no Windows Phone 8 emulator on systems other than Windows 8 x64 (at least Pro, with SLAT enabled; you can’t use VirtualBox nor VMware), no T4 support
     

    Disclaimer: I’ve been very careful filling in this table; it is the result of my very own experience with each edition. This comparison is not official and is subject to change with updates. Please report any inconsistency to me.

    Visual Studio 2012 makes life much easier for Code Analysis

    It may look like a small detail, but for me it makes a big change: you can now launch Code Analysis at solution level!

    [screenshot: the Analyze menu with the solution-level Code Analysis command]

    That means only *one* Code Analysis action before checking in, instead of having to remember which projects have been touched in your solution and launching the analysis separately for each project (as we did in VS 2010)!
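    As a side note, you can get the same effect from the command line by forcing Code Analysis on the whole build (MySolution.sln being your solution, of course):

    msbuild MySolution.sln /p:RunCodeAnalysis=true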

    I’ll take the opportunity to talk a bit about Code Analysis configuration.

    Per project rule sets

    First, let me remind you that rule sets are configurable in the project properties of each project, and can vary by configuration (Debug, Release, etc.).

    [screenshot: the Code Analysis tab in the project properties]

    I won’t advise here on how to organize *your* rules, whether it is best to have different rule sets for your projects or one rule set “to rule them all” (sorry, couldn’t help it). It just depends on what works best for you and your teams. Here’s just an example of what can be done:

    [screenshot: an example rule set organization]

    Sharing rule sets

    You can easily make the project point to a rule set file stored on a network share. This is something you really want if you have many projects and solutions in your company.

    Another great way to share rule sets is source control itself; the path to the rule set is stored in relative form in the project file:

    <CodeAnalysisRuleSet>..\..\Rules\MyCompanyCore.ruleset</CodeAnalysisRuleSet>

    If you have custom house rules, you can ship them along with your rule set files. You’ll have to edit the rule set file and add the following XML node:

    <RuleHintPaths>
      <Path>\\server\Rules\2012\MyCompany.Base.Rules.FxCop.dll</Path>
    </RuleHintPaths>

    Sharing the rules via source control (the rules live in the project structure) works great for isolated projects and distributed contexts. But if you have a big code base, you have to place your rule files somewhere near the top of your folder hierarchy, or add a mapping entry in all your workspaces. Moreover, it seems you may have trouble using custom rule dlls, because the RuleHintPaths are absolute and not relative.

    The network approach looks easier, especially with custom rules, but you may encounter nasty file load problems. I’m still trying to solve that kind of problem for one of my clients: some computers just do not manage to execute the rules (I’ll post here when I find the solution).

    Code Analysis for the build

    The build server will also run Code Analysis, so you have to make sure your rule sets are available to the build process (workspaces, network paths, etc.); generally, they will be. This is the easy part, and you have multiple options:

    [screenshot: the Perform Code Analysis build process parameter]

    • AsConfigured: will obey what you have set up in each project’s Code Analysis settings (see the Enable Code Analysis on Build option in the screenshot above)
    • Always: will force Code Analysis for every project, even if the aforementioned option is not checked
    • Never: will force CA not to run…

    It is simple and easy: there is no need to create a new project configuration named “Debug with CA”, check the “Enable Code Analysis on build” option in every project, and then configure the build to use this configuration. No, we don’t need to do that!

    I’d be curious to know how you share your custom rules in your company; feel free to drop a comment!

    Error during VS 2012 with update 1 install, just re-install update 1

    I’ve just installed Visual Studio 2012 with Update 1 (the ISO is available in the MSDN subscriber downloads) on a test TFS 2012 server.

    Visual Studio with update 1 install

    The install went well. I clicked “LAUNCH” and chose the C# development settings (first-launch-only dialog), but then an error occurred:

    devenv.exe crashed because of an error coming from Microsoft.VisualStudio.Progression.LanguageService.CSharp

    After relaunching, every time I right-clicked in my Solution Explorer, Visual Studio would crash again:

    Application: devenv.exe
    Framework Version: v4.0.30319
    Description: The process was terminated due to an unhandled exception.
    Exception Info: System.MissingFieldException
    Stack: etc.
       at Microsoft.VisualStudio.Progression.LanguageService.CSharp.CSLanSvcProvider.InitializeProvider

    I was not the only one, as you may find here and here, but re-installing only Update 1 did the trick. If it does not work for you, you may have to fully re-install Visual Studio.

    Migrating Coded UI Tests to VS 2012: small issue with project dependencies

    I’ve upgraded my Visual Studio 2010 Coded UI tests to Visual Studio 2012, and I faced a small issue. The migration considerations are documented in MSDN here. For me, it did not work out of the box: my tests were not found by the Test Explorer, and as soon as I could fix this, the tests would not run either, throwing strange exceptions I had never seen before. I’ll describe below how I finally fixed my Visual Studio solution. The problem is covered by this blog post from the ALM team, but I’m providing the symptoms and the resolution process in a detailed way. In case people face similar situations, I’m including the intermediate steps, but the real answer to my problem is in the middle of the post.

    How my solution is structured

    [screenshot: the solution structure, with the Coded UI test projects and a shared utility project]

    A smooth migration

    When opening the solution with VS 2012, the projects containing Coded UI tests are “repaired”: some magic is performed in the project files in order to keep compatibility with VS 2010 SP1. This cross-compatibility between VS 2010 SP1 and VS 2012 is quite cool; both versions can work concurrently on the same project, unusual but true!

    The “migration” log reports the following message:

    FrontSiteUiTests.csproj: Visual Studio has made non-functional changes to this project in order to enable the project to open in this version and Visual Studio 2010 SP1 without impacting project behavior.
     

    But where are my Coded UI Tests?

    In VS 2012, there is no Test View anymore, but a Test Explorer. The Test Explorer has a discovery process which asks you to compile the solution if no tests are found. Despite hammering CTRL+SHIFT+B on my keyboard, no test showed up in the box.

    [screenshot: the empty Test Explorer]

    What to do then? Well, there is a new Visual Studio output for the testing tools, where you’ll find the discovery error messages. Head to the Output window and, next to “Show output from:”, select “Tests”. The following error message was displayed multiple times:

    Error loading C:\Sources\Platform Tests\Company\CodedUITests\FrontSiteUiTests\bin\Debug\Company.Testing.Ui.FrontSiteUiTests.dll: Could not load file or assembly ‘Microsoft.VisualStudio.QualityTools.CodedUITestFramework, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a’ or one of its dependencies. The system cannot find the file specified.

    Accusing inheritance among Coded UI tests

    I first thought the error was coming from the structure of my Coded UI tests. I have a base class for all my Coded UI tests; this class is decorated with a CodedUITestAttribute, and moreover, there is an intermediate class in the inheritance tree. This one also has a CodedUITestAttribute on it.

    I’ve always thought this was a small trick, but VS 2010 seemed to support it, though I’m not sure about future versions… So I tried different combinations of attributes on the base and intermediate classes, varying from no attribute, to TestClass, to CodedUITest. I thought I was going in the right direction, since tests would now appear in the Test Explorer (restarting VS helped).

    [screenshot: the tests appearing in the Test Explorer]

    The details don’t really matter here, but I faced various exception messages while playing with the attributes; for reference’s sake I’ll post them here:

    At compile time, with no attribute on the base class (so TestClass or CodedUITest is at least required):

    UTA005: Illegal use of attributes on Company.Testing.Ui.Core.CdsUITestBase.MyTestInitialize.The TestInitializeAttribute can be defined only inside a class marked with the TestClass attribute.
    UTA006: Illegal use of attributes on Company.Testing.Ui.Core.CdsUITestBase.MyTestCleanup. The TestCleanupAttribute can be defined only inside a class marked with the TestClass attribute.

    At run time, using TestClass instead of CodedUITest:

    Initialization method Cdiscount.Testing.Ui.OrderProcess.Check_Customer.Customer_Login.CustomerLogin.MyTestInitialize threw exception. System.IO.FileNotFoundException: System.IO.FileNotFoundException: Could not load file or assembly ‘Microsoft.VisualStudio.TestTools.UITesting, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a’ or one of its dependencies. The system cannot find the file specified.WRN: Assembly binding logging is turned OFF.

    This message (similar to the one reported by the discovery process) pointed out the real problem: VS 2012 should use version 11.0.xxx references, not 10.0.xxx.

    The problem is actually pretty simple: the wizard upgraded my Coded UI test project, but not the project that contains the base and utility classes. There is no messing around with attributes involved.

    So all we need is to reproduce the “magic” on our referenced projects.

    Additionally, beware of external dependencies built directly upon version 10.0.xxx of the testing tools assembly set; you’ll have to rebuild them against v11.0.xxx.

    The problem was also reported and solved here.

    Upgrading (or repairing) projects manually

    Disclaimer: although I’m exposing here, as a convenience, the implementation of this “magic” performed by VS 2012 Update 1, I advise you to watch closely what is done to your own Coded UI test projects and use that as a base. Simply compare the changes before checking in your proj file. That said, the following should work for you.

    In the first PropertyGroup with most global properties:

    <VisualStudioVersion Condition="'$(VisualStudioVersion)' == ''">10.0</VisualStudioVersion>
    <VSToolsPath Condition="'$(VSToolsPath)' == ''">$(MSBuildExtensionsPath32)\Microsoft\VisualStudio\v$(VisualStudioVersion)</VSToolsPath>
    <ReferencePath>$(ProgramFiles)\Common Files\microsoft shared\VSTT\$(VisualStudioVersion)\UITestExtensionPackages</ReferencePath>
    <IsCodedUITest>True</IsCodedUITest>
    <FileUpgradeFlags>
    </FileUpgradeFlags>
    <OldToolsVersion>4.0</OldToolsVersion>
    <UpgradeBackupLocation />

    After the last ItemGroup and before the last Import:

    <Choose>
      <When Condition="'$(VisualStudioVersion)' == '10.0' And '$(IsCodedUITest)' == 'True'">
        <ItemGroup>
          <Reference Include="Microsoft.VisualStudio.QualityTools.CodedUITestFramework, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a, processorArchitecture=MSIL">
            <Private>False</Private>
          </Reference>
          <Reference Include="Microsoft.VisualStudio.TestTools.UITest.Common, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a, processorArchitecture=MSIL">
            <Private>False</Private>
          </Reference>
          <Reference Include="Microsoft.VisualStudio.TestTools.UITest.Extension, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a, processorArchitecture=MSIL">
            <Private>False</Private>
          </Reference>
          <Reference Include="Microsoft.VisualStudio.TestTools.UITest.Extension.Firefox, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a, processorArchitecture=MSIL">
            <Private>False</Private>
          </Reference>
          <Reference Include="Microsoft.VisualStudio.TestTools.UITest.Extension.Silverlight, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a, processorArchitecture=MSIL">
            <Private>False</Private>
          </Reference>
          <Reference Include="Microsoft.VisualStudio.TestTools.UITesting, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a, processorArchitecture=MSIL">
            <Private>False</Private>
          </Reference>
        </ItemGroup>
      </When>
    </Choose>
    <Import Project="$(VSToolsPath)\TeamTest\Microsoft.TestTools.targets" Condition="Exists('$(VSToolsPath)\TeamTest\Microsoft.TestTools.targets')" />

    Next, in the <Reference … /> nodes under <ItemGroup>, delete all the References to assemblies like Microsoft.VisualStudio.QualityTools.* and Microsoft.VisualStudio.TestTools.*, except for UnitTestFramework, so leave this entry:

    <Reference Include="Microsoft.VisualStudio.QualityTools.UnitTestFramework, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a, processorArchitecture=MSIL" />

    These changes actually make the ReferencePath depend on the version of Visual Studio. Cool.

    Now all the references, including those from the utility project, are mapped to the correct version, that is v11.0.0.0:

    [screenshot: the references resolved to version 11.0.0.0]

    So in the end everything’s fine: tests are properly discovered, and they run just as they did with VS 2010.

    Finalizing the migration

    There have been deep changes in the testing tools between VS 2010 and VS 2012; for example, .vsmdi files, which used to contain test lists, are no longer supported. This is all well documented in MSDN. This may be qualified as a regression in functionality, but personally I am not angry or even surprised: those lists needed to evolve, I’ve never been keen on them, and I’m quite happy the way it is today. Now the test tagging system is the only rightful way to categorize tests and define running lists, just as it has always been with other testing frameworks. Less ambiguity, better tools, more productivity.

    You’ll surely want to add multiple categories to single test methods; it is good practice:

    [TestMethod, TestCategory("Daily"), TestCategory("Nightly"), Priority(1), Description("Check the customer login")]
    public void CodedUITestMethod1()

    e.g. some database tests may be eligible for your nightly builds.
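    In a TFS 2012 build definition, you can then use a test case filter to run only the matching tests; with the categories above, the filter criteria could be:

    TestCategory=Nightly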

    [screenshot: tests grouped by category in the Test Explorer]

    Just in case

    As a final word, just in case you have some mess in your class attributes: TestClass is not suitable for Coded UI test classes, as you will get the following message at run time:

    Result Message:    Initialization method Cdiscount.Testing.Ui.OrderProcess.Check_Customer.Customer_Login.CustomerLogin.MyTestInitialize threw exception. Microsoft.VisualStudio.TestTools.UITest.Extension.TechnologyNotSupportedException: Microsoft.VisualStudio.TestTools.UITest.Extension.TechnologyNotSupportedException: The browser  is currently not supported..

    Merging Team projects: numbers and final thoughts

    I will now conclude and share my thoughts about this whole operation. But before that, let’s have a look at a few numbers.

    Numbers

    Volume

    • 13 branches moved (only the Main branches were moved), approx. 800 MB of source code in total
    • 250+ builds moved with their MSBuild scripts (they were legacy, highly customized builds from TFS 2008)
    • 134,000+ work items moved (I had announced 120,000, but the number grew faster than I had planned)
    • Average WI migration rate: 900 work items per hour, that is 5 full days, but a bit more in practice, because I split the workload into chunks
    • 162,000+ WI links restored, covering the 134,000+ work items
    • Average WI link restoration rate: 400 WIs processed per minute

     

    Investment

    • 28 days to plan, conceive, communicate, set up and execute the migration
    • 4 days of external help
    • a few meetings
    • 50 days of development in total to bring back the evolutions into the newly created dev branch (story here)

    Benefits

    • Increased productivity for everyone: having multiple team projects for the same final product was a bit confusing
      • Developers
      • Code integrators
    • The starting point for a unified documentation of all internal processes (simplified along the way), and the starting point of many other ALM improvements
    • Troop morale: this move was proof that managers cared about their dev infrastructure, and newcomers would not find a mess

    Conclusion

    Given the costs and benefits, I think my client made the right choice; I was convinced it was the road to follow. I admit I spent more time on it than initially planned, but people around me were really motivated to do this. It was the first time something big was done for the sake of industrialization without carefully calculated benefits (just like agile practices). This was also the starting point for other work streams about ALM improvements at my client’s.

    If I had had the chance to find the kind of articles I’ve just posted, things would have been a bit easier. I’ve given:

    • An overall approach and methodology for every aspect
    • A procedure and work-arounds for moving the branches
    • Tools and technical info for moving the Work Items
    • A sample tool for moving the build definitions
    • Concrete numbers to help planning

    I finally encourage readers who feel that their team projects are part of a single big product (from an external point of view) to merge them. It’s just like an agile practice: an investment to make things more cohesive and, with time, to avoid wasting time and money, because technical things should just reflect reality. I hope you’ll find these pieces of information useful; feel free to ask any questions, and good luck with your migrations!