Visual Studio 2012 makes life much easier for Code Analysis

It may look like it is a small detail, but for me it makes a big change: you can now launch Code Analysis at Solution level!


That means only *one* Code Analysis action before checking in, instead of having to remember which projects were touched in your solution and launching the analysis separately for each one (as we did in VS 2010)!

I’ll take the occasion to talk a bit about Code Analysis configuration.

Per project rule sets

First, let me remind you that rule sets are configurable in the properties of each project, and can vary by configuration (Debug, Release, etc.).


I won’t advise here on how to organize *your* rules, whether it is best to have different rule sets for your projects or one rule set “to rule them all” (sorry, couldn’t help it). It just depends on what works best for you and your teams. Here’s just an example of what can be done:


Sharing rule sets

You can easily make the project point to a rule set file stored on a network share. This is something you really want if you have many projects and solutions in your company.

Another great way to share rule sets is source control itself; the path to the rule set is stored in relative form in the project file:
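For example, the project file carries a CodeAnalysisRuleSet property per configuration; a sketch (the relative path below is illustrative, adapt it to your branch layout):

```xml
<!-- In the .csproj, inside a configuration-specific PropertyGroup -->
<PropertyGroup Condition="'$(Configuration)|$(Platform)' == 'Debug|AnyCPU'">
  <CodeAnalysisRuleSet>..\..\RuleSets\MyCompanyRules.ruleset</CodeAnalysisRuleSet>
</PropertyGroup>
```

Because the path is relative, anyone who gets the branch from source control gets the rules with it.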


If you have custom in-house rules, you can ship them along with your rule set files. You’ll have to edit the rule set file and add the following XML node:
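The RuleHintPaths node tells the analysis engine where to look for your custom rule assemblies; a minimal sketch (the share path is illustrative):

```xml
<!-- Inside the <RuleSet> root element of the .ruleset file -->
<RuleHintPaths>
  <Path>\\myserver\CodeAnalysis\CustomRules</Path>
</RuleHintPaths>
```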


Sharing the rules via source control (rules live in the project structure) works great for isolated projects and distributed contexts. But if you have a big code base, you have to place your rule files somewhere at the top of your folder hierarchy, or add a mapping entry in all your workspaces. Moreover, you may have trouble using custom rule DLLs because the RuleHintPaths are absolute, not relative.

The network approach looks easier, especially with custom rules, but you may encounter nasty file load problems. I’m still trying to solve that kind of problem for one of my clients: some computers just won’t manage to execute the rules (I’ll post here when I find the solution).

Code Analysis for the build

The build server will also run Code Analysis, so you have to make sure your rule sets are available to the build process (workspaces, network paths, etc.). Generally, they will be. This is the easy part; you have multiple options:


  • AsConfigured: obeys what you have set up in each project’s Code Analysis settings (see the Enable Code Analysis on build option in the screenshot above)
  • Always: forces Code Analysis for every project, even if the aforementioned option is not checked
  • Never: forces CA not to run…

It is simple and easy: there is no need to create a new project configuration named “Debug with CA”, check the “Enable Code Analysis on build” option in every project, then configure the build to use this configuration. No, we don’t need to do that!

I’d be curious to know how you share your custom rules in your company, feel free to drop a comment!

Error during VS 2012 with Update 1 install: just re-install Update 1

I’ve just installed Visual Studio 2012 with Update 1 (the ISO is available in MSDN subscriber downloads) on a test TFS 2012 server.

Visual Studio with update 1 install

The install went well. I clicked “LAUNCH” and chose the C# development settings (first-launch-only dialog), but then an error occurred:

devenv.exe crashed because of an error coming from Microsoft.VisualStudio.Progression.LanguageService.CSharp

After re-launching, every time I would right-click in my Solution Explorer, Visual Studio would crash again:

Application: devenv.exe
Framework Version: v4.0.30319
Description: The process was terminated due to an unhandled exception.
Exception Info: System.MissingFieldException
Stack: etc.
   at Microsoft.VisualStudio.Progression.LanguageService.CSharp.CSLanSvcProvider.InitializeProvider

I was not the only one, as you may find here, and here, but re-installing only Update 1 did the trick. If it does not for you, you may have to fully re-install Visual Studio.

MVP Status renewed for 2013

I’m glad to announce that I’ve been renewed as a Microsoft Most Valuable Professional in the Visual Studio ALM domain for 2013!

MVP Logo

Every year I’m challenged on my contributions to the Visual Studio ALM community, and my efforts with this blogging site (but not only) have been rewarded. Year 2012 was a year of transition for me, moving from Paris to Bordeaux, with personal events to manage (I know I’m no exception), but I feel more comfortable dedicating more time to my expertise this year.


The time for Champagne, foie gras (unavoidable when we celebrate the new year in France) and good resolutions is over. Let’s hit the road, I’ve got plenty of ideas and areas of work for 2013!

And by the way, happy new year to you, fellow reader!

Snagit + SkyDrive, just what I needed…

Noticed my cool screenshots and captures? OK, lame intro, but I’m so pleased with these tools…

I’m using Snagit for my screenshots and editing, and I set it up to save my files in my SkyDrive folder, so my work is backed up on the fly. I see only advantages, so I’m sharing this small combo.

As a technical blogger, the features I need

  • A handy screen-capturing interface, capturing pieces of the screen with pixel precision
  • The ability to easily add graphical objects on top of my shots: notes, squares, circles, arrows, text
  • The ability to modify those graphical objects without impacting the base image
  • Cool borders effects (torn, blended, etc.)
  • A lot of editing options for the graphical objects
  • A view of all my recent shots (a kind of explorer list)
  • The ability to name my screenshot with my own policy and save them all in sequence to a specified folder

I’ve been searching the internet for this, with the first three features in mind. There are not that many offerings. I tried the most popular free screen-capturing tools, but I was disappointed. I started to look at non-free tools, and Snagit just nailed it.


All the aforementioned features are fulfilled by Snagit. To my eyes, the strengths of Snagit are:

  • “.snag” files: images are saved in a proprietary format that allows quick editing and supports modifications without altering the original image; this is so important to me
  • Easy interface with many options
  • I can save my favorite arrow and text formatting parameters in the quick style bar; this is good for my blog identity and helps keep it homogeneous (and keeps me productive)
  • I almost never go through a Save File dialog: files are automatically named and saved in the right place

Custom profiles for different screen capturing contexts

To achieve the latter point, I set up a Snagit profile for my blog entries and tied it to a particular key combination (CTRL+SHIFT+S), so that the regular Print Screen key still triggers Snagit, but not in the context of my “blog” profile, thus not polluting my blog screen captures folder.


I’m sure I’m not using a quarter of all the features, I just use what I need and I’m very happy like this.

Cloud for easiness

Outputting files directly to a SkyDrive folder provides even more comfort. All my work in progress, my live writer drafts and their snag images are backed up on the fly.


I could use Dropbox, but I’m using SkyDrive more and more because of its great value: 7 GB free to start, and I bought an additional 20 GB for 8€ per year (the price of a basic lunch for me). Dropbox is cool and I’ve loved it for quite some time; I’ve earned up to 8.75 GB of free space just by convincing people to use it! Upgrading Dropbox adds 100 GB for $99 per year; there is no middle ground, too bad.

SkyDrive has been updated with cool features recently: it now has a proper trash bin, a selective file history (not for all file types, as strange as that sounds), and it is able to sync multiple folders on the same machine.

I was a Live Mesh lover, and SkyDrive is my new hero: cheap, and it works well. I’ll be fully satisfied with it when it has a file history for all file types, not just Office files.

Today, I’m using both: Dropbox for my personal files, and SkyDrive for all my professional needs and techie archives. Having my work backed up in real time has already saved my life a few times; since I’ve had bad experiences with my SSD, I’m becoming good at backing things up…

Migrating Coded UI Tests to VS 2012: small issue with project dependencies

I’ve upgraded my Visual Studio 2010 Coded UI tests to Visual Studio 2012, and I faced a small issue. The migration considerations are documented in MSDN here. For me, it did not work out of the box: my tests were not found by the Test Explorer, and once I fixed that, the tests would not run either, throwing strange exceptions I had never seen before. I’ll describe below how I finally fixed my Visual Studio solution. The problem is covered by this blog post by the ALM team, but I’m providing the symptoms and the resolution process in detail. In case people face similar situations, I’m including the intermediate steps, but the real answer to my problem is in the middle of the post (emphasized with a bold red font).

How my solution is structured


A smooth migration

When opening the solution with VS 2012, the projects containing Coded UI tests are “repaired”: some magic is performed in the project files in order to keep compatibility with VS 2010 SP1. This cross-compatibility between VS 2010 SP1 and VS 2012 is quite cool; both versions can concurrently develop on the same project, unusual but true!

The “migration” log reports the following message:

FrontSiteUiTests.csproj: Visual Studio has made non-functional changes to this project in order to enable the project to open in this version and Visual Studio 2010 SP1 without impacting project behavior.

But where are my Coded UI Tests?

In VS 2012, there is no longer a Test View but a Test Explorer. The Test Explorer has a discovery process which asks you to compile the solution if no tests are found. Despite hammering CTRL+SHIFT+B on my keyboard, no test showed up in the box.


What to do then? Well, there is a new Visual Studio output pane for the testing tools, where you’ll find discovery error messages. Head to the Output window and, next to “Show output from:”, select “Tests”. The following error message is displayed multiple times:

Error loading C:\Sources\Platform Tests\Company\CodedUITests\FrontSiteUiTests\bin\Debug\Company.Testing.Ui.FrontSiteUiTests.dll: Could not load file or assembly ‘Microsoft.VisualStudio.QualityTools.CodedUITestFramework, Version=, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a’ or one of its dependencies. The system cannot find the file specified.

Accusing inheritance among Coded UI tests

I first thought the error came from the structure of my Coded UI tests. I have a base class for all my Coded UI tests, decorated with a CodedUITestAttribute, and moreover, there is an intermediate class in the inheritance tree. This one also has a CodedUITestAttribute on it.

I’ve always thought this was a small trick, but VS 2010 seemed to support it, though I’m not sure about future versions… So I tried different combinations of attributes on the base and intermediate classes, varying from no attribute, to TestClass, to CodedUITest. I thought I was going in the right direction since tests would now appear in the Test Explorer (restarting VS helped).


The details don’t really matter here, but I faced various exception messages while playing with attributes; for reference’s sake I’ll post them here:

At compile time, with no attribute on the base class (so TestClass or CodedUITest is at least required):

UTA005: Illegal use of attributes on Company.Testing.Ui.Core.CdsUITestBase.MyTestInitialize. The TestInitializeAttribute can be defined only inside a class marked with the TestClass attribute.
UTA006: Illegal use of attributes on Company.Testing.Ui.Core.CdsUITestBase.MyTestCleanup. The TestCleanupAttribute can be defined only inside a class marked with the TestClass attribute.

At run time, using TestClass instead of CodedUITest:

Initialization method Cdiscount.Testing.Ui.OrderProcess.Check_Customer.Customer_Login.CustomerLogin.MyTestInitialize threw exception. System.IO.FileNotFoundException: System.IO.FileNotFoundException: Could not load file or assembly ‘Microsoft.VisualStudio.TestTools.UITesting, Version=, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a’ or one of its dependencies. The system cannot find the file specified.
WRN: Assembly binding logging is turned OFF.

This message (similar to the one reported by the discovery process) pointed out the real problem: VS 2012 should use version 11.0.0.0 references, not 10.0.0.0.

The problem is actually pretty simple: the wizard upgraded my Coded UI test project but not the project that contains the base and utility classes. There was no messing around with attributes involved.

So all we need is to reproduce the “magic” on our referenced projects.

Additionally, beware of external dependencies built directly upon version 10.0.0.0 of the testing tools assembly set; you’ll have to rebuild them upon version 11.0.0.0.

The problem was also reported and solved here.

Upgrading (or repairing) projects manually

Disclaimer: although I’m exposing here, as a convenience, the implementation of this “magic” performed by VS 2012 Update 1, I advise you to watch closely what is done to your own Coded UI test projects as a baseline. Simply compare the changes before checking in your proj file. That said, the following should work for you.

In the first PropertyGroup, with the most global properties:

<VisualStudioVersion Condition="'$(VisualStudioVersion)' == ''">10.0</VisualStudioVersion>
<VSToolsPath Condition="'$(VSToolsPath)' == ''">$(MSBuildExtensionsPath32)\Microsoft\VisualStudio\v$(VisualStudioVersion)</VSToolsPath>
<ReferencePath>$(ProgramFiles)\Common Files\microsoft shared\VSTT\$(VisualStudioVersion)\UITestExtensionPackages</ReferencePath>
<UpgradeBackupLocation />

After the last ItemGroup and before the last Import:

<Choose>
  <When Condition="'$(VisualStudioVersion)' == '10.0' And '$(IsCodedUITest)' == 'True'">
    <ItemGroup>
      <Reference Include="Microsoft.VisualStudio.QualityTools.CodedUITestFramework, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a, processorArchitecture=MSIL" />
      <Reference Include="Microsoft.VisualStudio.TestTools.UITest.Common, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a, processorArchitecture=MSIL" />
      <Reference Include="Microsoft.VisualStudio.TestTools.UITest.Extension, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a, processorArchitecture=MSIL" />
      <Reference Include="Microsoft.VisualStudio.TestTools.UITest.Extension.Firefox, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a, processorArchitecture=MSIL" />
      <Reference Include="Microsoft.VisualStudio.TestTools.UITest.Extension.Silverlight, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a, processorArchitecture=MSIL" />
      <Reference Include="Microsoft.VisualStudio.TestTools.UITesting, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a, processorArchitecture=MSIL" />
    </ItemGroup>
  </When>
</Choose>
<Import Project="$(VSToolsPath)\TeamTest\Microsoft.TestTools.targets" Condition="Exists('$(VSToolsPath)\TeamTest\Microsoft.TestTools.targets')" />

Next, in the <ItemGroup> that contains the <Reference> nodes, delete all references to assemblies like Microsoft.VisualStudio.QualityTools.* and Microsoft.VisualStudio.TestTools.*, except for UnitTestFramework, so leave this entry:

<Reference Include="Microsoft.VisualStudio.QualityTools.UnitTestFramework, Version=, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a, processorArchitecture=MSIL" />

These changes effectively make the ReferencePath depend on the version of Visual Studio. Cool.

Now all the references, including those from the utility project, are mapped to the correct version, that is v11.0.0.0:


So in the end everything’s fine: tests are properly discovered and run just as they did with VS 2010.

Finalizing the migration

There have been deep changes in the testing tools between VS 2010 and VS 2012; for example, .vsmdi files, which used to contain test lists, are no longer supported. This is all well documented in MSDN. This may qualify as a functional regression, but personally I don’t feel angry or even surprised: those lists needed to evolve, I’ve never been keen on them, and I’m quite happy with the way it is today. Now the test tagging system is the only rightful way to categorize tests and define run lists, just as it has always been with other testing frameworks. Less ambiguity, better tools, more productivity.

You’ll surely want to add multiple categories to single test methods; it is good practice:

[TestMethod, TestCategory("Daily"), TestCategory("Nightly"), Priority(1), Description("Check the customer login")]
public void CodedUITestMethod1()

e.g. some database tests may be eligible for your nightly builds.
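Categories then become run filters; for instance, from a VS 2012 developer command prompt, vstest.console can select tests by category (the assembly name below is illustrative, and the /TestCaseFilter switch assumes a recent update of the test tools):

```
vstest.console.exe Company.Testing.Ui.FrontSiteUiTests.dll /TestCaseFilter:"TestCategory=Nightly"
```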


Just in case

As a final word, just in case you have some mess in your class attributes: the TestClass attribute is not suitable for Coded UI test classes, as you will get the following message at run time:

Result Message:    Initialization method Cdiscount.Testing.Ui.OrderProcess.Check_Customer.Customer_Login.CustomerLogin.MyTestInitialize threw exception. Microsoft.VisualStudio.TestTools.UITest.Extension.TechnologyNotSupportedException: Microsoft.VisualStudio.TestTools.UITest.Extension.TechnologyNotSupportedException: The browser  is currently not supported..

Announcing ReflectWILinks, a tool to restore TFS Work Item links after a TFS to TFS migration

Hi, a few months ago I blogged about my experience with a complex TFS to TFS migration (at least complex IMHO). I promised I would publish the tool to restore links, so here it is. It’s been a long time, sorry for that…

You won’t need it every day because generally you can grab all your work items in a single query, which I of course recommend. But just in case you can’t, or if you want to migrate multiple team projects into a single one and, let’s say, there are many links between all the work items (cross-project links), then… this is for you.


Merging Team projects: numbers and final thoughts

I will now conclude and share my thoughts about this whole operation. But before that let’s have a look at a few numbers.



  • 13 branches moved (only the Main branches were moved), approx. 800 MB of source code in total
  • 250+ builds moved with their MSBuild scripts (they were legacy, highly customized builds from TFS 2008)
  • 134,000+ work items moved (I had announced 120,000, but the number grew faster than I had planned)
  • Average WI migration rate: 900 work items per hour, that is 5 full days, but a bit more in practice, because I split the workload into chunks
  • 162,000+ WI links restored, covering the 134,000+ work items
  • Average WI link restoration rate: 400 WIs processed per minute



  • 28 days to plan, conceive, communicate, set up and execute the migration
  • 4 days of external help
  • a few meetings
  • 50 days of development in total to bring the evolutions back into the newly created dev branch (story here)


  • Increased productivity for everyone, since having multiple team projects for the same final product was a bit confusing
    • Developers
    • Code integrators
  • The starting point for a unified documentation of all internal processes (simplified along the way), and the starting point of many other ALM improvements
  • Troop morale: this move was proof that managers cared about their dev infrastructure, and newcomers would not find a mess


Given the costs and benefits, I think my client made the right choice. I was convinced it was the road to follow. I admit I spent more time on it than initially planned, but people around me were really motivated to do this; it was the first time something big was done for the sake of industrialization without carefully calculated benefits (just like agile processes). This was also the starting point for other work streams about ALM improvements at my client’s.

If I had been able to find the kind of articles I’ve just posted, things would have been a bit easier. I’ve given:

  • An overall approach and methodology for every aspect
  • A procedure and work-arounds for moving the branches
  • Tools and technical info for moving the Work Items
  • A sample tool for moving the build definitions
  • Concrete numbers to help planning

Finally, I encourage interested readers to merge their team projects if they feel the projects are part of a single big product (from an external point of view). It’s just like an agile practice: an investment to make things more cohesive and, over time, avoid wasting time and money, because technical things should simply reflect reality. I hope you’ll find these pieces of information useful, feel free to ask any questions, and good luck with your migrations!

Batch copying build definitions in TFS 2010

[get the source code of this sample utility]

This post is part of my “merging team projects” series but can be read independently. I’ll explain and publish a code template that will help you batch copy build definitions from one team project to another. This is all possible thanks to the TFS API.

More than just raw copying

I’m not the first to blog about this, and you can find various small pieces of code here, or here. Why am I bothering then? Because what I intend to do is more than just raw copying: we also need to transpose workspaces, change build template locations, edit build process parameters, edit build templates in source control and check them in…


  • Regex selection of build names on the command line
    • I recommend using a batch file with all your filters, as there can be many
  • 3 log files in append mode (supports batch files with many calls)
  • Workspace transformations (customizable in the code)
    • Easy to report paths that are non-standard according to your own rules
  • Build process parameter transformations (customizable in the code)
  • Copy, transform and check in files in source control (customizable in the code)


  • Single TFS Server (cannot migrate to another server)
  • Need to customize the C# code to get exactly what you want


My starting point was Jim Lamb’s piece of code.

You’ll need to reference a few classic TFS assemblies, including a private one: Microsoft.TeamFoundation.Build.Workflow.dll; you’ll find it in C:\Program Files\Microsoft Visual Studio 10.0\Common7\IDE\PrivateAssemblies. Note that this reference will force you to target the full .NET Framework 4.0 instead of the Client Profile, but that doesn’t matter much for this kind of utility.


I’ve also included an easy-to-use class for processing command line parameters, C#/.NET Command Line Arguments Parser; many thanks to GriffonRL.

I wrote a small utility class to deal with workspaces, you’ll find it in this project.

Code Highlights

A few pieces of code that can be of interest:

// clone the build into the target team project
IBuildDefinition newDefinition = Utilities.CloneBuildDefinition(_bs, buildDef, targetTeamProject);

As mentioned earlier, this basically calls Jim Lamb’s piece of code in order to get the build object properly duplicated.

// accessing the build process parameters stored in TFS in a serialized format
IDictionary<String, Object> processParameters = WorkflowHelpers.DeserializeProcessParameters(buildDef.ProcessParameters);
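Symmetrically, once you have edited the dictionary, you can serialize it back before saving the cloned definition. A hedged sketch (the parameter key below is illustrative and may not exist in your build template):

```csharp
// edit a build process parameter (key name is illustrative)
processParameters["CleanWorkspace"] = "All";

// serialize the parameters back into the definition and persist it
newDefinition.ProcessParameters = WorkflowHelpers.SerializeProcessParameters(processParameters);
newDefinition.Save();
```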

Continue reading

[TFS API] A tiny class to help with TFS Workspace creation and cleanup

I made a small helper class to make workspace programming with the TFS API a bit more straightforward. When you want to manipulate files in TFS version control, you *have* to use a workspace; nothing can exist or be touched outside of a workspace. Well, not exactly: you can always use the DownloadFile method, but that gives you one-file-at-a-time, read-only access. Actually, I almost never need this; I keep using workspaces. I have to admit I get bored with it, because using workspaces means declarations, mappings and local file manipulations: there is no way to work directly on the server. That is why having a few helpers comes in handy.

So this is a temporary workspace creation and auto-cleanup class. The cleanup occurs in the Dispose method so that you can use it with the using keyword without having to worry about it.

The typical use case I’m aiming at is:

  • Connect to TFS
  • Create the temp workspace inside a using statement
  • Map a few folders (a Map method is provided)
  • Do some operations, because you get access to the real Workspace object
  • Don’t bother with the cleanup: it is done when you leave the using block

Enough talk:

using System;
using System.IO;
using System.Linq;
using Microsoft.TeamFoundation.VersionControl.Client;

namespace WorkspaceDemo
{
    /// <summary>
    /// Tiny helper class for using temporary workspaces.
    /// </summary>
    public class TfsTempWorkspace : IDisposable
    {
        private Workspace _workspace;
        private VersionControlServer _vcs;

        /// <summary>
        /// Creates a workspace. If a workspace with the same name exists, it will be deleted and recreated!
        /// </summary>
        public TfsTempWorkspace(VersionControlServer vcs, string workspaceName, string userName)
        {
            _vcs = vcs;
            _workspace = _vcs.QueryWorkspaces(workspaceName, userName, Environment.MachineName).FirstOrDefault();

            if (_workspace != null)
            {
                _workspace.Delete();
            }

            _workspace = _vcs.CreateWorkspace(workspaceName);
        }

        /// <summary>
        /// Gives access to the properties of the underlying workspace
        /// </summary>
        public Workspace Workspace
        {
            get { return _workspace; }
        }

        /// <summary>
        /// Adds a mapping to the workspace.
        /// The local folders and files will be created under the user TEMP directory.
        /// </summary>
        /// <param name="serverPath">Full path on server, starting from root, ex: $/MyTP/MyFolder</param>
        /// <param name="localRelativePath">A relative path inside the local workspace folder structure</param>
        /// <returns>The local full path</returns>
        public string Map(string serverPath, string localRelativePath)
        {
            string localPath = Path.Combine(Path.GetTempPath(), _workspace.Name, localRelativePath);
            _workspace.Map(serverPath, localPath);
            return localPath;
        }

        /// <summary>
        /// Deletes the local folder structure of the workspace, if it exists
        /// </summary>
        public void CleanDirectory()
        {
            string rootPath = Path.Combine(Path.GetTempPath(), _workspace.Name);
            if (Directory.Exists(rootPath))
            {
                DeleteDirectory(rootPath);
            }
        }

        /// <summary>
        /// All local files are deleted, and the workspace is then removed from the server
        /// </summary>
        public void Dispose()
        {
            CleanDirectory();
            _workspace.Delete();
        }

        // Deletes a directory recursively, clearing read-only attributes first
        // (files fetched from TFS version control are read-only by default)
        public static void DeleteDirectory(string targetDir)
        {
            string[] files = Directory.GetFiles(targetDir);
            string[] dirs = Directory.GetDirectories(targetDir);

            foreach (string file in files)
            {
                File.SetAttributes(file, FileAttributes.Normal);
                File.Delete(file);
            }

            foreach (string dir in dirs)
            {
                DeleteDirectory(dir);
            }

            Directory.Delete(targetDir, false);
        }
    }
}

And a working example:
Continue reading
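In the meantime, here is a minimal usage sketch of the helper along the lines described above (the collection URL, workspace name and server path are illustrative, not the full working example):

```csharp
using System;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.VersionControl.Client;

// Connect to TFS (URL is illustrative)
var tpc = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(
    new Uri("http://tfsserver:8080/tfs/DefaultCollection"));
var vcs = tpc.GetService<VersionControlServer>();

using (var tempWorkspace = new TfsTempWorkspace(vcs, "TempWs_Demo", vcs.AuthorizedUser))
{
    // Map a server folder into the temp local area
    string localPath = tempWorkspace.Map("$/MyTP/MyFolder", "MyFolder");

    // Operate through the real Workspace object: get files, edit, check in…
    tempWorkspace.Workspace.Get();
}
// Leaving the using block deletes the local files and the server workspace
```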

Merging team projects: Restoring the Work Item links

This was the biggest issue of the migration: I knew how to migrate the work items, but I also knew that links between work items would only be *partially* migrated. I’ve explained the problem in this post. The only solution I could find was to write a tool myself to recreate the work item links in the destination team project.

What links exactly are migrated by default?


Those links are URLs, they are static and they are migrated by default.


Changeset links are implemented in TFS as “external links”; there is not much to worry about them as long as you migrate onto the same TFS server, which was my case.

Related links

These are links between work items themselves. Here is a schema that shows the partial link migration:


The TFS Integration Platform will only migrate links that are “contained” in the source work item query: for each link, both work items involved need to be included in the query.

Migration by range of work items

Because it was not possible in my case to perform the migration in one single query, I adopted a *chunk* strategy based on time or, more exactly, on IDs, which amounts to the same thing.

The source query was like: get all work items from all the source team projects where the ID ranges from XXX to YYY.
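Expressed as a flat WIQL query, a chunk looked roughly like this (the team project names and ID bounds are illustrative):

```sql
SELECT [System.Id]
FROM WorkItems
WHERE [System.TeamProject] IN ('SourceTP1', 'SourceTP2')
  AND [System.Id] >= 100000
  AND [System.Id] <= 110000
ORDER BY [System.Id]
```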

An important thing is that the TFS Integration tools are idempotent: you may launch the same migration twice; it will not duplicate what has already been migrated, but it detects and takes anything that is new. It just works, cool!

Time considerations

There were more than 120,000 work items to migrate, and I needed to plan the migration timing right; that is why starting the real migration several weeks before the deadline had great value:

  • I got real performance feedback that I could use to calculate the global migration time needed
  • I was sure about what I was getting, limiting the risks as time passed, because the job was getting done from week to week

A tool to restore links after the migration

What I like very much about TFS is its API. You can do magic with it. Yes, sometimes I complain that a particular feature is missing from the base product, only to realize that it can easily be done using the API. It’s just .NET programming, and with a few lines of code you can access work items, source control, builds, and more. Though the community around this API is not very big, and even if the support forums are great, I find it lacks documentation; this is where you’ll be happy to find cool guys like Shai Raiten and his TFS API blog post series, a great place to get started.
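To give a taste of what the link restoration boils down to with that API, here is a hedged sketch (the collection URL and work item IDs are illustrative; the real tool does far more bookkeeping, filtering and error handling than this):

```csharp
using System;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.WorkItemTracking.Client;

// Connect to the destination collection (URL is illustrative)
var tpc = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(
    new Uri("http://tfsserver:8080/tfs/DefaultCollection"));
var store = tpc.GetService<WorkItemStore>();

// Recreate a "Related" link between two migrated work items (IDs are illustrative)
WorkItemLinkTypeEnd relatedEnd = store.WorkItemLinkTypes.LinkTypeEnds["Related"];
WorkItem source = store.GetWorkItem(4212);
source.Links.Add(new RelatedLink(relatedEnd, 4213));
source.Save();
```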

Continue reading