Thursday, January 3, 2013

Continuous Integration, how do I declare commonly derived MSBuild metadata?

So, following some recent posts on MSBuild-NUnit integration, I went back and reviewed some of the implementation and wondered if there were easier ways to reference commonly derived metadata.

The motivation for this post stems from issues I encountered with long paths and paths that contain spaces; e.g. "C:\Users\johnny g\Documents\Visual Studio 2012\Projects\UnitTests\Example.Continuous\..\Example.UnitTests\bin\debug". Both present particular logistical problems when commands require multiple paths as inputs.

So, what follows are a number of refinements I picked up in my exploration to address some of these issues.

Custom ItemGroup versus ProjectReference


In previous examples, I prescribed custom ItemGroups, like TestAssembly below

<ItemGroup>
  <TestAssembly Include="Example.UnitTests.dll">
    <WorkingDirectory>
      $(MSBuildProjectDirectory)\..\Example.UnitTests\$(OutputPath)
    </WorkingDirectory>
  </TestAssembly>
</ItemGroup>

I preferred this approach since it would neither collide with nor compromise any other build metadata. However, it does produce some visual artifacts in Visual Studio, and it requires a manual edit of the project file to modify.

Figure 1, custom TestAssembly item appears as missing dll.

A better solution, which would resolve these issues and facilitate adding test projects, is to leverage the well-known ProjectReference Item.

Adding ProjectReference Items is as simple as using the "Add Reference" feature in Visual Studio. Adding custom metadata to these Items, however, requires a little finesse - and that is what follows below!

Extending ItemGroup metadata


MSBuild permits the extension of existing metadata through the use of ItemDefinitionGroups. This will work for custom Items, such as TestAssembly, or well-known Items, like ProjectReference.

When we add a project reference via Visual Studio, we end up with a ProjectReference Item

<ItemGroup>
  <ProjectReference Include="..\Example.UnitTests\Example.UnitTests.csproj">
    <Project>{eaac5f22-bfb8-4df7-a711-126907831a0f}</Project>
    <Name>Example.UnitTests</Name>
  </ProjectReference>
</ItemGroup>

To add custom metadata, we simply define an ItemDefinitionGroup.

For example, let's define both a WorkingDirectory, and an OutputFile for our test libraries,

<ItemDefinitionGroup>
  <ProjectReference>
    <WorkingDirectory>%(RootDir)%(Directory)$(OutputPath)</WorkingDirectory>
    <OutputFile>%(Filename).dll</OutputFile>
  </ProjectReference>
</ItemDefinitionGroup>

Essentially, for each ProjectReference Item, MSBuild will add custom metadata WorkingDirectory, composed of well-known per-Item metadata RootDir, Directory, and project defined OutputPath. Similarly, OutputFile is a simple concatenation of per-Item metadata Filename and literal ".dll".
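
For example, assuming our build project lives at "C:\Projects\Example.Continuous" (a stand-in path) and OutputPath evaluates to "bin\Debug\", the ProjectReference above yields

  WorkingDirectory = C:\Projects\Example.UnitTests\bin\Debug\
  OutputFile       = Example.UnitTests.dll

Note that %(RootDir)%(Directory) is computed from the Item's full path, so the relative "..\" segment is normalized away for free.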

For a full list of well-known Item metadata, take a peek at this MSDN reference page.

From relative to absolute paths


When a project exists in a sibling directory we will often end up with relative paths which, when squeezing the most out of a command line, chew up valuable character space. We can mitigate this by converting unevaluated relative paths to absolute paths.

MSBuild exposes a task for path conversion (ConvertToAbsolutePath), but we may also inline static method calls; e.g. $([System.IO.Path]::GetFullPath('C:\Temp\..\Some.Other.Path')). For our purposes, we'll use the latter since we have only a handful of paths to define and the syntax is easy enough to manage.
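
For reference, a sketch of the task-based alternative; note that, unlike a property function, a task must be invoked from within a Target (the target name here is arbitrary),

<Target Name="ResolvePaths">
  <ConvertToAbsolutePath Paths="$(MSBuildProjectDirectory)\..\Resources\">
    <Output TaskParameter="AbsolutePaths" PropertyName="ResourcesFolder" />
  </ConvertToAbsolutePath>
</Target>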

Really, we only require a WorkingDirectory, a resources folder, and an artifact folder. We have already defined an absolute path WorkingDirectory in our ItemDefinitionGroup, so that leaves resources and artifacts.

Since our tools and output folders are not likely to vary from Item to Item, let's define some project properties to capture these paths.

<PropertyGroup>
  <TestResultsFolder>
    $([System.IO.Path]::GetFullPath($(MSBuildProjectDirectory)\..\TestResults\))
  </TestResultsFolder>
  <ResourcesFolder>
    $([System.IO.Path]::GetFullPath($(MSBuildProjectDirectory)\..\Resources\))
  </ResourcesFolder>
</PropertyGroup>

For example, these will evaluate from "C:\Projects\Example.Continuous\..\Resources\" to "C:\Projects\Resources\". With these roots defined, we can then define other properties and metadata like

  <PropertyGroup>
    <NUnitExe>$(ResourcesFolder)NUnit-2.6.0.12051\nunit-console.exe</NUnitExe>
    <NUnitExe-x86>$(ResourcesFolder)NUnit-2.6.0.12051\nunit-console-x86.exe</NUnitExe-x86>
    <NUnitTestResultsFolder>$(TestResultsFolder)NUnit\</NUnitTestResultsFolder>
  </PropertyGroup>

  <ItemDefinitionGroup>
    <ProjectReference>
      <NUnitTestResultsFile>$(NUnitTestResultsFolder)%(OutputFile).xml</NUnitTestResultsFile>
    </ProjectReference>
  </ItemDefinitionGroup>

Escaping paths with spaces


As cited earlier, paths with spaces in them may present particular problems depending on the form of our Exec task. For instance, consider

paths
  • $(NUnitExe), C:\Users\johnny g\Documents\Visual Studio 2012\Projects\UnitTests\Resources\nunit-console.exe
  • %(ProjectReference.OutputFile), Example.UnitTests.dll
  • %(ProjectReference.NUnitTestResultsFile), C:\Users\johnny g\Documents\Visual Studio 2012\Projects\UnitTests\TestResults\NUnit\Example.UnitTests.dll.xml
  • %(ProjectReference.WorkingDirectory), "C:\Users\johnny g\Documents\Visual Studio 2012\Projects\UnitTests\Example.UnitTests\bin\debug"
 and Exec task

<Exec
  Command="$(NUnitExe) %(ProjectReference.OutputFile) /nologo
    /result=%(ProjectReference.NUnitTestResultsFile)"
  WorkingDirectory="%(ProjectReference.WorkingDirectory)" />

this task will fail for two related reasons. First, WorkingDirectory is escaped with quotation marks; this poses a problem for the Exec task's WorkingDirectory parameter, which expects a simple unenclosed path. Second, nunit-console.exe will fail because its result specification is not escaped with quotation marks; it will misinterpret any spaces in the path as new parameter specifications and blow up.

Since paths may be used in any number of ways during a build (for instance arbitrary path concatenation, or various embedded parameters, or contexts that require or prohibit quotation marks), I have settled on simply defining raw paths in my properties and metadata, and explicitly escaping them where utilised.

To resolve our Exec task above, let's consider revising path
  •  %(ProjectReference.WorkingDirectory), C:\Users\johnny g\Documents\Visual Studio 2012\Projects\UnitTests\Example.UnitTests\bin\debug
 and our task

<Exec
   Command="$(NUnitExe) %(ProjectReference.OutputFile) /nologo
     /result=&quot;%(ProjectReference.NUnitTestResultsFile)&quot;"
   WorkingDirectory="%(ProjectReference.WorkingDirectory)" />

The only real takeaway is that we must XML-encode our quotation marks in the project file, so "" becomes &quot;&quot;.
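
As a quick sanity check, a Message task placed just before the Exec will print the fully evaluated command, so we can eyeball the quoting before NUnit ever runs (Importance simply ensures the text appears at default verbosity),

<Message
  Importance="high"
  Text="$(NUnitExe) %(ProjectReference.OutputFile) /result=&quot;%(ProjectReference.NUnitTestResultsFile)&quot;" />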

Conclusion


Well, that about wraps up some small MSBuild tidbits. These little morsels should enable us to write reliable build tasks in a clear and concise manner.

Resources

StackOverflow, how to evaluate absolute paths
MSDN MSBuild Well-known Item Metadata
MSDN ConvertToAbsolutePath task





Thursday, December 27, 2012

Continuous Integration, how do I generate NUnit output files?

A quick addendum to my NUnit integration post. To support result file generation (for build server integration), we make a few simple modifications to our MSBuild project file.

Let's assume a pared-down version of our previous example; we have

  1. Example.Continuous, our build project
  2. Example.UnitTests, a project containing tests
  3. NUnit in a Resources folder

Example.Continuous declares build target AdditionalTasks, and the following property and item groups.

  
    "$(MSBuildProjectDirectory)\..\Resources\NUnit-2.6.0.12051\nunit-console.exe"
  
  
    
      $(MSBuildProjectDirectory)\..\Example.UnitTests\$(OutputPath)
    
  
  
    
    
    
    
      
    
    
    
  
What we want to do is leverage NUnit's result switch and specify output files for each test run. To do so, we will introduce both a new property for the results path and additional item metadata to specify output files. We are being very explicit here to better support spaces in folder paths (e.g. default project folders are located under the "Visual Studio 2012" folder).

This is our new project file after our modifications!

  
  
    "$(MSBuildProjectDirectory)\..\Resources\NUnit-2.6.0.12051\nunit-console.exe"
    $(MSBuildProjectDirectory)\..\TestResults\NUnit\
  
  
    
      $(MSBuildProjectDirectory)\..\Example.UnitTests\$(OutputPath)
   "$(TestResultsFolder)Example.UnitTests.dll.xml"
    
  
  
    
    
    
    
      
    
    
    
  
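With that in place, and assuming AdditionalTasks is part of the project's DefaultTargets (as in the original post), a plain local invocation

msbuild Example.Continuous.csproj

should run the tests and drop a result file at TestResults\NUnit\Example.UnitTests.dll.xml for the build server to pick up.
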

Thursday, May 17, 2012

Continuous Integration, how do I integrate post build tasks?

Everyone these days understands the benefits of automated continuous integration builds. The ability to continually build a code base and execute various automated tasks to evaluate the integrity of each iteration is invaluable. Yet there is still some confusion as to how exactly to implement this, and it is something we are constantly re-inventing.

Case in point, today I find myself researching unit test integration into our automated build solution.

My friend Kent Boogaart posted a great article about continuous integration. His solution is to integrate these tasks into the solution, and have MSBuild execute them. This has a number of benefits, including portability (I have used CruiseControl.Net previously, and Jenkins currently) and local execution. My primary role is developer, so I am rather partial to this last point; if we are responsible for failures in the build process, it is paramount that we are able to reproduce the process locally.

Now, Kent's article provides a great overview of the process, but I always find myself rooting around for resources to help implement an MSBuild integration (and re-educating myself on Task declaration). So this article will detail some of these specifics.

Overview


As an overview then, this article will take a solution that has some passing tests and some failing tests. By the conclusion, this solution will build and execute unit tests.

Setup


For this article, I have created a simple solution;
  1. Create a solution, Example.UnitTests,
  2. Create a Class Library, call it Example.UnitTests,
  3. Create a Class Library, call it Example.IntegrationTests,
  4. Create a folder, Resources, under solution directory
With Example.UnitTests, add a code file Tests.cs

using NUnit.Framework;
namespace Example.UnitTests
{
    [TestFixture]
    public class Tests
    {
        // will always pass, should have at least one of these!
        [Test]
        public void Test_Pass() { }
    }
}

With Example.IntegrationTests, add similarly named code file Tests.cs

using NUnit.Framework;
namespace Example.IntegrationTests
{
    [TestFixture]
    public class Tests
    {
        // will always fail, there is always at least one of these!
        [Test]
        public void Test_Fail()
        {
            Assert.Fail();
        }
    }
}

Example.IntegrationTests should only build in Release mode.

Figure 1, Example.IntegrationTests will not build under Debug mode, Release mode only

This may be accomplished by opening ConfigurationManager, selecting Debug solution configuration, and deselecting the Build checkbox next to Example.IntegrationTests project (see Figure 1 above).

With the Resources folder, copy in a working set of the NUnit console runner. We will use this to execute our unit tests.
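
For orientation, the solution folder should now look something like this (a sketch, file set abbreviated),

Example.UnitTests\              <-- solution folder
  Example.UnitTests\            <-- unit test project
  Example.IntegrationTests\     <-- integration test project
  Resources\
    nunit-console.exe
    nunit.framework.dll
    ...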

Strategy


With this solution, it should be plain to see we have a set of tests we would like to execute in Debug mode only (think quick in-proc tests that verify behaviour at a very fine and granular level for a QA environment) and a set of tests we would like to execute in Release mode only (think long-running system-integration tests that verify behaviour at a coarse use-case level for a UAT or Pre-Prod environment).

Ideally, we want to be in the practice of executing unit tests as often as possible. However, forcing a developer to run these tests on every build is less than ideal or may even be prohibitively expensive.

Our approach then will be to create additional build profiles that target our standard Debug and Release modes, and execute our test suites selectively. Developers will be held to an honour system of running tests prior to major commits, but our continuous integration environment will always run these tests.

When we are through, our build environment will be able to
  • Build Debug and execute unit tests,
  • Build Release and execute unit and integration tests,
Of course, these build profiles will also be available to our developers, so that they may verify the integrity of their commit, or at the very least reproduce functional-related test failures in their own environment post build-fail.


Adding Build Profiles


Build profiles are tricky things. Adding and maintaining profiles can be cumbersome and error prone (Visual Studio does not auto-magically add custom profiles to new projects that we add; it is a manual step). Fortunately, we do not need existing or future libraries to implement our custom build configuration.

For our purposes an empty light-weight configuration is all that we need. To do so,

  1. Open ConfigurationManager,
  2. From Active solution configuration: dropdown, select New...,
  3. Enter a configuration name, one that starts with "Debug". For this example, I have chosen DebugContinuous,
  4. From Copy settings from: dropdown, select Empty,
  5. Uncheck Create new project configurations if it is not already in an unchecked state

Figure 2, a minimal Debug continuous integration build configuration

Now may be a good time to create the Release profile as well. Same steps, simply create a profile with a name that starts with "Release" - if you're stuck, try ReleaseContinuous!

This naming requirement may seem odd, but we will be depending on MSBuild's ability to match similar profiles by name to target the correct mode for our solution. Basically, when our build environment invokes DebugContinuous, any project that implements this mode exactly will build in DebugContinuous (more on this in a bit), and projects that do not will build in the mode that most closely resembles it (i.e. all of our existing projects). For our QA builds, this means Debug mode. When a suitable match cannot be found, MSBuild defaults to Release - so it is not the end of the world.
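
For the curious, this mapping is recorded in the solution file itself; a sketch of what ConfigurationManager writes for one project (GUID abbreviated) looks like,

GlobalSection(ProjectConfigurationPlatforms) = postSolution
  {GUID}.DebugContinuous|Any CPU.ActiveCfg = Debug|Any CPU
  {GUID}.DebugContinuous|Any CPU.Build.0 = Debug|Any CPU
EndGlobalSection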

Adding Continuous Integration Project


Now that we have a (solution-wide) build profile for our continuous integration environment, we need a place to throw in our unit test task. We could simply use any existing project, but for very large solutions it makes better sense to consolidate our optional continuous integration tasks into a single place, separate from our test code, so let's add a new project.

Add a new Class Library Project, Example.ContinuousIntegration. Delete the default Class1.cs file.

All other projects are fine as they are, implementing only Debug and Release build modes. This one specific project, however, will contain conditional elements that require the continuous build mode. So let's add our continuous build modes to Example.ContinuousIntegration. It is very similar to adding a solution-wide profile,
  1. Open ConfigurationManager,
  2. From Configuration dropdown beside our project, select New...,
  3. Enter your debug continuous integration build mode name. As with the rest of this example, I have used DebugContinuous,
  4. From Copy settings from: dropdown, select Empty,
  5. Uncheck Create new solution configurations if it is not already in an unchecked state
Figure 3, a minimal Debug continuous integration build configuration

Once we have defined our build modes, it pays to review each build configuration. Back in ConfigurationManager, iterate through each solution configuration. Ensure that when Debug is active, Example.ContinuousIntegration does not build and is set to the Debug configuration. Ensure that when DebugContinuous is active, all projects are in Debug mode and Example.ContinuousIntegration is set to build with DebugContinuous. Do likewise for the Release modes.

One final check before we modify our project file directly.

  1. Open Project Dependencies,
  2. From Projects: dropdown, select Example.ContinuousIntegration,
  3. Check all project boxes,

Figure 4, add dependencies to continuous integration project to ensure a convenient build order


This creates an artificial dependency between our integration project and every other project in the solution. This ensures it builds last. While not strictly necessary, it makes it easier to debug build issues that we may encounter later on.

Now let's crack this sucker open. To edit this project through Visual Studio, first unload the project, and then edit it. Alternatively, use an external program (like Notepad.exe) to modify Example.ContinuousIntegration.csproj; when Visual Studio regains focus, it will detect modifications and prompt to reload.

When you first open it up, it should look something like this,


  
    
<Project ToolsVersion="4.0" DefaultTargets="Build"
    xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <Configuration Condition=" '$(Configuration)' == '' ">Debug</Configuration>
    ...
  </PropertyGroup>
  <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Debug|AnyCPU' ">
    ...
  </PropertyGroup>
  <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">
    ...
  </PropertyGroup>
  <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'DebugContinuous|AnyCPU' ">
    <OutputPath>bin\DebugContinuous\</OutputPath>
    ...
  </PropertyGroup>
  <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'ReleaseContinuous|AnyCPU' ">
    <OutputPath>bin\ReleaseContinuous\</OutputPath>
    ...
  </PropertyGroup>
  <ItemGroup>
    ...
  </ItemGroup>
  <Import Project="$(MSBuildToolsPath)\Microsoft.CSharp.targets" />
</Project>

What we will do next is add another build target that will contain our unit test execution task. To tidy up the declaration, we will also define some build properties and metadata. Our project file should now look a little something like this,

  
<Project ToolsVersion="4.0" DefaultTargets="Build;AdditionalTasks"
    xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'DebugContinuous|AnyCPU' ">
    <OutputPath>bin\Debug\</OutputPath>
  </PropertyGroup>
  <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'ReleaseContinuous|AnyCPU' ">
    <OutputPath>bin\Release\</OutputPath>
  </PropertyGroup>
  ...
  <PropertyGroup>
    <NUnitExe>"$(MSBuildProjectDirectory)\..\Resources\nunit-console.exe"</NUnitExe>
  </PropertyGroup>
  <ItemGroup>
    <TestAssembly Include="Example.UnitTests.dll">
      <WorkingDirectory>
        $(MSBuildProjectDirectory)\..\Example.UnitTests\$(OutputPath)
      </WorkingDirectory>
    </TestAssembly>
    <TestAssembly
        Include="Example.IntegrationTests.dll"
        Condition=" '$(Configuration)' == 'ReleaseContinuous' ">
      <WorkingDirectory>
        $(MSBuildProjectDirectory)\..\Example.IntegrationTests\$(OutputPath)
      </WorkingDirectory>
    </TestAssembly>
  </ItemGroup>
  <Target Name="AdditionalTasks">
    <Exec
      Command="$(NUnitExe) %(TestAssembly.Identity) /nologo"
      WorkingDirectory="%(TestAssembly.WorkingDirectory)" />
  </Target>
</Project>

A few things to note,
  1. Inclusion of AdditionalTasks as part of DefaultTargets attribute,
  2. Modification of OutputPath property, from default bin\DebugContinuous to bin\Debug, and
  3. Conditional inclusion of Example.IntegrationTests for ReleaseContinuous only
The rest of it is fairly straightforward. And I'm knackered.
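
For completeness, reproducing the continuous build locally is a one-liner (assuming a .NET 4.0 msbuild.exe on the path, and the solution name from this example),

msbuild Example.UnitTests.sln /p:Configuration=DebugContinuous

which builds everything in Debug mode and then executes the unit test suite via AdditionalTasks.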


Resources

MSDN MSBuild Reference
MSDN MSBuild Exec Task Reference
MSDN MSBuild Reserved Property Reference
Kent Boogaart's Blog, fail early with full builds
Peter Provost's Blog, custom metadata
Kevin Dente's Blog, run all tests, fail on at least one error

Monday, April 25, 2011

Oi, oi, oi ...

Been a while. A few months back I started working on some material for some logging-based posts, but got side-tracked by work and other stuff too. Still on the radar, but I am also ramping up on some Silverlight - especially the design aspects!

While searching for Silverlight related tutorials and sites, I found this gem: .toolbox. It's a clever site with loads of content that I am slowly working through. As a dev, the Expression Blend tutorials are a good way to get familiar with the product. I am especially interested in using Blend to prototype faster so that I may elicit user feedback sooner. I am also enjoying the design sessions, and recommend these to any developer interested in User Interface design work.



Wednesday, October 13, 2010

I <3 palindromes ...

w00t! Stack Overflow!


Okay, so maybe not so unique, but pretty cool nonetheless!

Monday, September 27, 2010

Unit testing, how do I mock a signature with ref or out?

So, recently I had the distinct pleasure of mocking a service with method members that contained ref parameters. I thought this rather odd (I have never been a fan of 'in-place' modifications or returns), but not particularly special. That is, not until I came to unit test consumers of the service.

// immutable third-party service interface
public interface ISubmitMessage
{
    // ugly method
    void Submit (string username, string password, ref byte[] message);
}

// our poor poor consumer that has to interface
// with big bad mean service above
public class SubmitAdapter
{
    public void Process (ISubmitMessage service)
    {
         // 1. set username and password from configuration
         string username = string.Empty;
         string password = string.Empty;

         // 2. generate byte-encoding of a string message
         byte[] messageBytes = null;

         // 3. invoke service
         service.Submit (username, password, ref messageBytes);

         // 4. inspect messageBytes return value for
         // success\failure
    }
}

Depending on our mocking solution, ref and out method parameters may or may not be supported. From personal experience, many mocking frameworks do not support them. My current mock solution of preference is Moq, and with Moq v4.0.10827 (beta) we cannot verify pass-in and pass-out parameters out-of-the-box.

[Test]
public void Test_Process ()
{
    Mock<ISubmitMessage> service = 
        new Mock<ISubmitMessage> (MockBehavior.Strict);

    string username = "some.username";
    string password = "some.password";
    byte[] messageIn = null;
    byte[] messageOut = new byte[] { 1 };

    // ideally, we would like something similar to this
    service.
        Setup (
        s => s.Submit (
        username,
        password,

        // where we inspect on way in, and define a return value
        ref It.IsRef<byte[]> (d => Equals (d, null)).Returns (messageOut)));

    SubmitAdapter adapter = new SubmitAdapter ();
    adapter.Process (service.Object);

    service.VerifyAll ();
}

Unfortunately, this feature set does not quite exist yet - though as we shall soon see, we do not necessarily need direct support.
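
For the record, and hedged against my memory of the Moq of the day: Moq will accept a ref argument in a Setup, but the setup matches only when the caller passes that exact same instance, and it cannot assign a new value on the way out - not much use for our adapter above.

byte[] message = null;
// matches only if the caller passes this very instance by ref;
// the mock cannot swap in a new value on return
service.Setup (s => s.Submit (username, password, ref message));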

The key to mocking ref and out method parameters is understanding our requirements as a consumer and our requirements as a producer - and while disjoint, they can be satisfied via transform. By example, our consumer expects a functional implementation of a specific interface,

public class SubmitMessageStub : ISubmitMessage
{
    public void Submit (string username, string password, ref byte[] message)
    {
        // ??? 
    }
}

our test method expects a contract we can mock,

// a mock-friendly "normalized" version of original interface
public interface ISubmitMessageMock
{
    // similar signature to source method ISubmitMessage.Submit,
    // note new response object,
    SubmitResponse Submit (string username, string password, byte[] message);
}

// new response class, a super-set of data expected to be
// returned. declared beside the interface, since C# interfaces
// cannot declare nested types
public class SubmitResponse
{
    // contains return message value
    public byte[] Message { get; set; }
}

Jumping back to our consumer, it is a relatively trivial matter to transform from original to mock interface

public class SubmitMessageStub : ISubmitMessage
{

    // simple constructor that accepts a mock-friendly implementation
    private readonly ISubmitMessageMock _mock = null;
    public SubmitMessageStub (ISubmitMessageMock mock)
    {
        _mock = mock;
    }

    public void Submit (string username, string password, ref byte[] message)
    {
        // NOTE: do NOT add any additional parameter checking
        // or verification. our responsibility is to perform a
        // transform and delegate, let expectation testing occur
        // in caller,

        // 1. transform to mock
        SubmitResponse response = _mock.
            Submit (username, password, message);
        // 2. reverse-transform
        message = response.Message;
    }
}

Our test setup now looks like

[TestMethod]
public void Test_Process()
{
    Mock<ISubmitMessageMock> serviceMock =
        new Mock<ISubmitMessageMock> (MockBehavior.Strict);

    // proper setup, you can do this with Moq!
    string username = "some.username";
    string password = "some.password";
    byte[] message = null;

    // we control out parameters here through internal
    // response object
    SubmitResponse serviceMockResponse =
        new SubmitResponse 
        {
            Message = new byte[] { 1, }, 
        };

    serviceMock.

        // we control in parameters here through standard
        // Moq inspection
        Setup (s => s.Submit (username, password, message)).
        Returns (serviceMockResponse);

    SubmitAdapter adapter = new SubmitAdapter ();

    // simply wrap mock-friendly version with our stub!
    adapter.Process (new SubmitMessageStub (serviceMock.Object));

    // and finally verify service interactions!
    serviceMock.VerifyAll ();
}

Wednesday, July 14, 2010

Unit testing, how do I unit test a Singleton?

Well, as with our WebService client sample, this question is a misnomer. More often than not, we are more interested in unit testing consumers of a Singleton than in unit testing the Singleton's functional set - and depending on implementation, this may not be as straightforward as we think.

Consider

// a fairly typical Data Access Layer implementation (DAL). have actually
// encountered this on-site [shudders].
public static class DatabaseConnectionFactory
{
    public static IDbConnection GetDatabaseConnection ()
    {
        // 1. get connection string from config
        string connectionString = null;

        // 2. create connection
        SqlConnection connection = null;
        connection = new SqlConnection (connectionString);
        connection.Open ();

        // 3. some other custom stuff
        // 4. return connection
        return connection;
    }

    public static IDbCommand GetDatabaseCommand (
        string commandString,
        IDbConnection connection)
    {
        // 1. set custom stuff like transaction and timeout
        // 2. return command
        return new SqlCommand (commandString, connection);
    }
}

// embedded DAL logic in business tier - anyone else vomit in their 
// mouth just a little? - however what is especially offensive is the
// direct calls to data tier via static class DatabaseConnectionFactory
public class AppointmentBusinessObject
{
    public const string CommandString_LoadById_OneParameter = 
@"SELECT * 
FROM Appointments 
WHERE AppointmentId = {0}";

    public void LoadById (long appointmentId)
    {
        // 1. create Sql command string,
        string commandString = 
            string.Format (
            CommandString_LoadById_OneParameter,
            appointmentId);

        IDbConnection connection = null;
        try
        {
            // 2. get connection,
            connection = 
                DatabaseConnectionFactory.
                GetDatabaseConnection ();

            // 3. get command,
            IDbCommand command = 
                DatabaseConnectionFactory.
                GetDatabaseCommand (commandString, connection);

            // 4. execute command
            IDataReader reader = command.ExecuteReader ();

            // 5. read contents
        }
        finally
        {
            // 6. close connection,
            if (connection != null) 
            {
                connection.Close ();
            }
        }

    }

}

Now, there are many reasons why this is a poor design, not the least of which is that we are locked into a single connection and a single implementation. We are forced to duplicate this source if any of these parameters need to change - and believe me this has happened!

If this were not reason enough to contemplate a complete revision, our ability to unit test is also impaired. Specifically, we cannot unit test or verify LoadById without hitting a datastore! In fact, we cannot unit test any class or method that invokes LoadById without hitting a datastore.

So what can we do?

Well the first thing to do is identify the actual problem - and here it appears to be a tight coupling between our business and data tiers. If we were to abstract DatabaseConnectionFactory then the result would be a loosely-coupled and more flexible system. To this end, I would suggest introducing an interface (that defines DatabaseConnectionFactory's existing members) and consuming this wherever possible.

For example,

// a simple data store interface, exposes full functional set
// of existing DatabaseConnectionFactory
public interface IDatabaseConnectionFactory
{
    IDbConnection GetDatabaseConnection ();
    IDbCommand GetDatabaseCommand (
        string commandString, 
        IDbConnection connection);
}

// we want this to be as low-impact as possible, so new
// interface defines members that already exist. sadly, 
// static classes cannot implement interfaces directly, 
// and so DatabaseConnectionFactory must be made an instance 
// class. while this requires a little deft maneuvering, 
// this may be accomplished with *zero* impact to existing 
// consumers! yay!
//
// 1. remove static key word from class declaration, this makes
// class instance-based
public class DatabaseConnectionFactory : IDatabaseConnectionFactory
{

    // 2. remove static key word from members, and implement the
    // interface *explicitly*. this keeps existing method
    // implementations instance-based, and side-steps a compile-time
    // collision with the same-signature static members below
    IDbConnection IDatabaseConnectionFactory.GetDatabaseConnection () { ... }
    IDbCommand IDatabaseConnectionFactory.GetDatabaseCommand (
        string commandString,
        IDbConnection connection) { ... }

    // 3. define *new* static members that delegate to
    // instance-based members, this ensures that existing 
    // consumers do not break
    public static IDbConnection GetDatabaseConnection ()
    {
        // instantiate new factory every call for clarity,
        // potential optimization in declaring a static
        // lazy-loaded instance member, and delegating
        // to that instead
        IDatabaseConnectionFactory factory = 
            new DatabaseConnectionFactory ();
        return factory.GetDatabaseConnection ();
    }
    public static IDbCommand GetDatabaseCommand (
        string commandString,
        IDbConnection connection) 
    {
        IDatabaseConnectionFactory factory = 
            new DatabaseConnectionFactory ();
        return factory.GetDatabaseCommand (commandString, connection);
    }
}

// embedded DAL logic in business tier - vomiting just a little less -
// less offensive now that we reference an implementation-independent
// datastore definition
public class AppointmentBusinessObject
{
    // still tightly coupled to Sql-compliant datastore
    // but manageable.
    public const string CommandString_LoadById_OneParameter = 
@"SELECT * 
FROM Appointments 
WHERE AppointmentId = {0}";

    // 1. declare new factory member. this is our *dependency*
    // it *must* be fulfilled for class to operate successfully
    private readonly IDatabaseConnectionFactory _factory = null;

    // 2. expose new parameter constructor, permitting consumers
    // of *this* class to specify an appropriate datastore
    public AppointmentBusinessObject (IDatabaseConnectionFactory factory)
    {
        _factory = factory;
    }

    // 3. expose new parameterless constructor, this preserves
    // existing consumers who may not be "up to speed" regarding
    // this new-fangled connection specification. we also preserve
    // previous operating expectations by defaulting to ... 
    // "default" connection factory, so at worst, we deliver
    // *EXACTLY* same behaviour as before
    public AppointmentBusinessObject ()
        : this (new DatabaseConnectionFactory ())
    {
    }

    // 4. consume!
    public void LoadById (long appointmentId)
    {
        // ...
        try
        {
            IDbConnection connection = 
                _factory.GetDatabaseConnection ();
            IDbCommand command = 
                _factory.GetDatabaseCommand (commandString, connection);
            // ...
        }
        finally
        {
            // ...
        }
    }

}

So, where is the payoff exactly? Well, for one, if we now wish to change datastore implementation (say to MySql, Oracle, or some other Sql-compliant datastore) we simply implement a new class and may swap between the two when desired. We also gain the ability to load objects from two or more datastores at the same time!
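
For example (implementation names hypothetical), an alternative datastore is just another implementation of the same interface,

// hypothetical second implementation
public class OracleConnectionFactory : IDatabaseConnectionFactory
{
    public IDbConnection GetDatabaseConnection () { ... }
    public IDbCommand GetDatabaseCommand (
        string commandString,
        IDbConnection connection) { ... }
}

// consumers opt in at construction time
AppointmentBusinessObject appointment =
    new AppointmentBusinessObject (new OracleConnectionFactory ());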

As a pleasant side-effect, we are also able to unit test LoadById directly, and any consumers that permit datastore specification.

// test business logic without hitting datastore! yay!
[TestMethod]
public void Test_LoadById ()
{
    // instantiate mock with expectations (via Moq, my mock
    // solution of preference)
    Mock<IDatabaseConnectionFactory> mockFactory = 
        new Mock<IDatabaseConnectionFactory> (MockBehavior.Strict);
    // ... set up GetDatabaseConnection and GetDatabaseCommand
    // expectations here

    AppointmentBusinessObject appointment = 
        new AppointmentBusinessObject (mockFactory.Object);
    appointment.LoadById (1024);

    // verify results
    mockFactory.VerifyAll ();
}