  What I learned from PowerShell Toolmaking part 2: Metrics automation with Azure DevOps

    In case you missed it, here is the first article of this series

    What I learned from PowerShell Toolmaking pt1 – Gathering Metrics

    Let’s talk about automation

    We left off with a handy set of raw metrics that could be leveraged to build helpful dashboards (still love ‘em!).

    The focus of this article is to demonstrate that good software engineering practices can also be applied to PowerShell via Continuous Integration (CI) and Continuous Deployment (CD) mechanisms using Azure DevOps (formerly known as VSTS).

    The PowerShell module demonstrated within this article aims at automating DevOps tasks for the Valo Intranet-In-A-Box product: fall in love with your intranet

    Here are the steps we want to achieve:

    1. Fetch the sources from the repository
    2. Run quality tools
    3. Produce lovely dashboards
    4. Publish/deploy the module

    As a reminder, here is the module’s architecture. This time, though, we will turn our attention to the utility scripts used to achieve steps 2 and 4 respectively.

    Utility Scripts

    Workflow hypothesis

    Before going further, let’s lay down some hypotheses.

    • New code delivered by a programmer will come from an isolated feature branch
    • A pull request (PR) will be made to a target branch (in our case it will be the “develop” branch)
    • Code is merged into the target branch only after being reviewed by peers

    Putting in place a build definition

    First, let’s create a blank build definition using the new Azure DevOps user interface.

    Builds Pipelines

    Then, let’s select the type of repo and the targeted branch that will be used.

    Repository selection

    For the sake of this article we will start from an empty template.

    Using An Empty Template

    Adding tasks to the blank definition

    Here is the build definition that we are aiming at. We can see that our utility scripts will be used at the beginning and at the end of the build process. The Pester task can be obtained from the Visual Studio Marketplace, while the publishing tasks are provided out of the box (OOTB). Before moving forward, keep in mind that it is a safe practice to save (without necessarily queuing) your build definition between each step.

    Targeted Pipeline Definition

    Installing Required PS Modules

    Add a PowerShell task to your build definition.

    1. Give a meaningful name to your task (optional)
    2. Set the Type to File Path to tell Azure DevOps that a script file will be run
    3. Type in the script’s path within the codebase

    Install Required PS Modules

    The script being run is quite simple. It first installs the NuGet package provider on the currently running agent to ensure we have access to our required modules.

    In the case of my module, I need PSScriptAnalyzer for code linting and AzureRm.

    # This script installs dependencies required by the Toolkit. 
    # It is intended to be used by the Azure DevOps Agent and 
    # is the reason why Module installations are scoped to current User.
    Install-PackageProvider -Name NuGet -Force -Scope CurrentUser
    Install-Module -Name PSScriptAnalyzer -Force -Scope CurrentUser
    Install-Module -Name AzureRm -Force -Scope CurrentUser -AllowClobber
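
    Although the build definition shown here only includes the Pester and publishing tasks, a code linting step with PSScriptAnalyzer can be wired up the same way (another PowerShell task pointing at a small script). Here is a minimal sketch of what such a script might look like; the Sources path and the severity levels are assumptions to adjust to your own layout:

    # Hypothetical lint step: analyze the sources and fail the build on any finding.
    $findings = Invoke-ScriptAnalyzer -Path ".\Sources" -Recurse -Severity Warning, Error
    if ($findings) {
        $findings | Format-Table RuleName, Severity, ScriptName, Line -AutoSize
        throw "PSScriptAnalyzer reported $($findings.Count) issue(s); failing the build."
    }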
    

    Pester Test Runner

    The Pester Test Runner task, made by Black Marble, can be obtained for free from the Visual Studio Marketplace.

    1. Give a meaningful name to your task (optional)
    2. Specify the folder containing your Tests. I’m lazy and I just point at the root of the sources folder
    3. Specify the test results output file path
    4. Specify the code coverage output file path

    In the advanced section, enforce version 4.0.8 and do not forget to check the “Force the use of a Pester module shipped within this task” option, or the version won’t really be enforced… It cost me many precious build minutes to figure that one out 😉 .

    Pester Test Runner
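
    If you want to dry-run this step on your own machine before spending build minutes, a rough local equivalent with Pester 4.x could look like the sketch below. The output file names and the coverage file list are assumptions; align them with whatever you configure in the task.

    # Hypothetical local equivalent of the Pester task (Pester 4.x).
    # Output file names are illustrative; match them to the task's settings.
    $scripts = (Get-ChildItem -Path ".\Sources" -Recurse -Filter "*.ps1").FullName
    Invoke-Pester -Script ".\Sources" `
        -OutputFormat NUnitXml -OutputFile ".\Test-Pester.xml" `
        -CodeCoverage $scripts `
        -CodeCoverageOutputFile ".\Coverage-Pester.xml" `
        -CodeCoverageOutputFileFormat JaCoCo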

    Publish Tests Results

    This task already comes pre-configured for most use cases and will oversee producing the test results dashboard, which is our first metric.

    1. Give a meaningful name to your task (optional)
    2. The test result format should be NUnit
    3. Default values can be left as-is if you’ve followed this blog post so far; otherwise, refer to your Pester task configuration for your needs.

    Publish Test Results

    Publish Code Coverage Results

    This task also already comes pre-configured for most use cases and it will be in charge of creating our second dashboard, hype!

    1. Give your task a meaningful name (optional)
    2. Select JaCoCo as a code coverage tool
    3. Refer to your Pester task configuration for the proper file name. I recommend using pipeline (VSTS) variables to reference the file path.

    Publish Code Coverage

    Publish to PowerShell Gallery

    Finally, the last step of our build process is to deploy the module, either to a private feed that can be consumed internally by your organization or, as in this case, publicly to share neat functionality with the community.

    1. Give your task a meaningful name (repetition is key here)
    2. Type in the path to the PowerShell script
    3. The script will use an argument which can be defined as a private variable of your CI pipeline. Don’t worry, I’ll cover this part too.

    Publish To Powershell Gallery

    Here is the content of the script used to publish to the PowerShell Gallery.

    [CmdletBinding()]
    Param (
        [ValidateScript({-not([string]::IsNullOrEmpty($_))})]
        [Parameter(Mandatory=$true)]
        [string]$NuGetApiKey
    )
    
    $moduleName = "Nexus.Valo.Modern"
    Push-Location $PSScriptRoot
    
    try {
        Write-Output "Creating module staging folder"
        $module = New-Item -Name $moduleName -Type Directory
    
        Write-Verbose "Copying module scripts to staging folder"
        Get-ChildItem -Recurse -Path ".\Sources" | ForEach-Object {
            Write-Verbose "Copying $($_.FullName) to $($module.FullName)"
            Copy-Item $_.FullName -Destination $module.FullName -Force
        }
    
        Write-Output "Publishing module to PowerShell Gallery"
        Publish-Module -Path $module.FullName -NuGetApiKey $NuGetApiKey
    
    } finally {
        if(Test-Path $moduleName) {
            Write-Warning "Removing module staging folder"
            Remove-Item $moduleName -Force -Recurse
        }
    
        Pop-Location
    }

    And now the only missing piece is the pipeline’s private variable that will hold the NuGet API key, which can be obtained from the PowerShell Gallery right under your profile.

    Api Keys

    From the pipeline menu, press Variables and make sure that Pipeline variables is selected. Then press Add, give the variable the same name as the Publish-NexusValo.ps1 script argument, and do not forget to check the little padlock at the end so that your API key stays secret!

    Pipeline Variable
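
    With the variable in place, the publishing task’s Arguments field simply passes it to the script using the $(…) macro syntax. A minimal sketch, assuming you named the variable NuGetApiKey to match the script parameter:

    # Arguments field of the "Publish to PowerShell Gallery" task.
    # The $(NuGetApiKey) macro is replaced by the secret pipeline variable at queue time.
    -NuGetApiKey $(NuGetApiKey)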

    We are officially done with the plumbing; those still with us are officially DevOps warriors!

    And now… dashboards!

    It is time to see those metrics presented in a user-friendly way that can give real insights into our codebase.

    Now fire away by pressing the Queue button, then wait for an agent to become available and for your build to complete.

    Once the build is complete, we can toggle between tests and code coverage, download the produced artifact, and even take a peek at the agent’s log. Azure DevOps is a really nice tool for keeping an eye on your codebase’s evolution.

    Ci Pipeline Summary

    Ci Pipeline Tests

    And on PowerShell Gallery:

    Published Module On PowerShell Gallery
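
    Once the module is listed on the Gallery, anyone can pull it down with a single command; a quick sketch (scope and version pinning are up to you):

    # Install the published module straight from the PowerShell Gallery.
    Install-Module -Name Nexus.Valo.Modern -Scope CurrentUser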

    Conclusion

    There you have it friends, the completed loop. In a world of ever-increasing complexity, having good software engineering practices such as systematic test writing for new lines of code, code analyzers, and continuous integration and deployment is a must. It saves organizations time and money, but more importantly it saves your development team a lot of headaches.

    Stay tuned for the upcoming series of articles, which will cover how we, at Nexus, apply a software engineering process aimed at standardizing, automating, and accelerating Valo Modern intranet deployments by leveraging PowerShell and Yeoman generators.
