CDS – Basic ALM process

In today’s article, I will explain a basic ALM process in CDS. This article is split into 3 areas:

  • Explaining ALM
  • Goal process
  • Implement the process

For the implementation, I will use Azure DevOps and the PowerApps BuildTools.

Explaining ALM

The application lifecycle describes the complete cycle an application normally goes through, over and over again. The following image shows which elements are included in the application lifecycle.

Application Lifecycle

The management of this lifecycle is called ALM, Application Lifecycle Management. In the ALM process, each of those 5 aspects should be included. However, we normally focus on only 3 of those steps when we talk about ALM: Test, Deploy and Maintain.

In the world of Dynamics 365 MDA or the Power Platform/CDS, those steps could be automated.

Goal process

The goal of this article is to describe a basic ALM process in CDS.

What should this Process look like?

The basic process could look like this:

Complete basic ALM process

The QG’s, Quality Gates, are in a gray color because we will not talk about those in this article. I will go into details regarding different QG’s (like Solution Checker and automated Tests) in separate articles.

I split this rather huge process into 3 sub-processes. Those are:

  • Create Export from Dev
  • Build Managed Solution
  • Release to Test

In the following sections, we will talk about each of them and what they will do. In the next chapter, we will see how to implement those processes in Azure DevOps with pipelines.

Create Export from Dev

Sub-Process – Create Export from Dev

In this process, we would like to export an unmanaged solution from our development environment, extract the Zip-file and store the sources in our repository.

This process follows the recommendation and best practice from Microsoft. Microsoft’s approach, and the one they recommend to everyone, is the “source-control centric” approach. This means that you should always store a functional version of your solution in your source control. I see the following main reasons for this approach:

  • Recover your development environment
  • Merge several development environments
  • See historic changes in a solution

Build Managed Solution

Sub-Process – Build Managed Solution

The solution we stored with the first sub-process will be converted into a managed solution in this sub-process. To do so, we have to import the solution into a JIT Build (Just-In-Time Build) environment, export it as managed and store the Zip-file as an artifact inside of DevOps.

We need the extra JIT Build environment because this process will run independently of the first one. This could result in having a different version in our development environment than the version we would like to package. Ideally, we would create a brand-new environment when starting this process. I will not go into detail in this article but will publish another one specifically on this topic.

Release to Test

As you can see, this process is a rather simple one. It just takes the latest artifact there is in DevOps and deploys it to Test. Obviously, there has to be at least one more pipeline for deploying to Prod, but there might be more QG’s or other environments in between.

Implement the process

Finally, we are coming to the really interesting part of this article. We will answer the question:

How do we do this in Azure DevOps using Pipelines?

This chapter will also be split into 3 sections, each of those will cover the implementation of our 3 sub-processes.

As mentioned earlier, we will use the PowerApps BuildTools for Azure DevOps from Microsoft. There are several other tools out there (for example the Dynamics 365 Build Tools from Wael Hamze), but I will focus on the ones from Microsoft directly.

Note: The Microsoft first-party tools are still in preview.

Create export from Dev

This pipeline is a build pipeline and will include the following steps:

Complete Sub-process – Create export from Dev

First of all, we have to create the pipeline. To do this, open “Pipelines” in the menu on the left and then press the “New pipeline” button. If you already have another pipeline, it will be in the upper right corner; otherwise it will be in the middle of the screen. On the next screen, you have to choose “Use the classic editor”.

Creating a pipeline

On the second screen, you can leave the defaults as they are. On the third and last screen, you choose “Empty job” at the top of the page.

1 – PowerApps Tool Installer

Every pipeline that uses the PowerApps BuildTools has to install them as a first step. This ensures that they are really installed on the Agent.

2 – PowerApps Publish Customizations

As a second step, we publish all customizations. In this step, you only have to choose your connection.

Sidenote: If you do not already have a connection to your dev environment, you should create one. Choose a connection of the type "Generic" and fill it out as in the following picture.
Creating a new Connection of type “Generic”

Update 2020-05-28: Yesterday, Colin Vermander released a blog post about changes in the newest update of the PowerApps BuildTools (version 0.3.3). From now on, you are not limited to username/password connections; you can also configure non-interactive user/service principal connections.

3 – PowerApps Export Solution

Now we have to export the solution as unmanaged so that we can later store it in our repo. To achieve that, we add a step called “Export Solution” with the following configuration.

Export Solution unmanaged

Solution Name should be:

$(SolutionName)

Solution Output File should be:

$(Build.ArtifactStagingDirectory)\$(SolutionName).zip
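For readers who prefer YAML pipelines over the classic editor, the three steps so far would look roughly like this. The task IDs (PowerAppsToolInstaller@0 and so on) and the input names reflect my reading of the preview BuildTools and may change between versions; “Dev connection” is a placeholder for your own service connection name:

```yaml
steps:
  # 1 - make sure the BuildTools are installed on the agent
  - task: PowerAppsToolInstaller@0

  # 2 - publish all customizations in the dev environment
  - task: PowerAppsPublishCustomizations@0
    inputs:
      PowerAppsEnvironment: 'Dev connection'

  # 3 - export the solution as unmanaged
  - task: PowerAppsExportSolution@0
    inputs:
      PowerAppsEnvironment: 'Dev connection'
      SolutionName: '$(SolutionName)'
      SolutionOutputFile: '$(Build.ArtifactStagingDirectory)\$(SolutionName).zip'
```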

4 – PowerApps Unpack Solution

We skip the Solution Checker step you can see in the screenshot above, since this is one of the QG’s we will talk about later.

The next step is to add an “Unpack solution” step to our pipeline. It should be configured as shown in the next screenshot.

Unpack Solution

Solution Input File should be the same as the output in the last step. In our case:

$(Build.ArtifactStagingDirectory)\$(SolutionName).zip

Target Folder to Unpack Solution should be the folder where you would like to store your unpacked solution in the repo. In our case, we will have a folder in the root which has the name of the solution.

$(Build.SourcesDirectory)\$(SolutionName)
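In YAML, the unpack step would be along these lines (again, the exact input names are an assumption based on the preview BuildTools):

```yaml
  # 4 - unpack the exported zip into the source folder
  - task: PowerAppsUnpackSolution@0
    inputs:
      SolutionInputFile: '$(Build.ArtifactStagingDirectory)\$(SolutionName).zip'
      SolutionTargetFolder: '$(Build.SourcesDirectory)\$(SolutionName)'
```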

5 – Commit solution to repo

The last step in the pipeline is to add the extracted solution to our repo. To do this we add a standard “Command Line” step, with the following code in the “Script” field:

echo commit all changes
git config user.email "<email>"
git config user.name "Automatic Build"
git checkout master
git add --all
git diff-index --quiet HEAD || git commit -m "solution init"

echo push code to new repo
git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" push origin master

The “git diff-index” guard makes sure a commit is only created when the export actually changed something; without it, “git commit” would fail on an unchanged solution and break the pipeline.

You have to replace “<email>” with the email of the user you would like to use to push your changes.

6 – General Pipeline configuration

There are some general configurations you have to do to this pipeline.

For the agent, you have to allow scripts to use the OAuth token. If this is not configured, our command-line script will not be able to connect to the repo and push our solution. The configuration should look like this:

Agent configuration
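As a sidenote: if you ever rebuild this pipeline in YAML, the equivalent of allowing scripts to use the OAuth token is to keep the git credentials around after checkout, which is done with the persistCredentials setting:

```yaml
steps:
  # keep the credentials available so later scripts can "git push"
  - checkout: self
    persistCredentials: true
```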

In our steps, we always used a variable called “SolutionName”. Change to the second tab of your pipeline, called “Variables”. By clicking on the “+ Add” text at the bottom, you can add a new variable. The name should be “SolutionName” and the value should be the name of your solution, in our case “Demo”. The checkbox at the end of the line allows you to change the solution name at queue time.

Variable configuration

Now you can test the pipeline by running it. This can either be done via the “Queue” text if you are still in edit mode

Queue the pipeline

or by using the “Run pipeline” button

Run a pipeline

Build Managed Solution

The second pipeline we discussed is the one that will build a managed solution out of the unpacked solution we stored in our repo. It should have the following steps.

Complete Sub-process – Build Managed Solution

Here as well, we will use the pipeline variable “SolutionName”.

1 – PowerApps Tool Installer

As mentioned, every pipeline that uses the BuildTools needs this step.

2 – PowerApps Pack Solution

The second step is to create an unmanaged solution out of the unpacked solution we have stored in our repo. This will be done via a “Pack Solution” step which needs the following configuration.

Pack Solution

The Source Folder should be the same as the output folder you configured in the first pipeline. In our case:

$(Build.SourcesDirectory)\$(SolutionName)

As the output file, we configure:

$(Build.ArtifactStagingDirectory)\$(SolutionName).zip

It is important that you choose “Unmanaged” as the solution type, since we stored the unpacked unmanaged solution in the repo.
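The YAML counterpart of this step could look like this (input names hedged as before):

```yaml
  # pack the unpacked sources from the repo back into an unmanaged zip
  - task: PowerAppsPackSolution@0
    inputs:
      SolutionSourceFolder: '$(Build.SourcesDirectory)\$(SolutionName)'
      SolutionOutputFile: '$(Build.ArtifactStagingDirectory)\$(SolutionName).zip'
      SolutionType: 'Unmanaged'
```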

3 – PowerApps Import Solution

Now the just-created unmanaged solution needs to be imported into your build environment. To do this, we will add an “Import Solution” step.

Import solution to build environment

As an environment, you choose your build environment, which should not be your test or dev environment.

As the Solution Input, we choose the output of the previous step. For the demo, it should be

$(Build.ArtifactStagingDirectory)\$(SolutionName).zip
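As a YAML sketch, with “Build connection” as a placeholder for the service connection pointing at the JIT Build environment:

```yaml
  # import the unmanaged solution into the JIT Build environment
  - task: PowerAppsImportSolution@0
    inputs:
      PowerAppsEnvironment: 'Build connection'
      SolutionInputFile: '$(Build.ArtifactStagingDirectory)\$(SolutionName).zip'
```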

In one of the following posts, I will explain how to create an environment on the fly for this step.

4 – PowerApps Export Solution

The next-to-last step is to export the solution as managed from the build environment we just imported it into. To achieve this, we will use an “Export Solution” step with the configuration you can see in the screenshot.

Export managed solution

As the environment, you choose the same one you configured for the import step. Solution Name should be our variable:

$(SolutionName)

The Solution Output File should be in our artifact directory and contain “_managed”.

$(Build.ArtifactStagingDirectory)\$(SolutionName)_managed.zip

It is important to check the checkbox beside “Export as Managed solution”.
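In YAML, the managed export could be sketched like this (the Managed input name is my assumption for the checkbox):

```yaml
  # export the solution again, this time as managed
  - task: PowerAppsExportSolution@0
    inputs:
      PowerAppsEnvironment: 'Build connection'
      SolutionName: '$(SolutionName)'
      SolutionOutputFile: '$(Build.ArtifactStagingDirectory)\$(SolutionName)_managed.zip'
      Managed: true
```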

5 – Publish Pipeline Artifacts

The last step is to publish the artifact we just created. For this, we use the step called “Publish Pipeline Artifacts”.

Publish Pipeline Artifacts

The name of the artifact is up to you. I have chosen “drop”.

The “Path to publish” should be the same as the output path in the previous step. In our case:

$(Build.ArtifactStagingDirectory)\$(SolutionName)_managed.zip
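The publish step itself is a standard Azure DevOps task, so its YAML form is straightforward:

```yaml
  # publish the managed zip as a pipeline artifact named "drop"
  - task: PublishPipelineArtifact@1
    inputs:
      targetPath: '$(Build.ArtifactStagingDirectory)\$(SolutionName)_managed.zip'
      artifact: 'drop'
```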

Release to Test

As mentioned, the last pipeline is a rather short one. It should be a release pipeline that, in the end, looks like this:

Release Pipeline

1 – Artifacts

First of all, we have to choose the artifact of the build pipeline.

Artifact configuration

We choose our “Build Managed Solution” as the source.

2 – Tasks

The tasks are configured like build pipeline steps. That means we have to install our PowerApps tools first. In addition to that, we only have one more step: an “Import Solution” step with the following configuration.

Import Managed solution

As the environment, you choose your destination environment, in our case Test.

As the Solution input, you choose the artifact you configured above.

$(System.DefaultWorkingDirectory)/_Build Managed Solution/drop/$(SolutionName)_managed.zip
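Classic release pipelines are configured in the UI rather than in YAML, but if you model the release as a YAML stage instead, the two tasks would correspond roughly to this (connection name and task inputs hedged as before):

```yaml
steps:
  - task: PowerAppsToolInstaller@0

  # import the managed artifact into the Test environment
  - task: PowerAppsImportSolution@0
    inputs:
      PowerAppsEnvironment: 'Test connection'
      SolutionInputFile: '$(System.DefaultWorkingDirectory)/_Build Managed Solution/drop/$(SolutionName)_managed.zip'
```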

Summary

It does not take too long to set up a basic ALM process. When you are doing it for the first time it will take a bit longer, but you will have a fast learning curve.

Until now, none of the pipelines runs automatically; all of them need manual queueing. I might create a post about how to automate them.

I would like to close today’s post with a quote.

No project is too small for ALM.

Nick Doelman

This is a very good quote Nick Doelman shared during one of his recent community sessions. I could not have expressed it better! With that said, I can only encourage you to start with ALM. It will make your life easier.

In the upcoming posts, we will learn more about QG’s.

If you do have any questions do not hesitate to contact me.
