In today's article I would like to describe how you can apply a solution upgrade directly in your Azure DevOps pipeline.
This post can be considered the first in a series of blog posts about different approaches to solving challenges with Azure DevOps pipelines that cannot be solved with the Power Platform Build Tools. The second post in that series explains how one can use a PowerShell module to solve such a problem.
Update 2021-09-20: There is now a task in the Power Platform Build Tools to apply a solution upgrade. Unfortunately, the MS documentation lags behind and does not mention the task yet.
Let me give you some background information first.
While working with managed solutions you have two different options when importing a solution:

Update: It will add components that have been added to the solution and update existing components. Unmanaged solutions are always imported as updates.

Upgrade: An upgrade will also delete components that were removed from the solution. This means you have the option to clean up unused components, like fields or entities. Upgrades are only available for managed solutions, and only when the version you are importing is higher than the version already present.
In CDS you have the possibility to import a solution upgrade as a holding solution. The new version will be in place, but removed components will not be deleted yet.
The idea behind holding solutions is that you are able to migrate data before you delete unused components, especially data from fields and entities. Another possibility you gain with holding solutions is rolling back an upgrade. Since the new version is already in place, you can test it and simply delete the holding solution if anything is wrong. This would leave you with only the old version.
You have version 1.0 of your solution installed in production. It contains a field named “bebe_dmeo”. Since it is in production, your users are using the field and you have data in it.
You notice that the spelling is wrong. Of course you would like to fix that to make the system easier to maintain and understand for future developers. So you delete the field in development and add a new field named “bebe_demo”.
The next step would be to install a solution upgrade to production (obviously after you tested it in the test environment) as a holding solution. With that approach you could move the data from “bebe_dmeo” to “bebe_demo”, since both fields are currently present in the environment.
When applying the solution upgrade the removed field, “bebe_dmeo”, will be deleted from your production environment.
Important: When there is an unapplied solution upgrade in the system it is not possible to install another update or upgrade.
The “Import Solution” task of the Power Platform Build Tools has the option to install a solution as a “Holding Solution”. But they are missing the ability to apply solution upgrades.
UPDATE (2021-04-02): As of a few days ago, there is a task in the Power Platform Build Tools to apply solution upgrades.
As I mentioned, one is not able to install either a solution update or an upgrade while an unapplied upgrade is present in the environment.
This leads to a manual step in every ALM process: applying the solution upgrade after deploying to another environment.
I would say that in 90% of deployments you do not need to migrate data and could apply the upgrade right away, at least when you deploy to test or any other environment that is not production. Therefore, we would like to apply the upgrade within our pipeline.
There are several solutions to the problem.
You do the step manually
You use additional build tools that provide the functionality, for example the “Power DevOps Tools” from Wael Hamze
You use a small application to do the job
In this blog post I will describe the last option and show you how you can apply a solution upgrade with a small application.
The application has to execute a “DeleteAndPromoteRequest” (link to MS docs).
Here is the code of the application.

using System;
using System.Configuration;
using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Tooling.Connector;

static void Main(string[] args)
{
    // First argument: the unique name of the solution
    var solutionName = args[0];

    // With 3 arguments the connection string is built from the last two
    // (connection string without password + password); otherwise the
    // fallback from App.config is used.
    string connectionString = args.Length == 3
        ? args[1] + args[2] + ";"
        : ConfigurationManager.ConnectionStrings["CRM"]?.ConnectionString;

    if (string.IsNullOrWhiteSpace(connectionString))
    {
        throw new Exception("No ConnectionString found.");
    }

    CrmServiceClient client = new CrmServiceClient(connectionString);

    // The DeleteAndPromoteRequest applies the solution upgrade
    var request = new DeleteAndPromoteRequest
    {
        UniqueName = solutionName
    };

    client.Execute(request);
}
It takes 3 arguments: the unique name of the solution, the connection string without the password, and the password.
It generates the connection string out of the last two arguments. If those aren't present, it will take the fallback connection string configured in the App.config.
The next steps are to connect to the CDS environment, create the request and execute it.
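As an illustration, a call with all three arguments could look like this (the executable name “ApplyUpgrade.exe” is my assumption for this sketch, not from the original; note that the password is passed separately and appended to the connection string by the application):

```
ApplyUpgrade.exe MySolution "AuthType=Office365;Url=https://yourorg.crm.dynamics.com;Username=user@yourorg.onmicrosoft.com;Password=" "MyPassword"
```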
The application from the last paragraph should now be included in our pipeline.
For ease of explanation, we use a very simple release pipeline. In a real-world scenario, one would export the solution in a build pipeline and deploy it with a release pipeline (see my article about a basic ALM process).
Since we are using a release pipeline we have to add our repository as an artifact to be able to build the application.
To do so we change to the pipeline tab and click on “Add an artifact”.
In the upcoming side panel we choose “Azure Repository Git” (since my repo is a Git repo in the same project). We also configure the project, source, and default branch.
We will need the “Source alias” in one of the pipeline steps, so make sure you copy it.
Our application needs 3 parameters. Those should be configured as variables of the pipeline.
We need the following variables:
This variable contains the name of the Solution we would like to install and apply. By checking the checkbox “Settable at release time” we are able to choose the solution for every release.
This variable should contain the connection string to your org, without the password. It should look something like the following.
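A sketch of such a connection string, assuming placeholder URL and username (it ends with “Password=” because the application appends the password argument to it):

```
AuthType=Office365;Url=https://yourorg.crm.dynamics.com;Username=user@yourorg.onmicrosoft.com;Password=
```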
This variable will contain only the password of the user you are using. It is important to check the little lock icon so that DevOps knows that it contains a password. If this is checked, DevOps will not show the content or write it to logs.
To run the application we will add a few steps to the mentioned pipeline.
This step will restore all the NuGet packages. It can be used with the standard configuration.
This step will build all the solutions in the artifact. It can be used with the standard configuration, but in your scenario you might need to change the config depending on your repository.
The last step will execute our application. As the path, we choose the folder of the artifact. In there we have a folder named after the artifact (we configured this while adding the artifact; for this demo it is “_ApplyUpgradeDemo”), and inside you can find the application with its usual folder structure. For our demo the path looks like this:
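As a sketch, with the artifact alias “_ApplyUpgradeDemo”, the path and the arguments field could look as follows (the project folder, build configuration, executable name, and variable names are assumptions for this illustration):

```
Path:      $(System.DefaultWorkingDirectory)/_ApplyUpgradeDemo/ApplyUpgrade/bin/Release/ApplyUpgrade.exe
Arguments: $(SolutionName) "$(ConnectionString)" "$(Password)"
```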
If you have worked with pipelines before, you will notice from this syntax that we use the variables we configured earlier.
When running this pipeline it will export the solution, import it to the destination environment and apply the upgrade right away.
Adding the possibility to choose
As mentioned earlier, there might be situations where you would like to perform some steps between importing an upgrade and applying it.
In this paragraph we will learn how to add the possibility to choose whether to apply the upgrade directly.
We will introduce an additional variable to the pipeline.
This variable defines whether we would like to apply the upgrade right away. The default value will be “Yes”. By checking the checkbox “Settable at release time” we are able to choose it for every release.
Now we have to change the configuration of the last 3 steps we added.
To do so, we open the area “Control Options”. There we choose “Custom Condition” under “Run this task” and fill in our custom condition.
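Assuming the new variable is named “ApplyUpgrade” (the variable name is an assumption for this sketch), the custom condition could look like this:

```
and(succeeded(), eq(variables['ApplyUpgrade'], 'Yes'))
```

This runs the step only when the previous steps succeeded and the variable was set to “Yes” at release time.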