Power Platform & Tech Thoughts


Azure DevOps ALM – Pipelines/Builds

Nov 17, 2023 | ALM

This post will discuss the first part of building your automation process, which includes exporting from your source environment and preparing the files for release.

Ensure you have read, understood and carried out the setup in the previous post; without it you will not be able to connect to your source environment in a reliable way. Also ensure your environment, security and licensing strategy is defined and fixed before continuing, otherwise time-consuming re-creation of your ALM processes will be required.

Azure DevOps will be referred to as “ADO”.

The whole group containing releases, task groups, etc. will be referred to as “pipelines”.

“Builds” is the previous term for what are now “Pipelines”. There is also a service named Pipelines (one of the groups on the left menu alongside Boards, Repos, etc.), which can be very confusing. In this guide, “pipelines” will refer to what you are building and “pipelines service” will refer to the menu item/group.

 

Preparation

Firstly, you will need to ensure you have a code repository (Git is recommended). If you have an existing repo you wish to use, I would recommend adding an identifiable folder such as “dataverse”, or create a new repo specifically for dataverse/PP/D365 work.

Next, create an initial set of folders as below. You could consider adding more in the future for additional usages such as backups of other components, plugin code, etc. For the location, “root” refers to within your “dataverse” directory or root of repo, depending what you have chosen above.

  • pipelines (root) – This will contain folders and files required for solution exports and imports.
  • Solution folders (within pipelines folder) – Create a folder for every solution you will use, named the same as the internal name of the solution (no spaces), not the display name.
  • solutions (root) – This will contain backups of your solutions after export.
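Using MySol1 (the example internal solution name used later in this guide), the resulting layout looks like this:

```
dataverse/             (or the repo root, if using a dedicated repo)
├── pipelines/
│   └── MySol1/        (one folder per solution, internal name, no spaces)
└── solutions/         (backups of exported solutions)
```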

Finally, visit Pipelines > Pipelines > All > New folder. Enter a name of your choosing to contain your pipelines, as there will be many (something like “Dynamics Power Platform”).

Create YAML File

You will need to set up the initial file with the following steps. There is an option for ADO to create the file for you, but it will be placed in the root of the repository, which you may not want.

  • Check out your repo to your local machine and create a branch.
  • Within your pipelines folder, create a file with a meaningful description and .yml extension. An example is dataverse-export-solution.yml
  • Commit and push your branch to the remote repo.
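The local steps above can be sketched as below. The branch name is illustrative, and the scratch repo (`git init` in a temp directory) is only for demonstration; in practice, run the branch and file creation inside your existing clone.

```shell
set -e
# Demo in a throwaway repo; in your real checkout, skip the init and
# just create the branch and file. Branch/file names are examples.
workdir=$(mktemp -d)
cd "$workdir"
git init -q

# Create a working branch, the pipelines folder and the YAML file.
git checkout -q -b pipeline/solution-export
mkdir -p dataverse/pipelines
touch dataverse/pipelines/dataverse-export-solution.yml

# Commit the new file (identity flags only needed in this scratch demo).
git add dataverse/pipelines/dataverse-export-solution.yml
git -c user.name=demo -c user.email=demo@example.com \
    commit -q -m "Add solution export pipeline definition"

# Then push the branch to the remote repo:
# git push -u origin pipeline/solution-export
```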

Next, you will start creating the pipeline with this file.

  • Click 3 dots on the folder you created, then new pipeline.
  • Under where is your code, select Azure Repos Git.
  • Select the repository you created or are using in preparation.
  • Select Existing Azure Pipelines YAML file
  • Select the branch you created above.
  • Select the path to the file you created.
  • Select continue.

You will now have your empty file within the editor, allowing you to select options from the right, add variables, etc.

Finally, click the arrow next to Run, then click Save. Then click the 3 dots, select Rename and enter a new name of your choice; I would recommend a description followed by the internal solution name, such as Export Solution – MySol1.

Pipeline Core Settings

In this section, we will add the initial variables and settings to the pipeline prior to the main steps. As further steps and the subsequent release depend on this information, it is key to understand why the setup exists.

First, you will need the basic pipeline starter information. Pool defines what virtual machine to run it on (linux, windows, etc). You also need a trigger value, in this case there is none as we are not running the pipeline based off a code check in, pull request, etc. Finally, a name is required which is used to name each instance/run of the pipeline. As we will be creating a pipeline for every solution with a unique name, this should be the version number as it will be easily identifiable and be useful later.

pool:
  vmImage: windows-latest
trigger: none
name: $(Date:yyyy.MM.dd)$(Rev:.r)

Next, we will set up a global variable, available to the whole pipeline but defined outside the YAML file.

  • Click the variables button in the top right.
  • Click the + button.
  • Enter your chosen variable name. To align with examples in this guide, I will use SolutionNameGlobal.
  • Value should reflect the internal name of your solution. In line with the prep section, I will use MySol1.
  • Ensure keep this value secret and let users override options are unticked.

Next, we will need to setup local variables (ones used within the pipeline but cannot be used or seen outside). Each of these are described below, with a code sample.

  • deploySettingsFullPath – The local path (within the ADO build agent) where your Git repo is checked out, including the JSON file within it. Replace the specific folders with the ones created in the preparation section.
  • solutionFilesDirectory – Although not strictly required, it will be very useful to have separation in folders for your solutions.
  • solutionManagedFileName – A fixed name for your exported managed file, this is very important for releases later.
  • solutionUnmanagedFileName – As above, but for unmanaged.
variables:
  deploySettingsFullPath: $(Agent.BuildDirectory)\s\dataverse\pipelines-releases\$(SolutionNameGlobal)\deployment_settings.json
  solutionFilesDirectory: $(Build.ArtifactStagingDirectory)\$(SolutionNameGlobal)
  solutionManagedFileName: solution_managed.zip
  solutionUnmanagedFileName: solution_unmanaged.zip

This concludes the initial core of your pipeline. The next section will cover the remaining steps, which are the actual exporting of the files.

Pipeline Steps

This is where you will add steps to your pipeline to perform actions, such as exporting, publishing artifacts, etc. The “tasks” panel on the right allows you to add items to the file and edit what is already there.

This guide will cover the following steps, to allow you to perform cross-environment releases:

  • Version chosen solution.
  • Export chosen solution.
  • Publish artifacts for release to use.

First, in your .yml file, underneath the variables, add the following line:

steps:

When you perform the next steps, ensure your cursor is underneath the code above. Open the tasks panel by clicking show assistant on the right and type “power platform” into search tasks. This will show you all tasks available to use in your pipeline. As per the previous guide, you will notice these are very similar to the CLI, as they are ultimately wrappers for that. When you add each item, you will be presented with fields to fill in before adding (apart from tool installer). We will now go through each task to add, with the fields to complete and an explanation of what it does.

Authentication type – where available, this should always be Service Principal regardless of the task, as it allows us to use the Azure App Registration.

Service connection – this will always be the name of the connection you created at the end of the last post for your dev environment.

  • Power Platform Tool Installer [required once per pipeline, before any other PP tasks]
    • No values required
  • Power Platform Set Solution Version [as with manual releases, this is done prior to export]
    • Solution Name: $(SolutionNameGlobal) [the name of the global variable in the previous section]
    • Solution Version Number: $(Build.BuildNumber) [obtained from the name field populated earlier]
  • Power Platform Export Solution
    • Environment Url: $(BuildTools.EnvironmentUrl)
    • Solution Name: $(SolutionNameGlobal)
    • Solution Output File: $(solutionFilesDirectory)/$(solutionManagedFileName) [puts the managed zip file in the specific solution build directory]
    • Export as Managed Solution: ticked
    • Target Version: $(Build.BuildNumber) [to use the version set above]
    • Export async: ticked
    • Advanced > Overwrite local solution.zip: ticked
  • Power Platform Export Solution
    • Environment Url: $(BuildTools.EnvironmentUrl)
    • Solution Name: $(SolutionNameGlobal)
    • Solution Output File: $(solutionFilesDirectory)/$(solutionUnmanagedFileName) [puts the unmanaged zip file in the specific solution build directory]
    • Export as Managed Solution: unticked
    • Target Version: $(Build.BuildNumber)
    • Export async: ticked
    • Advanced > Overwrite local solution.zip: ticked
  • PowerShell
    • Type: inline [this means you don’t have to store a .ps1 file anywhere]
    • Script: cp “$(deploySettingsFullPath)” “$(solutionFilesDirectory)” [this copies your deployment settings JSON file to the directory that will later be published as an artifact]
  • Publish build artifacts
    • Path to publish: $(solutionFilesDirectory) [2 solution zips and settings json are all here, so all 3 will be published for the release to use]
    • Artifact name: SolutionArtifacts [set this to whatever you like, just ensure you remember it for the release]
    • Artifact publish location: Azure Pipelines
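Adding the tasks above produces a steps section roughly like the sketch below. This assumes the task names and versions from the Power Platform Build Tools extension; input key names can vary between extension versions, and “Dev Service Connection” is a placeholder for the service connection created at the end of the previous post, so treat this as a reference for checking your generated file rather than something to paste verbatim.

```yaml
steps:
- task: PowerPlatformToolInstaller@2

- task: PowerPlatformSetSolutionVersion@2
  inputs:
    authenticationType: PowerPlatformSPN
    PowerPlatformSPN: 'Dev Service Connection'
    SolutionName: $(SolutionNameGlobal)
    SolutionVersionNumber: $(Build.BuildNumber)

# Managed export
- task: PowerPlatformExportSolution@2
  inputs:
    authenticationType: PowerPlatformSPN
    PowerPlatformSPN: 'Dev Service Connection'
    Environment: $(BuildTools.EnvironmentUrl)
    SolutionName: $(SolutionNameGlobal)
    SolutionOutputFile: $(solutionFilesDirectory)/$(solutionManagedFileName)
    Managed: true
    AsyncOperation: true

# Unmanaged export
- task: PowerPlatformExportSolution@2
  inputs:
    authenticationType: PowerPlatformSPN
    PowerPlatformSPN: 'Dev Service Connection'
    Environment: $(BuildTools.EnvironmentUrl)
    SolutionName: $(SolutionNameGlobal)
    SolutionOutputFile: $(solutionFilesDirectory)/$(solutionUnmanagedFileName)
    Managed: false
    AsyncOperation: true

# Copy the deployment settings JSON alongside the solution zips
- task: PowerShell@2
  inputs:
    targetType: inline
    script: cp "$(deploySettingsFullPath)" "$(solutionFilesDirectory)"

- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: $(solutionFilesDirectory)
    ArtifactName: SolutionArtifacts
    publishLocation: Container
```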

Summary/Next…

This concludes your pipeline. From above, you should now see that both a managed and unmanaged solution have been exported from your dev environment, as well as the deployment settings, all ready for the next stage which is importing to your target environments.

Before moving on to the next post to start your releases, ensure you do the following: repeat the Pipeline Core Settings and Pipeline Steps sections for every solution in your source environment. You will use the exact same YAML file, so the steps should already be there, but you will need to create a new global variable for each pipeline. The end result is that if you have 5 solutions, you have 5 pipelines, as you will be triggering them individually. There will be a future blog post on how this all runs and ties together.
