Power BI Deployment Options [Part 1]
Ever wondered how software engineers are able to work in massive teams on the same project/code and push changes to end users with frightening speed and consistency? Wonder no more! In this series, I will explain all there is to know about Application Lifecycle Management when using Power BI.
At the end of the series, you should have an idea of the different tools/techniques you can leverage to implement CI/CD in your own projects.
What is ALM?
Before going further, let's briefly explain what Application Lifecycle Management stands for.
The official definition is something along the lines of “a multi-tier environmental developing approach”.
To show what this looks like, I created the visual below. Here you can see the most common environments and their use with regard to Power BI. The setup of these environments is often referred to as a DTAP street in software engineering.

The above environments and descriptions are rather straightforward; only the development environment warrants further explanation. Here we have the following:
- Entry point for ALM
- This is because developers upload datasets and reports to this workspace. Once uploaded, these can be picked up and pushed to the other environments.
- Blueprint environment
- In order to propagate certain settings, like scheduled refreshes or the relations between datasets and reports, we will often look at how these are set up in the development environment.
What is CI/CD?
When working with deployment pipelines, CI and CD are crucial to understand. These stand for:
- Continuous Integration
- Allows you to continuously integrate code into a single shared repository; usually the code is tested and reviewed before it ends up in the master repository.
- Continuous Deployment
- Allows you to take the stored code and quickly distribute it to different environments.
These definitions are rather software-engineering centric, but they work perfectly in our Power BI context. You can still upload your files to a shared OneDrive or git repository and use tools like Azure DevOps to perform the continuous deployment.
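As a rough sketch of what that continuous deployment step could look like, the snippet below builds the Power BI REST API call a CI/CD job could use to publish a .pbix file into a workspace (the Imports endpoint). The workspace id, dataset name, and token handling are placeholder assumptions, not part of the original setup:

```python
# Hedged sketch, not an official script: build the Power BI REST API call
# that a CI/CD job could use to publish a .pbix into a workspace.
# Workspace id and dataset name are placeholder assumptions; acquiring an
# Azure AD bearer token is out of scope here.
from urllib.parse import quote

API = "https://api.powerbi.com/v1.0/myorg"

def import_url(workspace_id: str, dataset_name: str) -> str:
    # Imports endpoint: POST the .pbix file as multipart form data to this
    # URL, with an "Authorization: Bearer <token>" header.
    return (f"{API}/groups/{workspace_id}/imports"
            f"?datasetDisplayName={quote(dataset_name)}"
            f"&nameConflict=CreateOrOverwrite")

print(import_url("00000000-0000-0000-0000-000000000000", "1 Default Report"))
```

The `nameConflict=CreateOrOverwrite` option makes repeated deployments idempotent: the first run creates the dataset and report, later runs overwrite them in place.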
Deployment Option 1: Power BI Deployment Pipelines
The first deployment option that will be showcased is the Deployment Pipelines feature built into the Power BI service. In order to make use of these pipelines there is one important requirement: you need either a Premium Capacity or a Premium Per User license.
If you have either of these licenses, you can create your first pipeline by selecting the Deployment pipelines section in the left-hand pane on app.powerbi.com.

After creating a pipeline and giving it a name, you can assign a workspace to any stage in the pipeline. I would recommend only assigning a DEV workspace, as the TEST and PROD workspaces can be created upon deployment from stage to stage.

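The same setup can also be done programmatically. A hedged sketch using the REST API's Create Pipeline and Assign Workspace endpoints follows; the ids are placeholders, and the functions only build the requests rather than sending them, to keep token handling out of scope:

```python
# Hedged sketch: creating a deployment pipeline and assigning the DEV
# workspace via the Power BI REST API. Ids are placeholders; the functions
# return (method, url, json_body) tuples instead of performing HTTP calls.
import json

API = "https://api.powerbi.com/v1.0/myorg"

def create_pipeline(name: str):
    # POST /pipelines creates an empty deployment pipeline.
    return ("POST", f"{API}/pipelines", json.dumps({"displayName": name}))

def assign_workspace(pipeline_id: str, stage_order: int, workspace_id: str):
    # Stage order: 0 = Development, 1 = Test, 2 = Production.
    return ("POST",
            f"{API}/pipelines/{pipeline_id}/stages/{stage_order}/assignWorkspace",
            json.dumps({"workspaceId": workspace_id}))

# Only the DEV workspace (stage 0) is assigned, as recommended above.
print(create_pipeline("ALM Demo"))
print(assign_workspace("pipeline-id", 0, "dev-workspace-id"))
```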
As an example, I will demonstrate the two distinct rules that can be applied to datasets when deploying from one environment to another:
- Data Source Rules
- Parameter Rules
In order to adjust these rules, I already published my DEV resources to a TEST workspace. This is because these rules can only be applied upon deployment. My pipeline now looks like this:

The lineage of my workspace looks like this:

In order to make a clear distinction between the DEV and TEST environments, I made sure to have two distinct Excel files, both containing a single line of data.

When we open the reports in the DEV workspace we can see the following:

Data Source Rules
Now that we have our resources in the TEST environment we can apply rules to the deployment of our datasets. To do so press the deployment settings button in the TEST workspace.

I will adjust the data source rules for the 1 Default Report dataset. As mentioned, this dataset connects to an Excel file containing the DEV data. The only thing we have to do is change the URL of the data source to point towards the Excel file containing the TEST data.

After applying these data source rules and redeploying from DEV to TEST we can see that our 1 Default Report report in the TEST workspace now shows us the following:

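That redeploy does not have to happen through the UI either: a CI/CD job could trigger it with the REST API's Deploy All endpoint. A hedged sketch, with the pipeline id as a placeholder:

```python
# Hedged sketch: triggering a stage-to-stage redeploy via the Power BI
# REST API "Deploy All" endpoint. The pipeline id is a placeholder; the
# function builds the request instead of sending it.
import json

API = "https://api.powerbi.com/v1.0/myorg"

def deploy_all(pipeline_id: str, source_stage: int = 0):
    # sourceStageOrder 0 promotes Development to Test; 1 promotes Test
    # to Production.
    body = {
        "sourceStageOrder": source_stage,
        "options": {
            "allowCreateArtifact": True,     # create items missing in the target stage
            "allowOverwriteArtifact": True,  # overwrite items already deployed there
        },
    }
    return ("POST", f"{API}/pipelines/{pipeline_id}/deployAll", json.dumps(body))

print(deploy_all("pipeline-id"))
```

Any data source or parameter rules configured on the target stage are applied during this deployment, exactly as when pressing Deploy in the portal.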
So far so good, but what if you use parameters to create a distinction between data sources, or perhaps to reduce the amount of data inside your DEV dataset? I will show you how to handle those in the next section.
Parameter Rules
To showcase the parameter rules I will deploy the dataset and report named 2 Parameter Based Report from DEV to TEST. The only distinction between these and 1 Default Report is the following M code:

Here we can see that we use a parameter called environment and concatenate it inside our M code to connect to a certain Excel file. By default, this will point to the DEV Excel file.
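The idea boils down to string concatenation. As a rough Python analogue of the M pattern (the folder layout and file name are invented for illustration; only the idea of splicing an environment parameter into the source path comes from the report):

```python
# Rough analogue of the parameterized M code: the environment parameter is
# spliced into the path of the Excel source file. Folder layout and file
# name are assumptions for illustration.
def source_path(environment: str) -> str:
    return f"C:/Data/{environment}/SalesData.xlsx"

print(source_path("DEV"))   # the parameter's default value while developing
print(source_path("TEST"))  # the value a parameter rule substitutes on deploy
```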
To change this from DEV to TEST we once again press the deployment settings button in the TEST workspace. Then we select the 2 Parameter Based Report dataset and apply the rules as follows:

After applying these parameter rules and redeploying from DEV to TEST we can see that our 2 Parameter Based Report report in the TEST workspace now shows us the following:

Conclusion
In this blog I introduced ALM from a Power BI Service perspective, including the Deployment Pipelines option available to users with a Premium Capacity or Premium Per User license. For all those without such a license, be sure to keep an eye on the Lytix blogs, as I will continue this series with at least two other options that will empower you to transition from Self-Service to Full-ALM!