How ALM streamlines BI projects: Azure DevOps
Application Lifecycle Management (ALM) refers to a (software) development process that is set up in a governed, easy-to-manage way. ALM adds value for the development team, project managers and business users. While the term ‘ALM’ is mostly associated with pure software development projects (…written in 100% programming languages), BI projects (which are by nature less programmatic) can adhere to the same high standards.
In this blog you will learn how we at Lytix adhere to ALM standards by using Azure DevOps, what the benefits are, and why it is a ‘must-have’ component in BI implementations.
What problems does ALM solve?
When not working in an ALM-compliant manner, you often hear the following complaints from business and development:
- I’ve accidentally made a change to an object (report, ETL task, …) and my last copy is three weeks old… All my intermediate changes have been overwritten!
- There are different versions circulating across environments; there are structural differences between the reports in the ‘production’ environment and those in the ‘business sandbox’ environment.
- I have no idea what the development team is currently doing; while they should be prioritising bug fixes, they are probably building unwanted features.
- Why is it so hard to provide me with a list of all changes made to resolve this problem?!
- Even the simplest tests have not been carried out by the development team; if the developer had just had a quick look at the data, they would have known something was off.
- Can you make sure that a key user has tested and validated the changes before you deploy to production?
- Manual scripts need to be run to orchestrate the deployment to production. A developer always needs to set aside spare time to get the deployment to succeed, which makes deploying stressful.
Working in an ALM-compliant way resolves these complaints. Azure DevOps is one of the tools that helps you work in a more process-driven way throughout your development lifecycle. Its modules (Boards, Repos, Pipelines and Test Plans) make sure that all ALM aspects are covered.
Azure Boards provides Kanban boards and overviews of all kinds of work items. Out of the box, Azure Boards supports the project methodologies ‘Agile’, ‘Scrum’ and ‘CMMI’ (Capability Maturity Model Integration). Depending on the nature of the project (large, medium, small) or the business requirements (e.g. the need to follow a certain standard), you can customise these processes until they fit your needs. Manage and plan the availability of your team, plan ahead, budget sprints and prioritise tasks: all of this is possible in the Boards module.
E.g. for the development of our XTL Framework, we follow the Agile methodology: epics, features, user stories, issues and bugs are defined. Tasks are resolved based on the sprint planning and the available days per consultant (all of which can be managed within Azure Boards). As such, a transparent view of our framework’s roadmap is always available.
Repositories, or Repos, store the different versions of the code you are working on (known as version control). All changes made to your code are saved historically using a combination of branches, merges, commits, pushes and pull requests. A private Git repository tracks when each change was made and by whom. As such, for each tiny piece of code, you can retrace which changes happened in which sprint, or revert your code to a specific point in time. Linking changes in your code/reports to work items creates a tight integration between planning and development: retrace a code change to a specific task and learn the reason for the alteration.
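As a minimal sketch of this version-control flow, consider the commands below, run against a local Git repository for illustration (the branch name, file name and work-item number are hypothetical; in Azure Repos, mentioning a work-item ID such as #1234 in the commit message links the commit to that Azure Boards work item):

```shell
# Set up a local demo repository (skip config if already set globally)
mkdir xtl-demo && cd xtl-demo
git init -q
git config user.name "Demo User"
git config user.email "demo@example.com"

# Create a feature branch for a new ETL task
git checkout -q -b feature/1234-customer-dimension

# Commit a change; referencing #1234 links it to the Azure Boards work item
echo "SELECT * FROM stg.Customer;" > load_customer_dim.sql
git add load_customer_dim.sql
git commit -q -m "Add customer dimension load (#1234)"

# Every change is now traceable: who, when, and on which branch
git log --oneline
```

Pushing this branch to Azure Repos and opening a pull request then triggers the review and merge process described above.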
A common misconception is that ‘all Git repositories are open source and thus globally accessible’. While Azure Repos provides a Git repository, this repository is only accessible to users of the project. All users who want to view your code need to authenticate with their Microsoft account, which often also requires two-factor authentication. Hence, you can be sure that the code you commit to your Azure Repository is only accessible to those allowed to see it.
Azure Pipelines makes sure the written code passes sanity tests (‘builds’, which verify that your code is consistent) and is then deployed to new environments in a structured way. Pipelines can be configured so that no intervention from the BI team is required when deploying to a ‘higher’ environment (e.g. from QA to Production). It also keeps multiple environments in sync with each other.
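As an illustration, such a build stage could be defined in a YAML pipeline like the sketch below (the branch, agent image and script name are hypothetical; Azure Pipelines also supports ‘classic’ pipelines configured through the portal):

```yaml
# azure-pipelines.yml - illustrative build ("sanity") stage
trigger:
  branches:
    include:
      - main                 # every commit to main triggers a new build

pool:
  vmImage: 'ubuntu-latest'   # Microsoft-hosted build agent

steps:
  # Compile/validate the BI artefacts; the script name is hypothetical
  - script: ./build_and_validate.sh
    displayName: 'Build and validate code consistency'
```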
If one of the stakeholders requires a new environment (e.g. a production-like sandbox for business), it is only a few clicks away.
For our XTL Framework, we use the following stages:
- The ‘_XTLFramework’ stage contains a consistent version of the code. A trigger is in place that releases to the next phase after every code commit.
- The ‘Build’ phase is used as a no-data sanity environment: a first filter to reach a higher quality of code and logic.
- At the end of each sprint, a business user accepts a new version of the code (= pre-deployment approval) which triggers an automatic deploy to the Test environment. Technical testers will try-out the functionalities and check if the requirements have been met (= post-deployment approval). Upon approval by the technical tester, the code is automatically deployed to the Quality Acceptance environment.
- Either the requester of the change or a key-business user (= functional tester) then evaluates the new functionality on the Quality Acceptance environment. Again, a formal approval of a key-user is required to continue to the next stage. Upon acceptance, code is automatically deployed to the ‘Business Evaluation’ environment and ‘Business Sandbox’ environment. The code is not automatically deployed to production.
- A ‘pre-deployment’ approval of the BI product owner is required before this version of the code is deployed to production. After acceptance, the release has fully reached all environments and is in sync.
In the setup above, no developer or other technical intervention is needed once the code has been committed. Every phase is advanced by the owner of that specific stage.
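The staged flow above can be sketched as a multi-stage YAML pipeline. Note that the approvals themselves (technical tester, key user, BI product owner) are not written in the YAML: they are configured as checks on the corresponding Azure DevOps ‘environments’ in the portal. Stage names and the deploy script below are hypothetical:

```yaml
# Illustrative multi-stage release; approvals are configured on the
# named environments in the Azure DevOps portal, not in this file.
stages:
  - stage: Test
    jobs:
      - deployment: DeployTest
        environment: 'Test'                # technical testers approve here
        strategy:
          runOnce:
            deploy:
              steps:
                - script: ./deploy.sh test # hypothetical deploy script

  - stage: QualityAcceptance
    dependsOn: Test
    jobs:
      - deployment: DeployQA
        environment: 'QualityAcceptance'   # key user approves here
        strategy:
          runOnce:
            deploy:
              steps:
                - script: ./deploy.sh qa

  - stage: Production
    dependsOn: QualityAcceptance
    jobs:
      - deployment: DeployProd
        environment: 'Production'          # BI product owner approves here
        strategy:
          runOnce:
            deploy:
              steps:
                - script: ./deploy.sh prod
```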
While throughout this article we’ve mainly provided examples from our XTL Framework, we configure CI/CD pipelines for all kinds of software and tools:
- Cloud components: Azure Data Factory, Azure SQL Database, Azure Synapse, Snowflake, Databricks, Azure Functions, …
- Business applications: PowerApps, Power BI, Power Automate (formerly Microsoft Flow), …
- On-premises applications: SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS), on-premises SQL Server databases, …
To demonstrate that newly developed code has been tested and proven to be of high quality, the ‘Test Plans’ module is used. Tests can be executed either manually or automatically. A pop-up shows testers the tests they should perform and lets them easily log deviations from the expected process. All test results can be accessed and are linked to sprints, work items and branches, enabling 360° traceability of the improvements to your BI solution.
Good to Know & Pricing
Most functionalities are free for up to five ‘contributors’ (= people using DevOps who require more elevated rights than just viewing). As we like to keep our teams lean and mean, small to medium projects can often use this service for free. If you plan on using Test Plans, be aware that it requires a more costly licence plan; if you happen to have a Visual Studio Enterprise subscription, it is automatically included. Next to that, Azure DevOps is easily accessible from other tools: export and manage your own data using Power BI, PowerApps and Power Automate.
Taking into account all the above functionalities, the low pricing and the seamless integration, Lytix is convinced that the Azure DevOps suite adds value to BI projects.
Sander Allert is an experienced BI architect with a passion for following new trends. Sander is passionate about data in all its aspects (Big Data, Data Science, Self-Service BI, Master Data, …) and loves to share his knowledge. If you need help with architectural decisions, do not hesitate to invite Sander over for a coffee to share some ideas.