Imagine having your data model ready and everyone loves it. Then, all of a sudden, you get this question:
“I have made this Excel file and I would like to combine it with the Sales dataset in Power BI.”
The next thing you do is analyze the request and, for some reason (maybe you don’t want to flood the data warehouse with Excel files, or maybe your ETL team is swamped with requests), you have to say no.
The business user is quite disappointed, because it would have solved that time-consuming task he or she had to do every week.
What if I told you that all the above can be avoided with a very simple solution?
Composite Models are the answer!
These models are a way to seamlessly combine different data sources in Power BI.
With composite models you can connect to your Azure Analysis Services (AAS) model or Power BI dataset and enrich it even further by connecting to a second model, adding flat files, or even just building measures on top of your AAS model.
This is all super exciting for both business users and BI developers as it will create even more analytical opportunities.
Now, I know what you are thinking. Let’s take a look at this feature!
I love your eagerness but before we dive into it, you should know that this feature has the following Limitations.
The following live-connect multidimensional sources can’t be used with composite models:
- SAP Business Warehouse
- SQL Server Analysis Services
- Power BI datasets
- Azure Analysis Services
As of December 2020, DirectQuery for Power BI datasets and Analysis Services is available in public preview. If you would like to use composite models with Power BI datasets and AAS, you should enable this feature.
Now, with that out of the way, let’s get to the exciting part!
In our example, we will use a live connection to a Power BI dataset and an imported Excel file.
First of all, we connect to the Power BI dataset we want to use. Once this is done and you are ready to add a different source to the model, or just to add your own calculated columns, you need to go to Transform Data.
You will be prompted with the above information.
As stated previously, a DirectQuery connection is required; this is why the DirectQuery for Power BI datasets and Analysis Services preview feature had to be enabled.
It will also notify you that this change is permanent: once performed and saved, you cannot change it back. This is very important to keep in mind.
When we are ready to continue we can click on the Add to Local Model button and a DirectQuery Connection will be made.
You know it will be successful when you see the following text in the bottom right corner.
Now we can do quite a few things to the existing model that we couldn’t do before.
For example, if you previously right clicked on a table you got the below options.
However, now we can actually create new columns, rename or hide tables, measures and attributes, and create relationships.
This last one will be very handy when adding different sources to your model.
Now we can actually add a different source to our model. This can range from Excel or SharePoint files to other Power BI datasets.
Once you add another data source you will once again be prompted with a message.
Of course, when combining different data sources through DirectQuery there is always the possibility that data can be included in queries sent from one data source to the other.
Always be certain of the sources you are pulling your data from. Once you are sure this is no issue for you, we can continue.
Both models will now exist separately from one another and can be joined through a relationship you define if required.
Different visuals can now be built using data from both (or more) sources.
All of this is amazing and a great leap forward for both developers and advanced business users.
PowerBI is a tool that keeps improving and each iteration brings in more capabilities to deliver high-quality dashboards to the end-user in a short amount of time.
If you are interested in getting to know more about Composite Models or other Microsoft BI-related aspects you can always contact Lytix.
In this day and age, apps are all around us: on our phones and laptops, even in stores and restaurants. An app can support business processes such as consulting balances, inputting records, grouping products, planning activities and making prognoses. However, to build those apps, someone with the right expertise needs to spend time developing an app and implementing it in a system that does not always fully support it. All this demands so much time that there is no way to simply build small apps for processes like keeping inventory of company assets or inputting KPIs. Unless you use Power Apps… From an app which allows you to order sandwiches to one that makes accounting easier: you can do it all with Power Apps.
Would you like to possess the skill set to develop your own app with the Power Platform? If you can imagine it, Power Apps can build it. Low-code and no-code to the rescue to enhance your organization’s digital agility. Feel free to subscribe to this online webinar by using the following link.
What is Power Apps?
Power Apps is an app-building tool that is part of Microsoft’s Power Platform. Power Apps’ goal is to assist business users in creating their own applications while significantly reducing development time. You can build applications for desktop, tablet AND mobile, depending on how you want to use them. It integrates well with O365 and can even be embedded in Power BI reports and vice versa. This makes it a standardized platform for which account management falls under O365. Power Apps allows anyone with an O365 account within the organization to build apps in a simple way, with minimal code, for the whole organization. In just one day it is possible to build a value-added application with requirements specific to the organization. You can link your Power App with external data sources or even with Microsoft’s Common Data Service (CDS). Using Power Automate on top of that expands the possibilities by a huge factor. Next to assisting in basic processes in your organization, there is also an AI Builder in Power Apps. With this feature, you can scan business cards, forms, objects and text in images. Most of these features are still in preview and require you to train an AI model.
Common Data Services (CDS)
Think of CDS as a centralized data source for your Power App as well as for other tools like SharePoint and Power BI. However, if CDS is not the best choice for your organization, it is always possible to use other data sources.
Power Automate
Power Automate allows us to build workflows, guided and without code, across cloud services, and makes it possible to create workflows in a small or large context. It automates time-consuming tasks with built-in AI. Just like Power Apps, it falls under the Power Platform, with all the advantages that brings.
Here at Lytix we keep up with the latest technologies, and this includes Power Apps. Power Apps in itself is not that difficult to work with; that is why we at Lytix can deliver a Power App specific to your company within one to two days. It is mainly the standardized platform and the “low code, no code” principle that make it possible to deploy perfectly well-functioning apps within two days. With the recent corona outbreak, for example, it would be possible to build a quick app with standard questions and connect it to a database for reporting within a couple of hours. Of course, nobody is expected to build advanced apps right from the start. That is why we offer guidance, processes and best practices to our customers while developing new apps.
We use Power Apps at one out of three customers. The apps we implement all have different uses: managing sales goals, managing the marketing calendar or creating a master-data environment (… and getting rid of manual Excel files). Power Apps allows our customers to get an easy overview of how well a certain store, department or branch is doing compared to its sales goals, as well as letting them change existing goals or fill in new ones. Another app makes it possible for our customers to easily connect promotions and customer segments to each other in the database. Next to that, our customers can also add new actions and customer segments straight into their database, allowing immediate reporting on them if so desired.
In a nutshell
Power Apps benefits your organization in the following ways:
- Quick and easy apps to complete the company’s business processes.
- A familiar and standardized way to develop apps makes Power Apps more accessible to employees all over the organization.
- Low code, no code.
- Login and account usage happens within the O365 platform, making it a central point for account management.
- The Common Data Service (CDS) allows the dataset to be centralized and used by other apps that are part of the Power Platform.
If you want to know more about Power Apps and the Power Platform or hire us, you can contact us through the following channels.
‘Cloud’ or ‘Clowned’? With the high adoption of cloud computing in all processes, analytics is the area that can benefit from it the most. Analytics here means a data warehouse used for reporting or ad-hoc analysis, a data lake which serves as a repository for value-added data, and Big Data scenarios in which one server isn’t sufficient to meet your analytical needs. Typical for analytics are the combination of different sources, a high variation in usage (some days it is used intensively, while on others it is not consulted at all) and the need for lots of computational power.
As a Cloud PaaS solution Snowflake is taking the Data world by storm. This tool has a lot of benefits but is it really as good as it claims to be? Can it address the challenges listed above? This blog post will go through the most prominent advantages and disadvantages of Snowflake.
But… what is PaaS?
Before we dive into the nitty-gritty of Snowflake, it is important to define what a ‘Platform as a Service’ (PaaS) is.
As the name suggests, a platform is offered as a service to customers, who can carry out the required developments without having to take care of the underlying infrastructure. That infrastructure is provided and maintained by the owner of the PaaS solution.
So, everything is set for me to develop! What are the limitations then?
Well, exactly that. Everything is set up for you, so you cannot change anything in the infrastructure. Any limitations the solution has, you will either need to live with or hope that they are solved by the provider.
This is an issue you don’t have when you build it yourself: if you need additional infrastructure developments or changes, you can plan and execute them yourself.
It is always best to look at what the PaaS solution can offer you and whether it ticks all the boxes you need.
Where does Snowflake fit in here?
Snowflake offers a lot of capabilities you can use during your journey through data without having the issue of taking care of your environment. All the technical set-up is done for you, hence it is PaaS.
What makes Snowflake so special besides being PaaS?
We will look into a few aspects that make Snowflake stand out.
One of the main selling points of Snowflake is that it is fully cloud-based and not an on-premises system made available in the cloud.
It is available on every major cloud platform on the market: Azure, AWS and Google Cloud. On top of this, Snowflake maintains a cross-cloud strategy.
This strategy ensures the following:
- The possibility to negotiate the best cloud environment without having any vendor lock-in.
- In case your chosen cloud provider faces an outage, you can simply switch to another cloud provider and ensure uptime of your systems.
Data Storage Capacity
You often hear that data is the new currency. Nothing could be more true in the business world. Companies keep collecting data, and so their volumes keep growing.
Luckily, Snowflake has no limit on data storage. As the storage and compute layers are separated, you can just dump all relevant business data into Snowflake, which will then treat it as a data lake (read: cheap storage and a practically infinite storage amount).
Of course, you might not need a large amount now, but in the years to come you expect your data volumes to increase. If you are provided with a large volume of value-added data, you do not need to hesitate and can add it directly to Snowflake. All data residing within Snowflake (your data lake) is directly accessible.
Another nice storage feature is that if you copy a database, it will not take double the space. Snowflake recognizes the records that are the same and only requires space for different or new records.
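The copy-without-double-storage behaviour can be illustrated conceptually. The sketch below is a simplification (not Snowflake's actual storage engine): a copy starts out referencing the same immutable data blocks as the original, so only blocks that differ afterwards consume extra storage.

```python
# Conceptual sketch: a copied database shares unchanged storage blocks
# with the original; only modified or new blocks take extra space.
# Block names are illustrative placeholders.

original = {"block1", "block2", "block3"}   # storage blocks of the source
copy = set(original)                        # copy references the same blocks

def storage_used(*databases):
    """Total distinct blocks actually kept on disk."""
    return len(set().union(*databases))

# Right after copying, nothing is duplicated.
assert storage_used(original, copy) == 3

# Changing a record in the copy rewrites only the affected block.
copy.discard("block2")
copy.add("block2_v2")
assert storage_used(original, copy) == 4    # one extra block, not six
```

The key point is that storage grows with the *difference* between the two databases, not with the number of copies.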
Dynamic Scaling
Snowflake often uses the phrase ‘pay for what you need’, which is exactly what dynamic scaling is for. This is best explained with an example:
Imagine you run your daily load in the morning between 3 AM and 4 AM. At the start of the working day, a lot of users are interested in the latest insights available within Snowflake (let’s say between 8 AM and 10 AM). For the rest of the day, only one super user sends maybe one query now and then.
In a more traditional cost-effective cloud scenario (not Snowflake), you would start up your BI system only for the moments it is planned to be used (3 AM to 4 AM and 8 AM to 10 AM). Such a BI environment takes a while to spin up and spin down.
Snowflake doesn’t incur a delay in spinning up and down and is thus directly accessible, even if no cluster is running. We often configure it in such a way that it immediately spins up whenever someone sends a query and automatically spins down after one minute of inactivity. Paying only when compute is being used will save your organization a lot of money!
“Snowflake directly spins up when users issue queries at the database, even in an idle state. In addition, Snowflake automatically pauses after one minute of inactivity. This combination saves a large amount of infrastructure costs that would otherwise be billed.”
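To see why pay-per-use matters, here is a back-of-the-envelope comparison for the usage pattern above. The credit rate and price per credit are made-up placeholder numbers, not Snowflake's actual pricing; only the shape of the calculation matters.

```python
# Rough daily-cost sketch: always-on warehouse vs. auto-suspend.
# CREDITS_PER_HOUR and PRICE_PER_CREDIT are assumed illustrative values.

CREDITS_PER_HOUR = 1.0   # assumed consumption for one warehouse size
PRICE_PER_CREDIT = 3.0   # assumed dollars per credit

def daily_cost(hours_running):
    return hours_running * CREDITS_PER_HOUR * PRICE_PER_CREDIT

# Always-on: billed for all 24 hours, regardless of actual use.
always_on = daily_cost(24)

# Auto-suspend: ETL load (3-4 AM) + morning peak (8-10 AM) + roughly
# half an hour of scattered ad-hoc queries over the rest of the day.
auto_suspend = daily_cost(1 + 2 + 0.5)

assert always_on == 72.0
assert auto_suspend == 10.5   # a fraction of the always-on bill
```

With these assumed numbers the pay-per-use bill is about 15% of the always-on one; the exact ratio depends on your real usage pattern and pricing tier.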
Instant Adjustable Processing
One of the most interesting features is the ability to change the size of your warehouse. The bigger the size, the faster big queries can be executed. This can be changed on the fly depending on the needs of the organization at that point in time. Snowflake uses T-shirt sizes (XS, S, M, L, XL, XXL, XXXL) and each size doubles the previous one (a Medium equals twice the processing power of a Small).
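The doubling rule above can be captured in a few lines. The credit figures assume XS corresponds to one unit per hour, which matches the doubling pattern described here but should be verified against Snowflake's current documentation.

```python
# Each T-shirt size offers twice the processing power (and credit
# consumption) of the previous one; XS = 1 unit/hour is an assumption.

SIZES = ["XS", "S", "M", "L", "XL", "XXL", "XXXL"]

def credits_per_hour(size):
    """Relative hourly consumption under the doubling rule."""
    return 2 ** SIZES.index(size)

assert credits_per_hour("XS") == 1
assert credits_per_hour("M") == 4     # twice an S, four times an XS
assert credits_per_hour("XXXL") == 64
```

The practical consequence: a query that scales well may finish roughly twice as fast on the next size up, at roughly the same total cost, since you pay double the rate for half the time.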
Caching
It is very important to optimize your query usage as much as possible, since you are paying per usage. Luckily, Snowflake has this covered for you! If you execute a query now and the data doesn’t change, the result is cached in Snowflake’s engine. This means that the second time you execute the query, it will not cost you anything, as you will not be querying your data directly but the cached version.
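Conceptually, this result cache behaves like memoization keyed on the query and the state of the underlying data. The sketch below is a deliberately simplified illustration of that idea, not Snowflake's actual caching implementation.

```python
# Toy model of result caching: rerunning the same query against
# unchanged data returns the stored result instead of recomputing.

result_cache = {}
executions = 0   # counts how often we actually "hit the data"

def run_query(sql, data_version):
    global executions
    key = (sql, data_version)
    if key not in result_cache:
        executions += 1                       # cache miss: do the work
        result_cache[key] = f"rows for {sql!r} @ v{data_version}"
    return result_cache[key]

run_query("SELECT * FROM sales", data_version=1)
run_query("SELECT * FROM sales", data_version=1)   # served from cache
assert executions == 1

run_query("SELECT * FROM sales", data_version=2)   # data changed: recompute
assert executions == 2
```

The second identical query triggers no new execution; only a change in the underlying data invalidates the cached result.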
That is quite a lot of features! Are there any others?
There are. We won’t go into detail about all of them, but here are some smaller, yet still interesting, features.
- Time-Travel feature: the ability to view data from the past.
- Copy-Clone: you can copy an existing database and it is directly available for use.
- You can query directly on JSON, XML, … files.
- Active Directory integration.
Are there any aspects you would say Snowflake is not suited for?
You must keep in mind that Snowflake is, at its core, designed for handling Big Data, data warehouses and so on. If your interest or use case lies in data science, we would suggest you look at tools built with this in mind, such as Databricks and Spark. More info regarding Databricks can be found on our blog here.
Very interesting! How can I get in touch?
If you need help (development, architecture, strategy, …) with one of your BI implementations, Lytix can help you. Lytix believes in the Snowflake platform and knows its advantages and disadvantages inside out. Our ETL framework ‘XTL’ uses Snowflake as a database engine and can provide you with a head-start in an implementation.