4 February 2025
Key takeaways
Until recently, visualising data in PowerBI always required connecting to other platforms, such as Azure, SharePoint, or Excel. This meant that control of data storage was often outside the PowerBI ecosystem. However, with the latest Fabric update, it is now possible to create an end-to-end solution where the database and the PowerBI report reside in the same environment. In this guide, I explain how to set up this solution using five essential elements: Fabric SQL Database, Lakehouse, Dataflow, Data Pipeline and PowerBI Report.
SQL Database
The first step is to create a SQL database using Fabric SQL Database. This component works just like a traditional database, allowing you to ingest, update and query data. You can interact with the database using SQL queries or, more intuitively, using other tools such as Dataflows.
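To make the ingest, update and query pattern concrete, here is a minimal sketch using Python's built-in sqlite3 module as a local stand-in for the Fabric SQL Database (the table and column names are invented for illustration; in Fabric you would run the same SQL statements in the query editor or through a client connection):

```python
import sqlite3

# Local stand-in for a Fabric SQL Database; the SQL itself is standard.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Ingest: create a table and load a few rows (hypothetical sales data).
cur.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO sales (region, amount) VALUES (?, ?)",
    [("North", 120.0), ("South", 80.0), ("North", 40.0)],
)

# Update: correct a value after the fact.
cur.execute("UPDATE sales SET amount = 90.0 WHERE region = 'South'")

# Query: aggregate per region, giving a report-ready view of the data.
cur.execute("SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region")
rows = cur.fetchall()
conn.close()
print(rows)  # [('North', 160.0), ('South', 90.0)]
```

The same three operations carry over directly: only the connection changes when you move from a local sketch to the Fabric environment.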
Lakehouse
Although Fabric SQL Database doesn’t offer a direct connection to PowerBI, Lakehouse does. Lakehouse thus becomes the final data repository, allowing integration with various sources. This element is ideal for centralising data in one place, and it can even connect to several databases at once.
Dataflow
Dataflow is the component responsible for extracting, transforming and loading (ETL) data. Through it, you configure how data is pulled from its sources, reshaped and written into the repositories. For this solution, you can use one or two Dataflows:
- A Dataflow to connect the SQL Database to the Lakehouse.
- A second, optional Dataflow to connect external data sources, such as Excel, to your database (if you choose this route for ingesting data into the database).
If you instead populate the database conventionally via SQL, you won’t need this second Dataflow: all control happens directly in the database.
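Conceptually, every Dataflow performs the same three steps. The sketch below illustrates them in plain Python; the real Dataflow is configured visually with Power Query, and the source shape and field names here are invented for illustration:

```python
# Illustrative ETL sketch of what a Dataflow does conceptually.
# Extract: rows as they might arrive from an Excel source (hypothetical shape).
raw_rows = [
    {"Region": " north ", "Amount": "120.50"},
    {"Region": "South", "Amount": "80.00"},
    {"Region": None, "Amount": "15.00"},  # incomplete row
]

def transform(rows):
    """Transform: drop incomplete rows, normalise text, cast types."""
    cleaned = []
    for row in rows:
        if not row["Region"]:
            continue  # filter out rows with a missing key field
        cleaned.append({
            "Region": row["Region"].strip().title(),
            "Amount": float(row["Amount"]),
        })
    return cleaned

# Load: here we simply collect the result; in Fabric the destination
# would be the SQL Database or the Lakehouse.
loaded = transform(raw_rows)
print(loaded)
```

Whichever destination you choose, the value of the Dataflow is that these cleaning rules run automatically on every refresh, rather than being repeated by hand.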
Data Pipeline
Although Dataflows are enough to integrate data, adding automation, such as schedules and triggers, can be useful. The Data Pipeline allows you to build automated processes, linking different elements and defining actions for success or failure scenarios.
With Data Pipelines, you can schedule data loading as often as you like (daily, weekly, monthly, etc.) and set up notifications in the event of a process failure using communication platforms such as Outlook or WhatsApp. This gives you greater control over the entire data ecosystem in Fabric.
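The run-and-branch logic a Data Pipeline encodes can be sketched as follows. All of the functions are hypothetical placeholders for pipeline activities; in Fabric you wire these branches together visually rather than in code:

```python
# Sketch of a pipeline run with success and failure branches.
# Every function here is a hypothetical stand-in for a pipeline activity.
notifications = []

def run_dataflow() -> bool:
    """Stand-in for the 'refresh Dataflow' activity; True means success."""
    return True

def notify(channel: str, message: str) -> None:
    """Stand-in for a notification activity (e.g. an Outlook message)."""
    notifications.append((channel, message))

def run_pipeline() -> str:
    if run_dataflow():
        # On-success branch: carry on to the next activity.
        return "succeeded"
    # On-failure branch: alert the owner and stop.
    notify("Outlook", "Dataflow refresh failed")
    return "failed"

status = run_pipeline()
print(status)  # succeeded
```

The point of the sketch is the branching: a pipeline lets you attach a distinct action to each outcome, so failures surface immediately instead of silently leaving reports stale.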
PowerBI Report
The final element of the solution is data visualisation via a PowerBI Report. To use the data prepared in the Lakehouse, set up an SQL endpoint connection via the Lakehouse connector in PowerBI. You can then create and publish the desired reports directly, making the solution fully operational in the service.
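PowerBI reaches the Lakehouse through its built-in connector, but the same SQL endpoint is also exposed to ordinary database clients. As a rough sketch, a client-side connection string might be assembled like this; the server and database names are placeholders (copy the real SQL connection string from the Lakehouse settings in Fabric), and the driver/authentication values assume the Microsoft ODBC Driver for SQL Server:

```python
# Sketch: building a connection string for the Lakehouse SQL endpoint.
# All names below are placeholders, not real endpoints.
def sql_endpoint_conn_str(server: str, database: str) -> str:
    return (
        "Driver={ODBC Driver 18 for SQL Server};"
        f"Server={server};"
        f"Database={database};"
        "Authentication=ActiveDirectoryInteractive;"
    )

conn_str = sql_endpoint_conn_str(
    "myworkspace.datawarehouse.fabric.microsoft.com",  # placeholder endpoint
    "MyLakehouse",                                     # placeholder lakehouse name
)
# With pyodbc installed, you would then open the connection with:
#   pyodbc.connect(conn_str)
print(conn_str)
```

For the report itself you would simply pick the Lakehouse connector inside PowerBI; the sketch is only to show that the endpoint behaves like any other SQL server.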
Conclusion
With the latest version of Fabric, storing and exploring data has become much simpler. With the elements and tips presented in this guide, you can build end-to-end solutions that are straightforward and secure, while keeping full control of the data in one place.