
In today’s dynamic digital world, handling vast amounts of data across the organization is challenging. It takes a lot of time and effort to set up different resources for each task and duplicate data repeatedly. Picture a world where you don’t have to juggle multiple copies of data or struggle with integration issues.

Microsoft Fabric makes this possible by introducing a unified approach to data management. With its data fabric approach, it reduces unnecessary data replication, centralizes storage, and creates a unified environment for the whole organization.

What is Microsoft Fabric?

Microsoft Fabric is a cutting-edge analytics platform that helps data experts and companies work together on data projects. It is based on a SaaS model that provides a unified platform for all tasks like ingesting, storing, processing, analyzing, and monitoring data.

With this full-fledged solution, you don’t have to spend all your time and effort combining different services or duplicating data.

 

Overview of OneLake

 

Fabric features a lake-centric architecture with a central repository known as OneLake. Built on Azure Data Lake Storage (ADLS), OneLake supports various data formats, including Delta, Parquet, CSV, and JSON, and offers a unified data environment for each of Microsoft Fabric’s experiences.

These experiences support professionals across the full workflow: ingesting data from different sources into a unified environment, building pipelines for ingestion, transformation, and processing, developing predictive models, and analyzing the data through visualization in interactive BI reports.

Microsoft Fabric’s experiences include: 

  • Synapse Data Engineering 
  • Synapse Data Warehouse 
  • Synapse Data Science 
  • Synapse Real-Time Intelligence 
  • Data Factory 
  • Data Activator  
  • Power BI

 


 

Exploring Microsoft Fabric Components: Sales Use Case

Microsoft Fabric offers a set of analytics components that are designed to perform specific tasks and work together seamlessly. Let’s explore each of these components and its application in the sales domain: 

Synapse Data Engineering:

Synapse Data Engineering provides a powerful Spark platform designed for large-scale data transformations through the Lakehouse.

In the sales use case, it facilitates the creation of automated data pipelines that handle data ingestion and transformation, ensuring that sales data is consistently updated and ready for analysis without manual intervention.
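As a minimal sketch of such a transformation step (the file path and column names are assumptions, and spark is the session a Fabric notebook provides):

```python
from pyspark.sql import functions as F

# Read the raw sales file from the lakehouse (path and columns are hypothetical).
raw = spark.read.option("header", True).csv("Files/new_data/sales.csv")

# Standardize types and derive a revenue column before analysis.
clean = (
    raw.withColumn("Quantity", F.col("Quantity").cast("int"))
       .withColumn("UnitPrice", F.col("UnitPrice").cast("double"))
       .withColumn("Revenue", F.col("Quantity") * F.col("UnitPrice"))
       .dropna(subset=["SalesOrderNumber"])
)

# Persist as a Delta table so downstream Fabric workloads can consume it.
clean.write.format("delta").mode("overwrite").saveAsTable("sales_clean")
```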

Synapse Data Warehouse:

Synapse Data Warehouse represents the next generation of data warehousing, supporting an open data format. The data is stored in Parquet format with Delta Lake logs, supporting ACID transactions and enabling interoperability across Microsoft Fabric workloads.

In the sales context, this ensures that sales data remains consistent, accurate, and easily accessible for analysis and reporting. 
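For instance, because the tables are Delta-backed, a daily sales upsert can be applied as one atomic operation, so readers never observe a partially applied batch. A minimal Spark SQL sketch, assuming hypothetical sales_warehouse and sales_daily_updates Delta tables:

```python
# ACID upsert: either the whole daily batch lands or none of it does.
spark.sql("""
    MERGE INTO sales_warehouse AS target
    USING sales_daily_updates AS source
    ON target.order_id = source.order_id
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")
```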

Synapse Data Science:

Synapse Data Science empowers data scientists to work directly with secured and governed sales data prepared by engineering teams, allowing for the efficient development of predictive models.

By forecasting sales performance, businesses can identify anomalies or trends, which are crucial for directing future sales strategies and making informed decisions.
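As a minimal sketch of the idea (the figures are made up, and scikit-learn is assumed to be available in the notebook environment), a simple regression over monthly revenue can produce a baseline forecast:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical monthly revenue history (in $k); in practice this would be
# read from the governed sales tables prepared by the engineering team.
months = np.arange(1, 13).reshape(-1, 1)
revenue = np.array([10.2, 11.0, 10.8, 12.1, 12.5, 13.0,
                    12.8, 13.5, 14.1, 14.0, 14.8, 15.2])

model = LinearRegression().fit(months, revenue)
forecast = model.predict([[13]])  # forecast the next month
print(f"Forecast for month 13: {forecast[0]:.1f}k")
```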

 


 

Synapse Real-Time Intelligence:

Real-Time Intelligence in Synapse provides a robust solution to gain insights and visualize event-driven scenarios and streaming data logs. In the sales domain, this enables real-time monitoring of live sales activities, offering immediate insights into performance and rapid response to emerging trends or issues.  

Data Factory:

Data Factory enhances the data integration experience by offering support for over 200 native connectors to both on-premises and cloud data sources.

For the sales use case, this means professionals can create pipelines that automate data ingestion and transformation, ensuring that sales data is always up to date and ready for analysis.

Data Activator:

Data Activator is a no-code experience in Microsoft Fabric that automatically takes action when specific patterns or conditions are detected in changing data.

In the sales context, this helps monitor sales data in Power BI reports and trigger alerts or actions based on real-time changes, ensuring that sales teams can respond quickly to critical events. 

Power BI:

Power BI, integrated within Microsoft Fabric, is a leading Business Intelligence tool that facilitates advanced data visualization and reporting.

For sales teams, it offers interactive dashboards that display key metrics, trends, and performance indicators. This enables a deep analysis of sales data, helping to identify what drives demand and what affects sales performance.

 


 

Hands-on Practice with Microsoft Fabric:

Let’s get started with sales data analysis by leveraging the power of Microsoft Fabric: 

1. Sample Data

The dataset utilized for this example is the sample sales data (sales.csv). 

2. Create Workspace

To work with data in Fabric, first create a workspace with the Fabric trial enabled. 

  • On the home page, select Synapse Data Engineering.
  • In the menu bar on the left, select Workspaces.
  • Create a new workspace with any name and select a licensing mode. When a new workspace opens, it should be empty.

 

Creating workspace on Microsoft Fabric

 

3. Create Lakehouse

Now, let’s create a lakehouse to store the data.

  • In the bottom left corner, select Synapse Data Engineering and create a new Lakehouse with any name.

 

creating lakehouse - Microsoft Fabric

 

  • On the Lake View tab in the pane on the left, create a new subfolder.

 

lake view tab - Microsoft Fabric

 

4. Create Pipeline

To ingest data, we’ll make use of a Copy Data activity in a pipeline. This will enable us to extract the data from a source and copy it to a file in the already-created lakehouse. 

  • On the Home page of Lakehouse, select Get Data and then select New Data Pipeline to create a new data pipeline named Ingest Sales Data. 
  • The Copy Data wizard will open automatically; if it does not, select Copy Data > Use Copy Assistant on the pipeline editor page. 
  • In the Copy Data wizard, on the Choose a data source page, select HTTP in the New sources section.  
  • Enter the settings in the Connect to data source pane as shown:

 

connect to data source - Microsoft Fabric

 

  • Click Next. On the next page, set the Request method to GET, leave the other fields blank, and select Next. 

 

Copy Data wizard configuration steps - sales use case

 

  • When the pipeline starts to run, its status can be monitored in the Output pane. 
  • Now, check the created lakehouse to confirm that the sales.csv file has been copied. 

5. Create Notebook

On the Home page for your lakehouse, in the Open Notebook menu, select New Notebook. 

  • In the notebook, configure one of the cells as a parameter cell (using the Toggle parameter cell option) and declare a variable for the table name.
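A minimal sketch of what the parameter cell might contain (the default value is an assumption for this walkthrough):

```python
# Parameter cell (marked via "Toggle parameter cell" in the notebook UI).
# A pipeline's notebook activity can override this default at run time.
table_name = "sales"
```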

 

create notebook - microsoft fabric

 

  • Select Data Wrangler in the notebook ribbon, and then select the DataFrame we just created from the data file copied by the pipeline. Here, we change the data types of columns and deal with missing values.  

Data Wrangler generates a descriptive overview of the DataFrame, allowing you to transform and process your sales data as required. It is a great tool, especially for data preprocessing in data science tasks.
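Data Wrangler emits the pandas code for the steps you apply in its UI; the following is a hedged sketch of what that generated cleaning code can look like (the column names are assumptions):

```python
import pandas as pd

def clean_data(df: pd.DataFrame) -> pd.DataFrame:
    # Drop rows with missing order identifiers (hypothetical column name).
    df = df.dropna(subset=["SalesOrderNumber"])
    # Standardize column types for downstream analysis.
    df["OrderDate"] = pd.to_datetime(df["OrderDate"])
    df["Quantity"] = df["Quantity"].astype("int64")
    df["UnitPrice"] = df["UnitPrice"].astype("float64")
    return df

# 'df' is the frame loaded from the copied sales.csv file.
df_clean = clean_data(df.copy())
```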

 

data wrangler notebook - microsoft fabric

 

  • Now, we can save the data as Delta tables to use later for sales analytics. Delta tables are schema abstractions over data files stored in Delta format.  
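A minimal sketch of the save step, assuming df_clean and table_name from the earlier cells:

```python
# Convert the cleaned pandas frame to Spark and save it as a managed Delta table.
spark_df = spark.createDataFrame(df_clean)
spark_df.write.format("delta").mode("overwrite").saveAsTable(table_name)
```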

 

save delta tables - microsoft fabric

 

  • Let’s run SQL operations on this Delta table to verify that the table was stored. 
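A quick sanity check from the same notebook, using the Spark SQL API (an equivalent %%sql cell would also work):

```python
# Confirm the Delta table is registered and inspect a few rows.
spark.sql(f"DESCRIBE TABLE {table_name}").show()
spark.sql(f"SELECT * FROM {table_name} LIMIT 5").show()
```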

 

using SQL operations on the delta table - microsoft fabric

 


 

6. Add the Notebook to the Pipeline

Go to the pipeline you created earlier, add a Notebook activity that runs on completion of the Copy Data activity, and apply the following configurations. This way, the table_name base parameter will override the default value of the table_name variable in the parameter cell of the notebook.

 

add notebook activity - microsoft fabric

 

In the Notebook setting, select the notebook you just created. 

7. Schedule and Monitor Pipeline

Now, we can schedule the pipeline.  

  • On the Home tab of the pipeline editor window, select Schedule and enter the scheduling requirements.

 

entering scheduling requirements - microsoft fabric

 

  • To keep track of pipeline runs, add an Office 365 Outlook activity at the end of the pipeline.  
  • In the activity settings, authenticate with the sender account (use your own account in ‘To’). 
  • For the Subject and Body, select the Add dynamic content option to open the pipeline expression builder canvas and add the expressions as follows (select your own activity name in activity()).

 

pipeline expression builder - microsoft fabric

pipeline expression builder 2 - microsoft fabric

loading dynamic content - microsoft fabric

 

8. Use Data from Pipeline in Power BI

  • In the lakehouse, click on the delta table just created by the pipeline and create a New Semantic Model.

 

new semantic model - microsoft fabric

 

  • Once the model is created, the model view opens; click Create New Report.

 

sales - microsoft fabric

 

  • This opens another tab in Power BI, where you can visualize the sales data and create interactive dashboards.

 

power BI - microsoft fabric

 

Choose a visual of interest, right-click it, and select Set Alert. You can also use the Set Alert button in the Power BI toolbar.  

  • Next, define trigger conditions to create a trigger in the following way:

 

create a trigger - microsoft fabric

 

This way, sales professionals can seamlessly use their data across the platform by transforming and storing it in the appropriate format. They can perform analysis, make informed decisions, and set up triggers, allowing them to monitor sales performance and react quickly to changes.

 


 

Conclusion

In conclusion, Microsoft Fabric is a revolutionary all-in-one analytics platform that simplifies data management for enterprises. By providing a unified environment, it eliminates the complexity of stitching together multiple services: data is ingested, processed, and analyzed within the same environment.

With Microsoft Fabric, businesses can streamline data workflows, from data ingestion to real-time analytics, and can respond quickly to market dynamics.

 


Imagine effortlessly asking your business intelligence dashboard any question and receiving instant, insightful answers. This is not a futuristic concept but a reality unfolding through the power of Large Language Models (LLMs).

Descriptive analytics is at the core of this transformation, turning raw data into comprehensible narratives. When combined with the advanced capabilities of LLMs, Business Intelligence (BI) dashboards evolve from static displays of numbers into dynamic tools that drive strategic decision-making. 

LLMs are changing the way we interact with data. These advanced AI models excel in natural language processing (NLP) and understanding, making them invaluable for enhancing descriptive analytics in Business Intelligence (BI) dashboards.

 


 

In this blog, we will explore the power of LLMs in enhancing descriptive analytics and their impact on business intelligence dashboards.

Understanding Descriptive Analytics

Descriptive analytics is the most basic and common type of analytics that focuses on describing, summarizing, and interpreting historical data.

Companies use descriptive analytics to summarize and highlight patterns in current and historical data, enabling them to make sense of vast amounts of raw data to answer the question, “What happened?” through data aggregation and data visualization techniques.
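To make this concrete, here is a minimal pandas sketch (with made-up figures) of descriptive analytics answering “What happened?” through simple aggregation:

```python
import pandas as pd

# Made-up historical sales records standing in for a real data source.
sales = pd.DataFrame({
    "month": ["2024-01", "2024-01", "2024-02", "2024-02"],
    "product": ["A", "B", "A", "B"],
    "revenue": [1200, 800, 1500, 700],
})

# Aggregate history to describe what happened, per month and per product.
print(sales.groupby("month")["revenue"].sum())
print(sales.groupby("product")["revenue"].sum().sort_values(ascending=False))
```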

The Evolution of Dashboards: From Static to LLM

Initially, dashboards served as simplified visual aids, offering a basic overview of key metrics amidst cumbersome, text-heavy reports.

However, as businesses began to demand real-time insights and more nuanced data analysis, the static nature of these dashboards became a limiting factor, forcing them to evolve into dynamic, interactive tools. Dashboards transformed into self-service BI tools with drag-and-drop functionality and an increased focus on interactive, user-friendly visualization.

That was not all: as data volumes kept growing, Business Intelligence (BI) dashboards shifted to cloud-based and mobile platforms, facilitating integration with various data sources and allowing remote collaboration. Finally, the integration of Business Intelligence (BI) dashboards with LLMs has unlocked the remarkable potential of analytics.

 


 

Role of Descriptive Analytics in Business Intelligence Dashboards and Its Limitations

Despite these shifts, dashboard analysis before LLMs remained limited in its ability to provide contextual insights and advanced data interpretation, offering a retrospective view of business performance without predictive or prescriptive capabilities. 

The following are the basic capabilities of descriptive analytics:

Defining Visualization

Descriptive analytics explains visualizations like charts, graphs, and tables, helping users quickly grasp key insights. However, producing these descriptions requires manually interpreting the insights derived from SQL queries, which demands analytics expertise and knowledge of SQL. 

Trend Analysis

By identifying patterns over time, descriptive analytics helps businesses understand historical performance and predict future trends, making it critical for strategic planning and decision-making.

However, traditional analysis in Business Intelligence (BI) dashboards may struggle to identify intricate patterns within vast datasets, producing inaccurate results that can critically impact business decisions. 

Reporting

Reports developed through descriptive analytics summarize business performance. These reports are essential for documenting and communicating insights across the organization.

However, extracting insights from dashboards and presenting them in an understandable format can take time and is prone to human error, particularly when dealing with large volumes of data.

 


 

LLMs: A Game-Changer for Business Intelligence Dashboards

Advanced Query Handling 

Imagine you want to know, “What were the top-selling products last quarter?” Conventionally, data analysts would write an SQL query or create a report in a Business Intelligence (BI) tool to find the answer. Wouldn’t it be easier to ask such questions in natural language?  

LLMs enable users to interact with dashboards using natural language queries. This innovation acts as a bridge between natural language and complex SQL queries, enabling users to engage in a dialogue, ask follow-up questions, and delve deeper into specific aspects of the data.
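For illustration, consider the query an analyst would traditionally hand-write for the question above (the table and column names are hypothetical); an LLM-backed dashboard generates and runs the equivalent on the user’s behalf:

```python
question = "What were the top-selling products last quarter?"

# The SQL a data analyst would traditionally write by hand
# (hypothetical schema: sales(order_date, product_name, revenue)).
generated_sql = """
SELECT product_name, SUM(revenue) AS total_revenue
FROM sales
WHERE order_date >= DATE '2024-04-01'
  AND order_date <  DATE '2024-07-01'
GROUP BY product_name
ORDER BY total_revenue DESC
LIMIT 10
"""
# An LLM-enabled dashboard translates `question` into SQL like the above,
# executes it against the data model, and answers conversationally.
```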

Improved Visualization Descriptions

Advanced Business Intelligence (BI) tools integrated with LLMs offer natural language interaction and automatic summarization of key findings. They can automatically generate narrative summaries, identify trends, and answer questions about complex datasets, offering a comprehensive view of business operations and trends with minimal effort.

Predictive Insights

With the integration of a domain-specific Large Language Model (LLM), dashboard analysis can be expanded to offer predictive insights, enabling organizations to leverage data-driven decision-making, optimize outcomes, and gain a competitive edge.

Dashboards supported by Large Language Models (LLMs) utilize historical data and statistical methods to forecast future events. Hence, descriptive analytics goes beyond “what happened” to “what happens next.”

Prescriptive Insights

Beyond prediction, descriptive analytics powered by LLMs can also offer prescriptive recommendations, moving from “what happens next” to “what to do next.” By considering numerous factors, preferences, and constraints, LLMs can recommend optimal actions to achieve desired outcomes. 

 


 

Example – Power BI

The Copilot integration in Power BI offers advanced Business Intelligence (BI) capabilities, allowing you to ask Copilot for summaries, insights, and answers about visuals in natural language. Power BI has paved the way for richer data discovery, from uncovering insights to highlighting key metrics, with the power of generative AI.

Here is how you can get started using Power BI with Copilot integration:

Step 1

Open Power BI and create a workspace. (To use Copilot, you need to select a workspace that uses a Power BI Premium capacity or a paid Microsoft Fabric capacity.)

Step 2

Upload your business data from various sources. You may also need to clean and transform the data to gain better insights. For this example, sample ‘sales data for hotels and resorts’ is used.

 

Uploading data

 

Step 3

Use Copilot to unleash the potential insights of your data. 

Start by creating reports in the Power BI service or Desktop. Copilot can create insightful reports for descriptive analytics from requirements you provide in natural language.  

For example, here a report is created using the following prompt:

 

report creation prompt using Microsoft Copilot - business intelligence dashboards
An example of a report creation prompt using Microsoft Copilot – Source: Copilot in Power BI Demo

 

Copilot has created a report for the customer profile that includes the requested charts and slicers and is fully interactive, providing options to conveniently adjust the outputs as needed. 

 

Power BI report created using Microsoft Copilot - business intelligence dashboards
An example of a Power BI report created using Microsoft Copilot – Source: Copilot in Power BI Demo

 

Not only this, but you can also ask analysis questions about the reports as explained below.

 

asking analysis question from Microsoft Copilot - business intelligence dashboards
An example of asking analysis question from Microsoft Copilot – Source: Copilot in Power BI Demo

 

Copilot responds by adding a new page to the report. It explains the ‘main drivers for repeat customer visits’ using advanced analysis capabilities to find key influencers for variables in the data. As a result, it can be seen that the ‘Purchased Spa’ service has the biggest influence on customer returns, followed by the ‘Rented Sports Equipment’ service.

 

example of asking analysis question from Microsoft Copilot - business intelligence dashboards
An example of asking analysis questions from Microsoft Copilot – Source: Copilot in Power BI Demo

 

Moreover, you can ask Copilot to include, exclude, or summarize any visuals or pages in the generated reports. Beyond generating reports, you can also point Copilot at an existing dashboard to question or summarize its insights, or to quickly create a narrative for any part of the report. 

Below, you can see how Copilot has generated a fully dynamic narrative summary for the report, highlighting useful insights from the data along with proper citations showing where within the report the data was taken from.

 

narrative generation by Microsoft PowerBI Copilot - business intelligence dashboards
An example of narrative generation by Microsoft Power BI Copilot – Source: Copilot in Power BI Demo

 

Microsoft Copilot also simplifies Data Analysis Expressions (DAX) by generating and editing these complex formulas. In Power BI, navigate to the Quick Measure button in the Calculations section of the Home tab. (If you do not see ‘Suggestions with Copilot,’ you may enable it in settings; otherwise, you may need to ask your Power BI administrator to enable it.)

Quick measures are predefined measures that eliminate the need to write your own DAX syntax. They are generated automatically from the input you provide in natural language via the dialog box, execute a series of DAX commands in the background, and display the outcomes for use in your report.

 

Quick Measure – Suggestions with Copilot - business intelligence dashboards
Quick Measure – Suggestions with Copilot

 

In the example below, Copilot suggests a quick measure based on the data and generates the corresponding DAX formula as well. If you find the suggested measure satisfactory, you can simply click the “Add” button to seamlessly incorporate it into your model.

 

DAX generation using Quick Measure - business intelligence dashboards
An example of DAX generation using Quick Measure – Source: Microsoft Learn

 

With clear and understandable prompts, there is much more you can do with Copilot: ask questions about your data and generate even more insightful reports for your Business Intelligence (BI) dashboards.  

Hence, we can say that Power BI with Copilot has proven to be a transformative force in the landscape of data analytics, reshaping how businesses leverage their data’s potential.

 


 

Embracing the LLM-led Era in Business Intelligence

Descriptive analytics is fundamental to Business Intelligence (BI) dashboards, providing essential insights through data aggregation, visualization, trend analysis, and reporting. 

The integration of Large Language Models enhances these capabilities by enabling advanced query handling, improving visualization descriptions and reporting, and offering predictive and prescriptive insights.

This new LLM-led era in Business Intelligence (BI) is transforming the dynamic landscape of data analytics, offering a glimpse into a future where data-driven insights empower organizations to make informed decisions and gain a competitive edge.