In today’s fast-paced world, businesses demand real-time insights to stay competitive and agile. By integrating Azure, Python, and Power BI, you can create powerful dashboards that update automatically with live data: no manual refreshes, no delays. In this blog, you’ll learn how to fetch real-time exchange rate data from an open API, transform it with Python, and visualize it instantly using Power BI. Whether you’re in finance, operations, or customer analytics, this end-to-end solution will improve decision-making, streamline processes, and deliver impactful insights right when you need them.

Explore Azure Cloud
Azure Cloud, officially known as Microsoft Azure, is a comprehensive cloud computing service by Microsoft. It facilitates building, testing, deploying, and managing applications through Microsoft-managed data centers. Azure offers a broad range of services, including computing power, storage, analytics, and tools for developers.
Explore the benefits of using Python, Azure, and Power BI
Combining Python, Azure, and Power BI provides significant advantages for data processing, analysis, and visualization. These technologies integrate seamlessly, offering a powerful solution for developing data-driven applications. Real-time data is crucial for timely decision-making, operational efficiency, enhancing customer experiences, and staying competitive. It allows organizations to monitor and respond to dynamic situations and capitalize on new opportunities.
Explore the topics we cover

This post covers how to leverage Azure Event Hubs, Azure Stream Analytics, and Power BI to build an end-to-end real-time data analytics and visualization solution. We’ll demonstrate how to fetch real-time exchange rate data from the Exchange Rates API and create dynamic visualizations. Register at Exchange Rates API to access the necessary API.
Azure Event Hubs
Azure Event Hubs is a cloud-based event streaming platform that allows businesses to collect, process, and analyze streaming data from various sources in real-time. It is part of the Azure Integration Services suite of products and services.

The “Resource Group” structure is what Azure uses to organize resources. Each resource is located within a Resource Group, and these groups make it easier to manage and monitor resources that belong to the same project.
The “Pricing tier” option offers three choices for Azure Event Hubs: Basic, Standard, and Premium. Basic is designed for small-scale usage and offers less processing capacity and fewer features. Standard provides moderate processing capacity, features, and pricing. Premium is designed for large-scale workloads with high-volume data processing requirements and offers the highest performance and the most advanced features.
“Throughput units” (TU) are the capacity measure Event Hubs uses for high-volume data exchange. Each TU supports up to 1 MB per second (or 1,000 events per second) of incoming data, so, for example, 2 TUs allow roughly 2 MB per second of ingress. Increasing the number of TUs also increases the cost.
Event Hubs — Advanced

The “Minimum TLS Version” option enhances the security of data coming into the Event Hub by setting the lowest TLS version clients are allowed to use when connecting.
The “Local Authentication” option controls whether applications that want to access the Event Hub can authenticate with shared access keys (local credentials) in addition to Azure AD identities.
Event Hubs — Networking
The “Public Access” option grants permission for accessing the Event Hub over the internet, while the “Private Access” option grants permission for accessing the Event Hub from a private network.

Event Hubs — Tags

The “Tags” option allows you to manage and categorize your Event Hub resources by labeling them.
Review + Create for Event Hubs

Notes:
By default, Azure Event Hubs allows events to be retained for only a certain period of time, which can vary depending on the Event Hubs service plan.
For example, the Basic service plan retains data in an Event Hub for 1 day and then automatically deletes the data.
The Standard service plan, by default, retains data for 7 days, which can be configured between 1–7 days.
The Premium service plan, by default, retains data for 7 days, which can be configured between 1–90 days.
Definitions of the selections for Event Hubs

Overview provides a general overview of your Event Hub, including its name, service plan, region, costs, and other features.
Activity Log keeps a record of your Event Hub’s activities. This section allows you to view all operations and changes made to your Event Hub.
Access control (IAM) allows you to manage the authentication and access controls for your Event Hub. You can control who manages your Event Hub and who can access it.
Tags allows you to add tags to your Event Hub resources, which can be used as references to categorize and manage your Event Hub resources.
Diagnose and Solve Problems section helps you identify and resolve issues with your Event Hubs service.
Events section provides an interface for visualizing and analyzing events in your Event Hubs service.
Shared Access Policies manage the permissions defined for the Event Hubs service.
Scale enables you to increase the performance and capacity of your Event Hubs service.
Geo-recovery is a method used to recover the Event Hubs service in the event of a disaster.
Encryption is the process of mathematically encoding data or a message so that it cannot be understood or read by unauthorized parties.
Properties displays and allows you to modify various properties related to your Event Hub namespace or Event Hub.
Lock feature is a security feature used to prevent an Azure resource from being deleted or modified.
Entities section lists and manages the Event Hubs you created in your Event Hub namespace. You can create different entity types for each Event Hub to provide different authorizations for data collection and management processes.
Schema Group is a feature that allows you to configure and manage your data in your Event Hub namespace. It provides a management layer for creating and sharing a data schema.
Alerts are notifications sent automatically when specific events occur in your Azure Event Hub Namespace or Event Hubs.
Metrics are measurements that allow you to monitor and analyze the performance and usage of your Azure Event Hub Namespace or Event Hubs.
Diagnostic Settings is a feature that allows you to collect and manage various diagnostic data, such as performance, transaction logs, and monitoring information for your Azure Event Hub Namespace or Event Hubs.
Logs are records that contain various diagnostic data, such as transaction logs and monitoring data, for your Azure Event Hub Namespace or Event Hubs.
Automation Task is a feature that enables you to automate certain tasks in your Azure Event Hub Namespace or Event Hubs.
Export Template is a feature that allows you to export the source code and configuration of your Azure Event Hub Namespace or Event Hub in JSON format, allowing you to reuse the source code.
Resource Health is a feature that provides information about the health status of your Azure Event Hub Namespace or Event Hubs.
Event Hubs → Entities → Event Hubs
In Event Hubs, entities usually refer to the core components of the service, such as Event Hub instances, consumer groups, and partitions.

Configuration

“Partition Count” refers to the number of partitions used to store and process data in an Event Hub.
“Retention” is a feature that determines how long data will be retained in an Event Hub.
“Cleanup Policy” determines when and how data stored in the Event Hub will be deleted once the retention period has ended. The “Delete” option specifies that data will be permanently deleted after the retention period, while the “Compact” option indicates that data will be compressed and only the latest data will be stored.
“Retention Time (hrs)” indicates how long data stored in an Event Hub will be retained. After the specified time period, data may be automatically deleted from the Event Hub.
Capture

When the “Capture” feature is configured as “On” in an Event Hub, data generated in the Event Hub is automatically captured and stored in the specified storage account. The “Capture” feature allows for historical data to be recorded and later used for analysis, reporting, or backup purposes.
If the “Capture” feature is configured as “Off,” data generated in the Event Hub will not be captured or stored.
Review + Create for the Event Hubs entity

Entities Event Hubs → Shared Access Policies
“Shared Access Policies” refer to the policies that grant access to Event Hubs within an Event Hub Namespace. These policies allow specific users or applications to access the Event Hub and determine what operations they can perform.
The “Send” permission allows the relevant application to send messages to the Event Hub. This permission is necessary for applications or devices that record data in the Event Hub.
The “Listen” permission allows the relevant application to listen to messages from the Event Hub. This permission is necessary for applications or devices that read data from the Event Hub.
In addition, the “Manage” permission can also be granted through Shared Access Policies. The “Manage” permission allows the relevant user or application to manage the Event Hub.

Connection String:

A connection string in Azure Event Hubs is a string that contains all the necessary information for an application or service to connect to and interact with an Event Hub instance.
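For orientation, a connection string copied from a Shared Access Policy typically follows the pattern below (all values are placeholders; copy the real string from the portal). Connection strings taken at the namespace level omit the EntityPath part, while those copied from a specific Event Hub entity include it.

Endpoint=sb://<your-namespace>.servicebus.windows.net/;SharedAccessKeyName=<your-policy-name>;SharedAccessKey=<your-key>;EntityPath=<your-event-hub-name>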
Azure Stream Analytics
Azure Stream Analytics is an Azure service used for processing large data streams. With this service, it is possible to analyze, process, and store streaming data in real-time.

When creating a Stream Analytics Job, the hosting environment option determines where your workload will run. The Cloud option means that your workload will be hosted in Azure, while the Edge option means that your workload will run on IoT Edge devices.
The Streaming Unit (SU) is a metric that determines the processing capacity of a Stream Analytics Job. A Streaming Unit represents the amount of CPU, memory, and network resources in each processing unit. Therefore, the more SUs a Stream Analytics Job has, the more processing capacity it has, and it can process data streams faster.
Azure Stream Analytics — Storage

Azure Stream Analytics can use an Azure Storage account to store processed data. The “Secure private data storage account” option allows you to create a specially configured Azure Storage account for your Stream Analytics Job to store data securely.
Review + Create Azure Streaming Analytics

Definitions of the Job topology and Settings selections for Azure Stream Analytics

Job Topology
Topology provides a view of the basic components that you can use to configure your Stream Analytics Job. These components allow you to manage your Stream Analytics Job’s inputs, processes, and outputs, and process and store your data.
Inputs contains the sources from which your Stream Analytics Job can receive input data. These sources can include Azure Event Hubs, Azure IoT Hub, Azure Blob storage, and other data sources. The Inputs feature allows you to determine where your input data will come from when creating your Stream Analytics Job.
Functions contains the functions that you can use in your Stream Analytics Job queries. These functions can include mathematical operations, timing operations, conversion operations, stream operations, and other data processing operations. The Functions feature allows you to determine how to process your data when creating your Stream Analytics Job.
Query contains the query that processes your Stream Analytics Job’s data. The query, which determines how your Stream Analytics Job processes data, is written using a SQL-like query language. The Query feature allows you to determine how to process your data when creating your Stream Analytics Job.
Outputs contains the destinations to which your Stream Analytics Job can send processed data. These destinations can include Azure Blob storage, Azure SQL databases, Azure Event Hubs, and other data destinations. The Outputs feature allows you to determine where your processed data will be sent when creating your Stream Analytics Job.
Settings
Environment has two options: Standard (Multi-tenant) and Dedicated (Stream Analytics Cluster). The Standard (Multi-tenant) option may be suitable for small workloads and low data volumes, while the Dedicated (Stream Analytics Cluster) option may be more appropriate for larger workloads and high data volumes.
Storage allows you to configure the storage account used by the Stream Analytics Job. The “Scale” option in the Stream Analytics Job’s Settings section allows you to set scaling settings for running your Stream Analytics Job.
Locale allows you to set the language settings for the region where your Stream Analytics Job will run. This option enables you to adjust language and other localization settings used in that region, such as date and time formats, currencies, and measurement units.
Event ordering allows you to specify how the Stream Analytics Job should handle data ordering while processing data.
Error Policy allows you to specify how the Stream Analytics Job should behave in case of errors.
Compatibility Level determines how compatible your Stream Analytics Job will be with older versions.
Azure Stream Analytics Input
The Event Hub we created is added as the input.


Azure Stream Analytics Output
A Power BI output is added for the processed data.

Azure Stream Analytics Query
The Azure Stream Analytics query is the part of the Stream Analytics service where you process and analyze real-time data streams using a SQL-like query language.
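The exact query depends on the input and output aliases you chose when configuring the job. As a minimal illustration (the alias names [currency-input] and [currency-output] below are placeholders, and the field names match the Python script later in this post), a simple pass-through query could look like this:

SELECT
    [Exchange Rate],
    Rate,
    Api_date
INTO
    [currency-output]
FROM
    [currency-input]

This simply forwards each incoming exchange rate record from the Event Hub input to the Power BI output without any transformation.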

Python Script for Real-Time Currency API Data
Before you run the Python script, the Azure Stream Analytics job should be started.

# Import necessary libraries/modules
# This library is used for making HTTP requests to a web API.
import requests
# This library is used for working with JSON data.
import json
# This library is used for working with data in a tabular format.
import pandas as pd
# This library is used for working with dates and times.
import datetime
# This library is used for sending data to Azure Event Hubs asynchronously.
from azure.eventhub.aio import EventHubProducerClient
# This library is used for representing data that is being sent to Azure Event Hubs.
from azure.eventhub import EventData
# This library is used for handling exceptions that may occur when sending data to Azure Event Hubs.
from azure.eventhub.exceptions import EventHubError
# This library is used for asynchronous programming in Python.
import asyncio
# This is a custom local module that stores the Exchange Rates API key (used below as rate_api.api).
import rate_api
Connecting to Event Hubs with a connection string.

# This variable stores the connection string for your Azure Event Hub.
connection_str = "Your connection str"
# This variable stores the name of the Azure Event Hub entity.
eventhub_name = "Your entities->eventhub name"
# Your API key from the Exchange Rates API, read from the rate_api module.
api_key = rate_api.api
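The rate_api module imported above is not shown in the original script; it only needs to expose the key as a variable named api, to match the rate_api.api call. A minimal sketch, with a placeholder value for the key, might be:

# rate_api.py - keeps the Exchange Rates API key out of the main script
api = "YOUR_EXCHANGERATES_API_KEY"  # placeholder, not a real key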
Example of how to fetch EUR-based rates with a Python function, with an explanation of each step.
# This function retrieves the latest exchange rates for the Euro currency from an API and returns the data in a Pandas DataFrame.
def exchangerates_eur():
    # The URL for the API endpoint that provides the exchange rates.
    url = "https://api.apilayer.com/exchangerates_data/latest"
    # Parameters to send to the API endpoint.
    params = {"symbols": "TRY,ARS,BRL,EUR,GBP,USD", "base": "EUR"}
    # Headers to include in the API request for authentication.
    headers = {"apikey": api_key}
    # Send a GET request to the API endpoint with the specified parameters and headers.
    response = requests.get(url, params=params, headers=headers)
    # Convert the JSON response to a Python dictionary.
    response_json = response.json()
    # Get the base currency and date of the exchange rate data.
    base = response_json['base']
    date = response_json['date']
    # Get the exchange rates for each currency and store them in a Pandas DataFrame.
    rates = response_json['rates']
    api_date = datetime.datetime.fromtimestamp(response_json["timestamp"]).strftime('%Y-%m-%d %H:%M:%S')
    df = pd.DataFrame.from_dict(response_json["rates"], orient="index", columns=["Rate"])
    df.reset_index(inplace=True)
    df.rename(columns={"index": "Currency"}, inplace=True)
    df["Exchange Rate"] = "EUR/" + df["Currency"]
    df["Api_date"] = api_date
    df = df[["Exchange Rate", "Rate", "Api_date"]]
    # Return the Pandas DataFrame with the exchange rate data.
    return df
The full script below fetches TRY, ARS, BRL, EUR, GBP, and USD rates against each of the EUR, GBP, and USD bases.
def main():
    def exchangerates_eur():
        url = "https://api.apilayer.com/exchangerates_data/latest"
        params = {"symbols": "TRY,ARS,BRL,EUR,GBP,USD", "base": "EUR"}
        headers = {"apikey": api_key}
        response = requests.get(url, params=params, headers=headers)
        response_json = response.json()
        base = response_json['base']
        date = response_json['date']
        rates = response_json['rates']
        api_date = datetime.datetime.fromtimestamp(response_json["timestamp"]).strftime('%Y-%m-%d %H:%M:%S')
        df = pd.DataFrame.from_dict(response_json["rates"], orient="index", columns=["Rate"])
        df.reset_index(inplace=True)
        df.rename(columns={"index": "Currency"}, inplace=True)
        df["Exchange Rate"] = "EUR/" + df["Currency"]
        df["Api_date"] = api_date
        df = df[["Exchange Rate", "Rate", "Api_date"]]
        return df

    def exchangerates_gbp():
        url = "https://api.apilayer.com/exchangerates_data/latest"
        params = {"symbols": "TRY,ARS,BRL,EUR,GBP,USD", "base": "GBP"}
        headers = {"apikey": api_key}
        response = requests.get(url, params=params, headers=headers)
        response_json = response.json()
        base = response_json['base']
        date = response_json['date']
        rates = response_json['rates']
        api_date = datetime.datetime.fromtimestamp(response_json["timestamp"]).strftime('%Y-%m-%d %H:%M:%S')
        df = pd.DataFrame.from_dict(response_json["rates"], orient="index", columns=["Rate"])
        df.reset_index(inplace=True)
        df.rename(columns={"index": "Currency"}, inplace=True)
        df["Exchange Rate"] = "GBP/" + df["Currency"]
        df["Api_date"] = api_date
        df = df[["Exchange Rate", "Rate", "Api_date"]]
        return df

    def exchangerates_usd():
        url = "https://api.apilayer.com/exchangerates_data/latest"
        params = {"symbols": "TRY,ARS,BRL,EUR,GBP,USD", "base": "USD"}
        headers = {"apikey": api_key}
        response = requests.get(url, params=params, headers=headers)
        response_json = response.json()
        base = response_json['base']
        date = response_json['date']
        rates = response_json['rates']
        api_date = datetime.datetime.fromtimestamp(response_json["timestamp"]).strftime('%Y-%m-%d %H:%M:%S')
        df = pd.DataFrame.from_dict(response_json["rates"], orient="index", columns=["Rate"])
        df.reset_index(inplace=True)
        df.rename(columns={"index": "Currency"}, inplace=True)
        df["Exchange Rate"] = "USD/" + df["Currency"]
        df["Api_date"] = api_date
        df = df[["Exchange Rate", "Rate", "Api_date"]]
        return df

    # Call the 'exchangerates_usd' function and store the resulting DataFrame in 'df_usd'
    df_usd = exchangerates_usd()
    # Call the 'exchangerates_eur' function and store the resulting DataFrame in 'df_eur'
    df_eur = exchangerates_eur()
    # Call the 'exchangerates_gbp' function and store the resulting DataFrame in 'df_gbp'
    df_gbp = exchangerates_gbp()
    # Concatenate the three DataFrames (df_usd, df_eur, df_gbp) into a single DataFrame called 'result'
    result = pd.concat([df_usd, df_eur, df_gbp])
    # Convert the 'result' DataFrame into a list of dictionaries, with each dictionary representing a row
    # The keys of the dictionaries will be the column names in the DataFrame
    records = result.to_dict(orient='records')
    return records
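For reference, the records list returned by main() is what gets serialized with json.dumps and pushed to the Event Hub. Its shape is roughly as follows (the rate values and timestamps are placeholders, not real quotes):

[
  {"Exchange Rate": "USD/TRY", "Rate": 0.0, "Api_date": "YYYY-MM-DD HH:MM:SS"},
  {"Exchange Rate": "USD/ARS", "Rate": 0.0, "Api_date": "YYYY-MM-DD HH:MM:SS"},
  ...
]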
Every 60 seconds, the script sends data to Event Hubs.
# Quick one-off call to check that the API requests work before the send loop starts.
main()

async def run():
    # Wait 60 seconds before sending the next batch.
    await asyncio.sleep(60)
    producer = EventHubProducerClient.from_connection_string(
        conn_str=connection_str, eventhub_name=eventhub_name
    )
    async with producer:
        # Create a batch.
        event_data_batch = await producer.create_batch()
        # Add the latest exchange rate records to the batch.
        event_data_batch.add(EventData(json.dumps(main())))
        # Send the batch of events to the event hub.
        await producer.send_batch(event_data_batch)
        print("Success sent to azure event hubs")

async def send_forever():
    # Repeat the fetch-and-send cycle indefinitely.
    while True:
        await run()

# Start the asyncio event loop and keep streaming data.
asyncio.run(send_forever())
Output in Power BI Service
When we run the Python script, the dataset we defined in the Azure Stream Analytics output (currencydataset) appears in the Power BI service, so we can create a report for visualization.

The report shows a visualization for the TRY currency (filtered in Power BI). When the visual is pinned to a dashboard, it updates live.
