
Extract Time Series Insights data via REST API using OAuth2 authentication


Currently I am working on a project where we want to extract data that has been ingested into Time Series Insights, to verify that it has been stored properly.

Unfortunately the user interface of Time Series Insights doesn't offer a user-friendly and efficient way to accomplish this. Adding all individual sensors to the grid is quite a cumbersome task and frankly not the most exciting job.

Fortunately Azure Time Series Insights offers a rich set of REST APIs which can be used to extract data. For a full overview of available APIs please consult the following page. Below I will explain which steps you can take to extract data from Time Series Insights using the Time Series Insights REST APIs.

Prerequisite

You already have a Time Series Insights instance up and running, and sensor values are being written to Time Series Insights.

Steps

Create an Azure Service Principal and grant it Time Series Insights API permissions

When calling a REST API of Time Series Insights you need to authenticate with the client id and client secret of an Azure Service Principal which has the permissions to execute REST API commands. An Azure Service Principal is just an account in Azure Active Directory specifically designed for applications to authenticate and authorize. In the past we called this type of account a service account, but in Azure it is called a Service Principal.

Please consult the following page for detailed instructions on how to create a Service Principal and how to assign it the permissions to use the REST APIs of Time Series Insights. I have created a new App, assigned a secret to the App and granted the App permissions to the Time Series Insights API, as described in the following sections on that page:

  • 'Step 1: Create your managed identity or app registration' -> 'Application registration'.
  • 'Step 2: Grant access'

Once completed you should have your client_id and client_secret available, which you will need in the Python script responsible for extracting data.
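If you run the script in the next step from a Databricks notebook, you could avoid hard-coding the client secret by reading the credentials from a Databricks secret scope. A minimal sketch, assuming a secret scope named 'tsi' with keys 'client-id' and 'client-secret' already exists (the scope and key names are hypothetical):

# Read the service principal credentials from a Databricks secret scope
# (dbutils is only available inside a Databricks notebook;
# the scope name 'tsi' and the key names are assumptions for this example)
client_id = dbutils.secrets.get(scope="tsi", key="client-id")
client_secret = dbutils.secrets.get(scope="tsi", key="client-secret")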

Create a (Databricks) Python script and run the code below
Replace the placeholder values in capitals (TENANT_ID, TIME_SERIES_INSIGHTS_FQDN, CLIENT_ID, CLIENT_SECRET and the sensor names) with the values for your environment.


import requests, json

# Authorization resource (audience) of Time Series Insights
authorize_url = "https://api.timeseries.azure.com/"
# Azure OAuth2 endpoint used to get the bearer token
token_url = "https://login.microsoftonline.com/TENANT_ID/oauth2/token"
# REST API endpoint of Time Series Insights to execute the TSI query
query_tsi_url = "https://TIME_SERIES_INSIGHTS_FQDN/query?api-version=2020-07-31"
# Client id and client secret of the service principal
client_id = 'CLIENT_ID'
client_secret = 'CLIENT_SECRET'

# Payload to send to get the bearer authentication token (client credentials grant)
data = {'grant_type': 'client_credentials', 'client_id': client_id, 'resource': authorize_url}
# Send the request to get the bearer token
access_token_response = requests.post(token_url, data=data, allow_redirects=False, auth=(client_id, client_secret))
# Get the token from the response and set the headers for the REST API call
tokens = json.loads(access_token_response.text)
access_token = tokens['access_token']
api_call_headers = {'Authorization': 'Bearer ' + access_token}

# Set the list of sensors for which to get data
sensors = ['SENSOR_NAME_1', 'SENSOR_NAME_2']

# Send the REST API call for each sensor in the sensor list to get the data from TSI
for sensor in sensors:
    query_payload = {
        "getEvents": {
            "timeSeriesId": [sensor],
            "searchSpan": {"from": '2021-03-15T00:00:00Z', "to": '2021-03-15T00:01:00Z'}
        }
    }
    api_call_response = requests.post(query_tsi_url, headers=api_call_headers, json=query_payload)
    if api_call_response.status_code == 200:
        print(api_call_response.json())

When the script runs successfully you will get the values for each sensor in the given timespan in JSON format.
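If you prefer to inspect the result in tabular form, the JSON response can be flattened into a pandas DataFrame. A minimal sketch, assuming the getEvents response contains top-level 'timestamps' and 'properties' arrays (check the response shape in your own environment first):

import pandas as pd

def events_to_dataframe(response_json):
    # 'timestamps' and each property's 'values' are parallel arrays
    # of equal length, so they map directly onto DataFrame columns
    df = pd.DataFrame({'timestamp': response_json['timestamps']})
    for prop in response_json['properties']:
        df[prop['name']] = prop['values']
    return df

# Example usage inside the loop above:
# print(events_to_dataframe(api_call_response.json()))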

Power BI - Empty data source name and error refreshing Azure Data Lake, and a Power BI tip!

Issue
 
Recently I encountered some issues when loading data from an Azure Data Lake Storage (ADLS) account into Power BI.


Issue 1 
Refreshing the data source via the Power BI service gives an error stating: "credentials are required to connect to the azuredatalakestorage source".

Solution
After contacting Microsoft Support it turned out that I had used the wrong connection string: I had included the container name in the connection string of the Azure Data Lake, e.g. https://yourcontainername@yourdatalakename.dfs.core.windows.net, which is incorrect. When connecting to an Azure Data Lake please use the following connection string:
https://yourdatalakename.dfs.core.windows.net
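The same distinction applies outside Power BI: the account URL identifies only the storage account, and the container (file system) is addressed separately. A minimal sketch using the Azure SDK for Python, assuming the azure-storage-file-datalake and azure-identity packages are installed and the account and container names are placeholders:

from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# The account URL contains only the data lake name, not the container
service_client = DataLakeServiceClient(
    account_url="https://yourdatalakename.dfs.core.windows.net",
    credential=DefaultAzureCredential())
# The container (file system) name is passed separately
file_system_client = service_client.get_file_system_client("yourcontainername")
for path in file_system_client.get_paths():
    print(path.name)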


Issue 2
The name of the Azure Data Lake data source is blank when the Power BI file is published to the Power BI service.

The name of the data source is blank, as you can see in the screenshot below; the empty space just before the text "Referenties bewerken" (Dutch for "Edit credentials") should contain the name of the data source. In this case I would expect a value like:
yourdatalakename.dfs.core.windows.net



This issue has been identified as a bug by Microsoft and they will work on a fix in a future release.

Tip
You can connect to a different Power BI environment (tenant) by adding the tenant id of the client to the URL of the Power BI service. Simply enter the URL below in the address bar of your browser and you will be asked to log in to that tenant.

https://app.powerbi.com/home?ctid=tenant_id
Example: https://app.powerbi.com/home?ctid=12345678-1234-5678-cd98-12345eed1234

Of course you need an account which is authorized in the Azure Active Directory of the client's tenant. This could be an AAD account created by the client, or the client could invite you as a guest so you can use your own AAD account.