Monthly Archives: January 2019

By Hariharan Rajendran

Dataset Refresh in the PowerApps Custom Visual

As we know, Power BI can be integrated with PowerApps. Check the below approaches.

  1. In PowerApps, we can use Power BI tiles.
  2. In Power BI, we can use the PowerApps custom visual.

 

This article explains the latest update to the Power Apps custom visual. Let us take a scenario.

I need to embed a PowerApps app within a Power BI report. Whenever an item is added through PowerApps, I need to refresh the dataset used by the Power BI report.

For the above scenario, we previously needed to use Microsoft Flow to achieve the result. In Microsoft Flow, we needed a custom connector to refresh the Power BI dataset.

With the latest update to the PowerApps custom visual, that step is no longer needed; that is, we no longer need Microsoft Flow to refresh the dataset. A Refresh method (PowerBIIntegration.Refresh()) has been added to the Power BI integration data source itself, so the embedded app can trigger the dataset refresh directly.

It is a very good feature for Power BI + PowerApps scenarios.

By Hariharan Rajendran

DAX Group by and Filter

This post explains how DAX can be used for group-by and filter logic. To explain this scenario, I am using some example data.

Example Data:

I have the following table.

EmpID EmpDesignation EmpDepartment DesigFrom
1 A IT – Finance 01/02/2018
1 SA IT – Sales 05/08/2018
2 M Marketing 05/07/2018
2 SM HR 25/08/2018
3 A Sales 06/06/2018

 

Requirement

As per the requirement, I need to perform the below actions.

  1. A date slicer should be available to filter by From and To dates.
  2. Based on the filter, a table visual should display each employee's latest designation.

Expected Result

Filter – Date range: 01/02/2018 to 31/08/2018

EmpID EmpDesignation MostEffectiveDate
1 SA 05/08/2018
2 SM 25/08/2018
3 A 06/06/2018

 

To achieve the above result, we need to write a DAX script. Use the script below.

MostEffectiveDate =
CALCULATE(
    MAX(Employee[DesigFrom]),
    FILTER(Employee, Employee[EmpDesignation] = EARLIER(Employee[EmpDesignation]))
)

 

Sometimes, you may receive an error due to the EARLIER function. Check the alternative version of the script below, which avoids EARLIER.

MostEffectiveDate =
VAR Desig = MAX(Employee[EmpDesignation])
RETURN
CALCULATE(
    MAX(Employee[DesigFrom]),
    FILTER(Employee, Employee[EmpDesignation] = Desig)
)

 

By Hariharan Rajendran

Python Script to Event Hub – Power BI Sentiment Analysis ML Function

This article explains a Python script that sends data to Event Hub.

As we know, Event Hub is an Azure service designed to capture streaming events from different devices and applications.

The below Python script sends a sentiment string to Event Hub.

import logging
import time
import json

from azure.eventhub import EventHubClient, Sender, EventData

logger = logging.getLogger("azure")

ADDRESS = "amqps://<eventhubworkspacename>.servicebus.windows.net/<eventhubname>"

# SAS policy and key are not required if they are encoded in the URL
USER = "RootManageSharedAccessKey"
KEY = "<Key>"

try:
    if not ADDRESS:
        raise ValueError("No EventHubs URL supplied.")

    # Create the Event Hubs client and a sender bound to partition 0
    client = EventHubClient(ADDRESS, debug=False, username=USER, password=KEY)
    sender = client.add_sender(partition="0")
    client.run()

    try:
        start_time = time.time()
        for i in range(1):
            print("Sending message: {}".format(i))
            # Build a valid JSON array; the Message field carries the sentiment string
            a = json.dumps([{"ts": 1550321522, "Message": "good", "TestVal": 7}])
            print(a)
            sender.send(EventData(a))
    except:
        raise
    finally:
        end_time = time.time()
        client.stop()
        run_time = end_time - start_time
        logger.info("Runtime: {} seconds".format(run_time))

except KeyboardInterrupt:
    pass

 

 

Applications

The above script sends data to Event Hub, which can be used as an input to the Azure ML function in Stream Analytics.

The output of the Stream Analytics query can then be pushed to a Power BI streaming dataset.
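The script above sends a single event. As a hedged sketch (the sample sentiment strings and the one-second interval below are illustrative, not from the original post), the sending loop could be replaced with a continuous one so the Power BI streaming dataset keeps updating:

import itertools
import json
import time
from azure.eventhub import EventData

# Assumes the client and sender were created exactly as in the script above
sample_messages = ["good", "bad", "excellent", "poor"]

for i, message in enumerate(itertools.cycle(sample_messages)):
    # Each event carries a timestamp, the sentiment string and a test value
    payload = json.dumps([{"ts": int(time.time()), "Message": message, "TestVal": i}])
    print("Sending message: {}".format(payload))
    sender.send(EventData(payload))
    time.sleep(1)  # pace the stream at roughly one event per second; stop with Ctrl+C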

 

By Dr. SubraMANI Paramasivam

Microsoft Flow with Face API

This article explains how to use the Azure Cognitive Services APIs within Microsoft Flow. The Microsoft Flow team has released new connectors for the Azure Cognitive Services APIs, which are in preview now. They include Computer Vision and Face API.

Each connector has a different set of actions. We can use those actions by passing the proper inputs to the connections.

To make it clear, I am explaining a scenario with the Face API in Microsoft Flow. In this, I will explain how you can process the “Detect Faces” action and store the result in an on-premises SQL Server table.

Requirements

  1. Face API URL & Key
  2. On-Premises Data Gateway – SQL Server
  3. Microsoft Flow – Free subscription or O365 subscription

Creating the Face API

To create a Face API, you need an Azure Subscription. If you don’t have a subscription, then you can get a free Azure subscription from here.

Visit portal.azure.com and click “Create a Resource”.

Under New, choose “AI + Machine Learning” -> Face.

Create a new face resource by providing the required details.

Once the resource is created, you need to get the key and URL (EndPoint).

Note down the endpoint and key; we will use them in Microsoft Flow.

On-Premises Data Gateway

As you know, Power BI can connect with on-premises data using the on-premises data gateway. This gateway is not only for Power BI; it is also for Logic Apps, Azure Analysis Services, Microsoft Flow and PowerApps. You can use the same data gateway to connect with on-premises data within Microsoft Flow.

On-premises SQL Server

You need to create two tables for this scenario.

Table 1 – It should hold the Image Path column. Example – https://www.sitename.com/image1.jpg

Table 2 – To store the API result. Use the below structure.

CREATE TABLE [dbo].[APIFaces](
    [id] [INT] IDENTITY(1,1) NOT NULL,
    [ImagePath] [NVARCHAR](MAX) NULL,
    [Gender] [NCHAR](10) NULL,
    [Glasses] [NVARCHAR](50) NULL,
    [Smile] [FLOAT] NULL,
    CONSTRAINT [PK_APIFaces] PRIMARY KEY CLUSTERED
    (
        [id] ASC
    ) ON [PRIMARY]
)
GO

Microsoft Flow

You can create a free account on Microsoft Flow, or if you have an O365 subscription, then you will get Flow by default as one of its features.

You can learn more about Microsoft Flow here.

Follow the below steps.

As I mentioned, we are going to use SQL Server with Face API.

To create any flow, we need to set the trigger section. Here, I am using SQL Server as the trigger. SQL Server has two different trigger options; of those, I am using the trigger called “When an item is created”.

Once that trigger is added, you need to create and map the connection. When you click the “…” option in the right corner, you will get a form to fill in the details to create a connection to your on-premises SQL Server.

Fill in the required details and make sure the connection is created successfully.

If the connection is created successfully, you can see the tables list as below; otherwise you will get an error message.

As the next step, add the Face API and choose the “Detect Face” action.

There also, you need to create a connection with the Face API key and URL. You can provide any name in the connection name field.

Face API will ask you to provide the image URL.

You can easily choose the ImagePath from the dynamic content.

Next, add SQL Server and choose “Insert Row” action.

This time, you can use the same connection which you created above.

Select the table name. It will load the columns from the table. You need to map the dynamic content on each field.

Once all the fields are mapped, you can see the flow as below. Sometimes, an “Apply to each” loop will be added automatically.

The final flow would look like the below. Save and test the flow.

You can check the flow run history for the status and check the result in the SQL Server table.

By Dr. SubraMANI Paramasivam

Embed Face API Results in Power BI

As you know, the result of any API from Azure Cognitive Services is a JSON document. The structure of that JSON is not easy to handle effectively inside Power BI.

In this article, I explain an easy way to get the result into a proper shape inside Power BI.

To accomplish this, I am using a Python script. As Power BI now supports Python as one of its data sources, we can easily run the Python script and get the API result.

Azure Cognitive Services has a bunch of APIs, with documentation and an API reference for each one. Since I am using a Python script, I can easily take the Python Face API reference and use it directly.

Requirements

  1. Python 3
  2. Power BI Desktop

Use the below Python code. Update your Face API subscription key and URL.

from urllib.request import urlopen
import json, os, io, requests
from io import BytesIO
import pandas as pd

subscription_key = "your_subscription_key"
base_url = "https://your_region.api.cognitive.microsoft.com/face/v1.0/"
detect_url = base_url + "detect"

headers = {'Ocp-Apim-Subscription-Key': subscription_key,
           'Content-Type': 'application/octet-stream'}

params = {'returnFaceId': 'true',
          'returnFaceLandmarks': 'false',
          'returnFaceAttributes': 'age,gender,smile,facialHair,headPose,glasses,emotion,hair,makeup,accessories,blur,exposure,noise'}

Image_Path = "https://img.etimg.com/thumb/msid-61020784,width-643,imgsize-228069,resizemode-4/3-lessons-that-satya-nadella-took-from-the-cricket-field-to-the-ceos-office.jpg"

with urlopen(Image_Path) as url:
    # Download the image and post the binary content to the detect endpoint
    image_data = io.BytesIO(url.read())

    response = requests.post(
        detect_url, headers=headers, params=params, data=image_data)

    face = json.loads(response.content)

    # Pick the attributes of the first detected face
    smile = [face[0]['faceAttributes']['smile']]
    gender = [str(face[0]['faceAttributes']['gender'])]
    age = [face[0]['faceAttributes']['age']]
    glass = [str(face[0]['faceAttributes']['glasses'])]
    anger = [face[0]['faceAttributes']['emotion']['anger']]
    contempt = [face[0]['faceAttributes']['emotion']['contempt']]
    disgust = [face[0]['faceAttributes']['emotion']['disgust']]
    fear = [face[0]['faceAttributes']['emotion']['fear']]
    happy = [face[0]['faceAttributes']['emotion']['happiness']]
    neutral = [face[0]['faceAttributes']['emotion']['neutral']]
    sad = [face[0]['faceAttributes']['emotion']['sadness']]
    surprise = [face[0]['faceAttributes']['emotion']['surprise']]
    eyemakeup = [face[0]['faceAttributes']['makeup']['eyeMakeup']]
    lipmakeup = [face[0]['faceAttributes']['makeup']['lipMakeup']]
    bald = [face[0]['faceAttributes']['hair']['bald']]
    haircolor = [face[0]['faceAttributes']['hair']['hairColor']]

    # Power BI reads this DataFrame as the output table of the Python data source
    face_ds = pd.DataFrame({
        "smile": smile,
        "gender": gender,
        "age": age,
        "glass": glass,
        "anger": anger,
        "contempt": contempt,
        "disgust": disgust,
        "fear": fear,
        "happy": happy,
        "neutral": neutral,
        "sad": sad,
        "surprise": surprise,
        "eyemakeup": eyemakeup,
        "lipmakeup": lipmakeup,
        "bald": bald,
        "haircolor": haircolor
    })

You can test the above code in your Python IDE and see the result, which will be in a table format.

Power BI Desktop Report

Follow the below steps.

Open Power BI Desktop and choose “Python script” as a data source.

Copy and paste the above code on the editor window.

Click OK and it will load and display the table as below.

Load the data and you can use those fields on your report.

As of now, the image path is hardcoded, but you can pass it dynamically by using parameters.
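If you would rather handle several images inside the script itself, the hedged sketch below (the get_face_row helper and the sample URLs are illustrative and not part of the original script) reuses the detect_url, headers and params defined above and returns one row per image:

import io, json, requests
from urllib.request import urlopen
import pandas as pd

def get_face_row(image_url):
    # Download the image and send it to the detect endpoint defined above
    with urlopen(image_url) as url:
        image_data = io.BytesIO(url.read())
    response = requests.post(detect_url, headers=headers, params=params, data=image_data)
    attrs = json.loads(response.content)[0]['faceAttributes']
    return {"image": image_url,
            "age": attrs['age'],
            "gender": str(attrs['gender']),
            "smile": attrs['smile'],
            "glasses": str(attrs['glasses'])}

image_urls = ["https://www.sitename.com/image1.jpg",
              "https://www.sitename.com/image2.jpg"]

# One row per image; Power BI picks up face_ds as the output table
face_ds = pd.DataFrame([get_face_row(u) for u in image_urls])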

The sample look and feel of the report is shown below.

By Dr. SubraMANI Paramasivam

Custom Vision API – Train and Test (No Coding is Required)

Custom Vision is one of the APIs from Azure Cognitive Services; it comes under the Vision category. We have a bunch of pre-built APIs under Vision, and we can use them inside our applications without modifying the algorithms.

If we want to create our own vision API, we can use the Custom Vision API. It has the capability to train your model and publish it, and like the other APIs, you can easily integrate it with your application. In simple terms, you have control of the API from training through testing and publishing.

Follow the below steps to create a custom vision API.

Visit https://www.customvision.ai/

If you don’t have an account, then you can easily sign up and get one.

Once you are logged in, you can create a new project.

Provide the name and category of the project that you want to start. Fill in the details and click “Create project”.

Once the project is created, you can see a window like the one below.

Scenario

As we are dealing with a vision API, we need to upload images and tag (group) them. For example, if you upload some dog images and train on them, then when you test a new image the system will say whether or not it is a dog. To achieve this, you need to upload different dog images and train the system.

Follow the below steps.

Click the “Add images” button and upload all the image files as below.

Once they are uploaded, click Done.

While adding the images, you can tag them, or tag them later.

Select all the images, click “Tag Images” and tag them.

Once they are tagged, you can see the images under the Tagged section.

Now click the “Train” button and train the model. It will take a few seconds to train, and you will see the results. You also have the option to set the probability threshold.

Now click the “Quick Test” button, upload some other image and see the outcome.

Embed Custom Vision API

Check the Settings page and you will see the training and prediction keys. Refer to the documentation below to proceed further.

Ref: https://docs.microsoft.com/en-us/azure/cognitive-services/custom-vision-service/python-tutorial
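For reference, below is a minimal, hedged Python sketch of calling the trained model's prediction endpoint with the requests library. The region, project ID, iteration ID and the v2.0 URL shape are placeholders/assumptions; copy the exact prediction URL and key from the Settings page or the tutorial linked above.

import requests, json

# Placeholder values - copy the real ones from the Custom Vision Settings page
prediction_key = "<prediction_key>"
project_id = "<project_id>"
iteration_id = "<iteration_id>"

# Assumed endpoint shape for classifying an image by URL
predict_url = ("https://<region>.api.cognitive.microsoft.com/customvision/v2.0/"
               "Prediction/" + project_id + "/url")

headers = {'Prediction-Key': prediction_key, 'Content-Type': 'application/json'}
params = {'iterationId': iteration_id}
body = {'Url': 'https://www.sitename.com/image1.jpg'}

response = requests.post(predict_url, headers=headers, params=params, json=body)
result = json.loads(response.content)

# Each prediction carries the tag name and its probability
for prediction in result.get('predictions', []):
    print(prediction['tagName'], prediction['probability'])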

 

 

By Hariharan Rajendran

Bing Image Search API – Python Script

This article explains how to use a Python script to interact with Azure Cognitive Services – Bing Image Search.

Requirements

  1. Python 3 & libraries
  2. Bing Search API key – Azure Subscription

Bing Search API

Follow the below steps to create Bing Search API.

Step 1: Visit portal.azure.com

Step 2: Create a new resource – search for “Bing Search v7”

Step 3: Once the API is created, you need to get the API key. The base URL is the same for everyone: https://api.cognitive.microsoft.com/bing/v7.0/images/search

 

Python script

Use the below Python script and replace the key with your own.

import requests, json

desc_url = 'https://api.cognitive.microsoft.com/bing/v7.0/images/search'

headers = {
    # Request headers - replace with your own Bing Search v7 key
    'Ocp-Apim-Subscription-Key': '<your_subscription_key>',
}

params = {
    # Request parameters
    'q': 'Hariharan Rajendran',
    'count': '5',
    'offset': '0',
    'mkt': 'en-us',
    'safeSearch': 'Moderate',
}

desc_response = requests.get(desc_url, headers=headers, params=params)

desc = json.loads(desc_response.content)

print(desc)

You can change the parameter values to get different results.
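As a small, hedged extension of the script above (the column choices are mine, assuming the standard Bing Image Search v7 response where results sit in the value array), the JSON can be flattened into a table, which is convenient if the script is used as a Power BI Python data source:

import pandas as pd

# Flatten the search results; each item in "value" describes one image result
results = desc.get('value', [])
images_ds = pd.DataFrame([{
    'name': item.get('name'),
    'contentUrl': item.get('contentUrl'),
    'thumbnailUrl': item.get('thumbnailUrl')
} for item in results])

print(images_ds)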

 

By Hariharan Rajendran

Raspberry Pi 3

This post helps you identify the pins on a Raspberry Pi.

In earlier versions of the Raspberry Pi, the pin details were clearly specified on the board itself, but they are not there on the latest version.

To get the pin details, you can type the below command in the console.

pinout

Running the above command will give the details as in the below image.
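The pinout tool comes with the gpiozero library, which addresses pins by the same BCM numbers shown in the diagram. The short, hedged sketch below (the pin choice and LED wiring are illustrative only) shows how a number from the diagram is used in code:

from gpiozero import LED
from time import sleep

# GPIO17 is the BCM number shown in the pinout diagram (physical pin 11)
led = LED(17)

# Blink an LED wired to GPIO17 to confirm the pin mapping
for _ in range(5):
    led.on()
    sleep(0.5)
    led.off()
    sleep(0.5)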

I will be posting IoT-related scripts and showing how the Raspberry Pi can be integrated with Azure and Power BI.