Author Archives: Dr. SubraMANI Paramasivam

By Dr. SubraMANI Paramasivam

Microsoft Flow with Face API

This article explains how to use the Azure Cognitive Services APIs within Microsoft Flow. The Microsoft Flow team has released new connectors for the Azure Cognitive Services APIs, which are currently in preview. These include Computer Vision and Face API.

Each connector has a different set of actions. We can use those actions by passing the proper inputs to the connections.

To make this clear, I will walk through a scenario with the Face API in Microsoft Flow. I will explain how you can run the “Detect Faces” action and store the result in an on-premises SQL Server table.

Requirements

  1. Face API URL & Key
  2. On-Premises Data Gateway – SQL Server
  3. Microsoft Flow – Free subscription or O365 subscription

Creating a Face API

To create a Face API, you need an Azure Subscription. If you don’t have a subscription, then you can get a free Azure subscription from here.

Visit portal.azure.com and click “Create a Resource”.

Under new, choose “AI + Machine Learning” -> Face

Create a new face resource by providing the required details.

Once the resource is created, you need to get the key and URL (EndPoint).

Note down the endpoint and key; we will use them in Microsoft Flow.

On-Premises Data Gateway

As you know, Power BI can connect to on-premises data using the on-premises data gateway. This gateway is not only for Power BI; it also serves Logic Apps, Azure Analysis Services, Microsoft Flow and Power Apps. You can use the same data gateway to connect to on-premises data within Microsoft Flow.

On-premises SQL Server

You need to create two tables for this scenario.

Table 1 – It should hold the Image Path column. Example – https://www.sitename.com/image1.jpg

Table 2 – To store the API result. Use the below structure.

CREATE TABLE [dbo].[APIFaces](
       [id] [INT] IDENTITY(1,1) NOT NULL,
       [ImagePath] [NVARCHAR](MAX) NULL,
       [Gender] [NCHAR](10) NULL,
       [Glasses] [NVARCHAR](50) NULL,
       [Smile] [FLOAT] NULL,
 CONSTRAINT [PK_APIFaces] PRIMARY KEY CLUSTERED
(
       [id] ASC
) ON [PRIMARY]
)
GO
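To see how a Detect Faces result maps onto these columns, here is a small Python sketch. The JSON below is a hypothetical, trimmed sample shaped like a Face API detect response; it is not real output.

```python
import json

# A trimmed, hypothetical sample of a Face API "Detect" response --
# the real response contains many more attributes.
sample_response = json.loads("""
[{"faceId": "c5c24a82-6845-4031-9d5d-978df9175426",
  "faceAttributes": {"gender": "male", "glasses": "NoGlasses", "smile": 0.88}}]
""")

attrs = sample_response[0]["faceAttributes"]

# One row shaped like the dbo.APIFaces table (id is IDENTITY, so omitted).
row = {
    "ImagePath": "https://www.sitename.com/image1.jpg",
    "Gender": attrs["gender"],
    "Glasses": attrs["glasses"],
    "Smile": attrs["smile"],
}
print(row)
```

The Flow “Insert row” action performs essentially this mapping when you wire up the dynamic content fields.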

Microsoft Flow

You can create a free account on Microsoft Flow, or if you have an O365 subscription, Flow is included by default as one of the features.

You can learn more about Microsoft Flow here.

Follow the below steps.

As I mentioned, we are going to use SQL Server with Face API.

To create any flow, we need to set up a trigger. Here, I am using SQL Server as the trigger. SQL Server has two different trigger options; I am using the trigger called “When an item is created”.

Once that trigger is added, you need to create and map the connection. When you click the “…” option in the right corner, you will get a form to fill in the details to create a connection to your on-premises SQL Server.

Fill in the required details and make sure the connection is created successfully.

If the connection is created successfully, you can see the table list as shown below; otherwise, you will get an error message.

Next, add the Face API and choose the “Detect Faces” action.

Here too, you need to create a connection with the Face API key and URL. You can provide any name in the connection name field.

Face API will ask you to provide the image URL.

You can easily choose the ImagePath from the dynamic content.
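Under the hood, the “Detect Faces” action issues a POST to the Face API detect endpoint, with your key in a header and the attributes you want in the query string. A minimal sketch of how that request URL is assembled; the region is a placeholder, and nothing is actually sent here:

```python
from urllib.parse import urlencode

# Placeholder region -- substitute your own endpoint from the Azure portal.
base_url = "https://your_region.api.cognitive.microsoft.com/face/v1.0/"

params = {
    "returnFaceId": "true",
    "returnFaceAttributes": "gender,glasses,smile",
}
detect_url = base_url + "detect?" + urlencode(params)

# When the POST is actually sent, the image URL goes in the JSON body
# and the key goes in the Ocp-Apim-Subscription-Key header.
print(detect_url)
```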

Next, add SQL Server and choose “Insert Row” action.

This time, you can use the same connection which you created above.

Select the table name. It will load the columns from the table. You need to map the dynamic content on each field.

Once all the fields are mapped, you can see the flow as shown below. Sometimes, an “Apply to each” loop will be added automatically.

The final flow will look like the one below. Save and test the flow.

You can check the flow history for the flow status, and check the result in the SQL Server table.

By Dr. SubraMANI Paramasivam

Embed Face API Results in Power BI

As you know, the result of any API from Azure Cognitive Services is a JSON document. The structure of that JSON is not easy to handle effectively inside Power BI.

In this article, I explain the easiest way to get the result into a usable shape inside Power BI.

To accomplish this, I am using a Python script. Since Power BI now supports Python as one of its data sources, we can easily run the Python script and get the API result.

Azure Cognitive Services offers a range of APIs, each with its own documentation and API reference. Since I am using a Python script, I can easily follow the Face API Python reference and use it directly.

Requirements

  1. Python 3
  2. Power BI Desktop

Use the below Python code. Update it with your Face API subscription key & URL.

from urllib.request import urlopen
import json, os, io, requests
from io import BytesIO
import pandas as pd

subscription_key = "your_subscription_key"
base_url = "https://your_region.api.cognitive.microsoft.com/face/v1.0/"
detect_url = base_url + "detect"

headers = {'Ocp-Apim-Subscription-Key': subscription_key,
           'Content-Type': 'application/octet-stream'}

params = {'returnFaceId': 'true',
          'returnFaceLandmarks': 'false',
          'returnFaceAttributes': 'age,gender,smile,facialHair,headPose,glasses,emotion,hair,makeup,accessories,blur,exposure,noise'}

Image_Path = "https://img.etimg.com/thumb/msid-61020784,width-643,imgsize-228069,resizemode-4/3-lessons-that-satya-nadella-took-from-the-cricket-field-to-the-ceos-office.jpg"

# Download the image and post the raw bytes to the detect endpoint
with urlopen(Image_Path) as url:
    image_data = io.BytesIO(url.read())

    response = requests.post(
        detect_url, headers=headers, params=params, data=image_data)

    face = json.loads(response.content)

    # Pull each attribute out of the first detected face
    smile = [face[0]['faceAttributes']['smile']]
    gender = [str(face[0]['faceAttributes']['gender'])]
    age = [face[0]['faceAttributes']['age']]
    glass = [str(face[0]['faceAttributes']['glasses'])]
    anger = [face[0]['faceAttributes']['emotion']['anger']]
    contempt = [face[0]['faceAttributes']['emotion']['contempt']]
    disgust = [face[0]['faceAttributes']['emotion']['disgust']]
    fear = [face[0]['faceAttributes']['emotion']['fear']]
    happy = [face[0]['faceAttributes']['emotion']['happiness']]
    neutral = [face[0]['faceAttributes']['emotion']['neutral']]
    sad = [face[0]['faceAttributes']['emotion']['sadness']]
    surprise = [face[0]['faceAttributes']['emotion']['surprise']]
    eyemakeup = [face[0]['faceAttributes']['makeup']['eyeMakeup']]
    lipmakeup = [face[0]['faceAttributes']['makeup']['lipMakeup']]
    bald = [face[0]['faceAttributes']['hair']['bald']]
    haircolor = [face[0]['faceAttributes']['hair']['hairColor']]

    # Power BI loads this DataFrame as a table
    face_ds = pd.DataFrame({
        "smile": smile,
        "gender": gender,
        "age": age,
        "glass": glass,
        "anger": anger,
        "contempt": contempt,
        "disgust": disgust,
        "fear": fear,
        "happy": happy,
        "neutral": neutral,
        "sad": sad,
        "surprise": surprise,
        "eyemakeup": eyemakeup,
        "lipmakeup": lipmakeup,
        "bald": bald,
        "haircolor": haircolor
    })

You can test the above code in your Python IDE and see the result, which will be in a table format.
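Extracting each attribute by hand, as above, gets verbose. An alternative sketch is a small helper that flattens the nested JSON into a single-level dict, which pandas can then turn into a one-row table directly. The sample payload below is hypothetical and trimmed:

```python
def flatten(d, parent=""):
    """Flatten nested dicts: {"emotion": {"anger": 0.1}} -> {"emotion.anger": 0.1}."""
    out = {}
    for key, value in d.items():
        name = f"{parent}.{key}" if parent else key
        if isinstance(value, dict):
            out.update(flatten(value, name))
        else:
            out[name] = value
    return out

# Hypothetical, trimmed faceAttributes payload.
attrs = {"smile": 0.88, "gender": "male",
         "emotion": {"anger": 0.0, "happiness": 0.95},
         "makeup": {"eyeMakeup": False, "lipMakeup": False}}

flat = flatten(attrs)
print(flat["emotion.happiness"])
```

You could then build the DataFrame with `pd.DataFrame([flat])` instead of listing every column by hand.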

Power BI Desktop Report

Follow the below steps.

Open Power BI Desktop and choose “Python script” as a data source.

Copy and paste the above code in the editor window.

Click OK and it will load and display the table as shown below.

Load the data and you can use those fields on your report.

As of now, the image path is hardcoded, but you can pass it dynamically by using parameters.
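One way to avoid the hardcoded path is to wrap the request details in a function that takes the image URL as a parameter. A sketch under that idea follows; nothing is sent here, the function just assembles the pieces a `requests.post` call would need, and the region and key are placeholders:

```python
def build_detect_request(image_path, subscription_key, region="your_region"):
    """Assemble Face API request pieces for a given image URL.

    region and subscription_key are placeholders; use your own resource's values.
    """
    return {
        "url": f"https://{region}.api.cognitive.microsoft.com/face/v1.0/detect",
        "headers": {"Ocp-Apim-Subscription-Key": subscription_key,
                    "Content-Type": "application/json"},
        "json": {"url": image_path},
    }

req = build_detect_request("https://www.sitename.com/image1.jpg", "your_key")
print(req["url"])
```

In Power BI, you would surface `image_path` as a query parameter and pass it into a function like this.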

Here is the sample look and feel of the report.

By Dr. SubraMANI Paramasivam

Custom Vision API – Train and Test (No Coding is Required)

Custom Vision is one of the APIs from Azure Cognitive Services, and it falls under the Vision category. The Vision category offers several pre-built APIs that we can use inside our applications without modifying the underlying algorithms.

If we want to create our own vision API, we can use the Custom Vision API. It gives us the capability to train a model and publish it. Like the other APIs, you can easily integrate it with your application. In simple terms, you control the API from training through testing and publishing.

Follow the below steps to create a custom vision API.

Visit, https://www.customvision.ai/

If you don’t have an account, you can easily sign up and get one.

Once you are logged in, you can create a new project.

Provide the name and category of the project that you want to start. Fill in the details and click “Create project”.

Once the project is created, you can see a window like the one below.

Scenario

As we are dealing with a vision API, we need to upload images and tag (group) them. For example, if you upload some dog images and then test a new image, the system will tell you whether or not it is a dog. To achieve this, you need to upload several different dog images and train the system.

Follow the below steps.

Click the “Add images” button and upload all the image files as shown below.

Once the upload completes, click Done.

While adding the images, you can tag them now or later.

Select all the images and click “Tag Images” and tag them.

Once they are tagged, you can see the images under the tagged section.

Now click the “Train” button and train the model. It will take a few seconds to train, and then you can see the results. You also have the option to set the probability threshold.

Now, click the “Quick Test” button, upload some other image, and see the outcome.
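The probability threshold mentioned above simply filters out predictions the model is not confident about. A toy sketch with made-up prediction data, shaped like Custom Vision's tag/probability pairs:

```python
# Made-up predictions, shaped like Custom Vision's tag/probability output.
predictions = [
    {"tag": "dog", "probability": 0.93},
    {"tag": "cat", "probability": 0.12},
]

threshold = 0.5  # the probability threshold configured in the portal

# Keep only the tags whose probability clears the threshold
accepted = [p["tag"] for p in predictions if p["probability"] >= threshold]
print(accepted)
```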

Embed Custom Vision API

Check the Settings page, where you can see the training and prediction keys. Refer to the documentation below to proceed further.

Ref: https://docs.microsoft.com/en-us/azure/cognitive-services/custom-vision-service/python-tutorial
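As a rough sketch of what calling the published model looks like, the snippet below assembles a prediction-by-URL request. The region, project ID and key are placeholders, and the exact endpoint path should be confirmed against the tutorial linked above; nothing is sent here:

```python
# Placeholders -- the real project ID, region and key come from the Settings page.
region = "southcentralus"
project_id = "your-project-id"
prediction_key = "your_prediction_key"

# Rough shape of a prediction-by-URL endpoint; confirm the exact path
# against the tutorial linked above before using it.
prediction_url = (f"https://{region}.api.cognitive.microsoft.com"
                  f"/customvision/v1.0/Prediction/{project_id}/url")

headers = {"Prediction-Key": prediction_key,
           "Content-Type": "application/json"}
body = {"Url": "https://www.sitename.com/test-image.jpg"}
print(prediction_url)
```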

 

 

By Dr. SubraMANI Paramasivam

Empowering Every Person in the Planet, with “Awareness Enabled Reports”

Satya’s quote “Empowering Every Person in the Planet”, from the 2016 Microsoft Inspire conference, certainly left my mind with unstoppable beats. I understood what Satya meant, but I still thought about doing something like this with my own upgraded version of the Data Awareness Programme (which I started in 2014). At that time, all I had were Power View, Power Map and Power Pivot as Excel add-ons, and I tried my best to run the awareness programmes in remote villages by mingling with the villagers, collecting their own data and showing some visuals back to them. I did this to help them improve their lifestyle, find more personal time and earn better. Though the main target was students, I hoped this message would spread to their friends, families and others in remote places.

Following the release of the full version of Power BI, I now have a fully working site with living “Awareness Enabled Reports”, built from sleeping open data sources (taken from various Gov/Non-Gov sites).

I managed to get this far, with a simple equation of A + B + C + D = E (EmpoweringEveryPerson.com (EEP)). Let me explain this in detail.

Following this, I managed to extract data from various open data sources and identified the biggest global challenges that we are encountering and/or that are going to be major threats in the near future.

With our all-time favourite reporting tool, “Microsoft Power BI”, I published some “Awareness Enabled Reports” to the www.EmpoweringEveryPerson.com site and categorized them into regional, national and global challenges, for easy manoeuvring within the EEP site.

 

This EEP site currently has 3 simple goals.

  1. View the “Awareness Enabled Reports” worldwide on any device, categorized by Regional or Global challenges.
  2. Submit another “Awareness Enabled Report” as a Developer, with some guidance. Global Challenge Topics are also listed for selection from open data sources, with tips on convincing the selection committee and finally submitting the story.
  3. Promote EEP
    1. Options are provided to promote EEP as a Developer, User Group Leader & End User.

Below screenshot shows categorization of reports by UK.

 

PROMOTE & PARTICIPATE

As per the above equation, ‘D’ is the support that we need from you, to promote EEP in any of the following ways.

1. DEVELOPERS

Develop “Awareness Enabled Reports” for the listed Global Challenge topics, submit your data story to the EEP site and, once it is published, tweet / share / post it on social media and spread the awareness.

2. COMMUNITY LEADERS

A request to all community group leaders: spend at least a minute at the start or end of your user / local / online group sessions introducing (or re-introducing) this website, and showcase the opportunity for all attendees to build and submit their own data story with “Awareness Enabled Reports”.

3. END USERS / VOLUNTEERS

Every time you see a new “Awareness Enabled Report”, do tweet / share / post it on social media and help spread the awareness.

 

Thanks in advance for your support, and thanks for reading this far to help create awareness with “Awareness Enabled Reports”.

By Dr. SubraMANI Paramasivam

Run Your Python Script

Use the below console to run your Python scripts.

# Assign a value to the variable
a = 5
# Print the variable
print(a)
By Dr. SubraMANI Paramasivam

Try SQL Server 2016 Release Candidate 3 in Azure Virtual Machine today

Don’t let your monthly credits go to waste; if you have some, you can always spin up an Azure virtual machine to try the SQL Server 2016 RC3 Evaluation on Windows Server 2012 R2.

This is great, quick fun, as you don’t need to manually download the ISO file and install it yourself, as I described with step-by-step instructions in this article.

As you can see in the Azure portal screen below, I used New => Virtual Machine and then clicked “See all”, both to see all available options and to search for what I needed. The prebuilt images are already available for all of us to try, and you can see the parts I have highlighted.

SQL2016RC3_Azure_01

Then I got the below screen to confirm what I was about to create, letting me create it with either the Resource Manager or the Classic deployment model. In newer versions, Resource Manager is chosen as the default for all deployments.

SQL2016RC3_Azure_02

In the screen below I have given the basic options: name, username, my subscription, resource group and location. Once you have chosen all your settings through step 5, you will get a fully functional SQL Server 2016 RC3 virtual machine in 10 to 15 minutes. How cool is that?

SQL2016RC3_Azure_03

By Dr. SubraMANI Paramasivam

Step by step instruction on installation of SQL Server 2016 RC3

Following the download of the ISO file from my earlier article, I am going to show you step-by-step instructions for installing SQL Server 2016 RC3.

Step 1 is to run the exe file, which takes you to the Planning tab. The screenshot below shows the options available in the next tab, Installation. You can then click the first option, “New SQL Server stand-alone installation or add features to an existing installation”.

Install_SQL2016RC3_01

Step 2: Here you have the option to evaluate the free edition or enter a product key you have already purchased. I chose the Evaluation, which is the default, and then clicked Next.

Install_SQL2016RC3_02

Step 3: The License Terms screen shows the full terms, which you can read through to accept the license from Microsoft before the installation starts.

Install_SQL2016RC3_03

Step 4: I got an error message in the product updates section, as I did not have an internet connection in my virtual machine; I had problems bridging the connection during this setup process. If you got the same error screen but still have an internet connection, you can click the “Check again” button and then click Next. Alternatively, you can do the product updates later and click Next directly.

Install_SQL2016RC3_04

Step 5: The system then starts installing the necessary setup files and checking rules. In the screen below you can see the warning messages I got. As long as nothing is failing, we are good to go, so click Next.

Install_SQL2016RC3_05

Step 6: Here you get the options to choose the features. As I am mainly testing R, SSRS, PolyBase and Analysis Services, I made sure to check them all here. You can check/uncheck features based on what you need to install.

Install_SQL2016RC3_06

Step 7: Here one task failed because I did not have Oracle JRE 7 Update 51, which is required to support the PolyBase installation. After installing it separately, I was able to continue. Note: until this prerequisite is installed, the Next button won’t be enabled. Once it was installed, I clicked the “Re-run” button to re-check, which enabled the Next button. There is no need to cancel and go through all six steps above again.

Install_SQL2016RC3_07

Step 8: Then I got the option to configure the instance name, and I chose one that reflects RC3.

Install_SQL2016RC3_08

Step 9: As this is a setup on my standalone laptop, I chose to use this SQL Server as a standalone PolyBase-enabled instance.

Install_SQL2016RC3_09

Step 10: The Server Configuration section shows all the relevant account names that will be created and the startup type for each service (Manual or Automatic). When a service is set to Automatic, it starts whenever your system is booted. If it is Manual, you have to start it yourself via the Services console or Configuration Manager.

Install_SQL2016RC3_10

Step 11: In the Database Engine Configuration, I chose Mixed Mode so I could key in my preferred password, which also helps me recover the instance if my current user account ever becomes unusable. I also added the current user, YSMUser1, using the “Add Current User” button. There are also options to choose the data directories, tempdb and FILESTREAM settings.

Here you also have the new option of configuring more than one tempdb data file.

Install_SQL2016RC3_11

Step 12: In the Analysis Services Configuration tab, I chose the Multidimensional and Data Mining server mode and added the current user to grant admin rights.

Install_SQL2016RC3_12

Step 13: In the Reporting Services Configuration, I then have the option to install and configure, or install only. You can still configure it at a later stage; however, for Reporting Services I always choose to get it configured as well, as you can see below.

Install_SQL2016RC3_13

Step 14: For the Distributed Replay controller, I again chose my current user to grant the admin permissions to manage it.

Install_SQL2016RC3_14

Step 15: Then you get to the screen where you can choose the controller name and where the working and result directories should be placed. I made no changes here and left the default installation folders.

Install_SQL2016RC3_15

Step 16: This is brand new in SQL Server 2016: you have to accept the download and installation of Microsoft R Open, and agree to accept patches and updates for the software. This feature lets us work with R from within SQL Server itself.

Install_SQL2016RC3_16

Step 17: I then got this screen to choose the installation path for Microsoft R Open and Microsoft R Server.

Install_SQL2016RC3_17

Step 18: I provided the install path below for Microsoft R Open and Microsoft R Server.

Install_SQL2016RC3_18

Step 19: This is the last part, where you can see the installation summary with all the information you have provided, so you can verify it before installing. You still have the option to go back to any configuration, make changes and then click Next.

Install_SQL2016RC3_19

Step 20: This is the final screen; it says the installation was successful and advises a restart to complete the installation process.

Install_SQL2016RC3_20

Have good findings in your SQL Server 2016 journey.