This post explains the steps you need to perform to find the minimum or maximum value from a list of values, or from a field in a table, in Microsoft Flow.
I will walk through a simple procedure to achieve the result.
1. Take the first number and store it in a variable.
2. Compare that variable against every value in the field.
3. Check whether the current value is greater (for maximum) using a condition (If/Else).
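The three steps above are just a linear scan. Here is a small sketch of the same logic in Python (the sales figures are illustrative, not from the article):

```python
def find_max(values):
    # Step 1: take the first number and store it in a variable
    maximum = values[0]
    # Step 2: compare that variable against every remaining value
    for v in values[1:]:
        # Step 3: if the current value is greater, it becomes the new maximum
        if v > maximum:
            maximum = v
    return maximum

sales = [120, 450, 87, 310]   # illustrative "Sales" column values
print(find_max(sales))        # → 450
```

The Flow steps below implement this same loop with an "Apply to each" action and a variable.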
I am using a Google Sheet that has a column called "Sales" containing some values. I need to find the maximum value.
Follow the steps below in Microsoft Flow.
Step 1 – Use any trigger
Step 2 – Action – "Get rows" (Google Sheets) and select the appropriate sheet.
Step 3 – Action – “Select”
From: "Records value"
Map : “Sales” => Sales
Step 4 – Action – "Initialize variable"
Name: "MaximumSales"; Type: Integer; Value: 0 (this works if the sales values are non-negative; to match step 1 of the procedure exactly, you could instead seed it with the first row's value)
Step 5 – Action – "Apply to each"
Select an output from previous steps: Records value
Inside the loop, add a Condition:
i. int(items('Apply_to_each')?['Sales']) "is greater than" variables('MaximumSales')
ii. If Yes – add a "Set variable" action that sets MaximumSales to int(items('Apply_to_each')?['Sales'])
Step 6 – Action – "Compose" (outside the "Apply to each" loop)
Inputs: MaximumSales
That's it. We now have the maximum value. Stay tuned for a use case built on this logic.
This article explains how to use the Azure Cognitive Services APIs within Microsoft Flow. The Microsoft Flow team has released new connectors for the Azure Cognitive Services APIs, which are currently in preview. They include Computer Vision and the Face API.
Each connector has a different set of actions. We can use those actions by passing the proper inputs through the connections.
To make this clear, I will walk through a scenario with the Face API in Microsoft Flow: processing the "Detect Faces" action and storing the result in an on-premises SQL Server table.
Creating a Face API Resource
To create a Face API, you need an Azure Subscription. If you don’t have a subscription, then you can get a free Azure subscription from here.
Visit portal.azure.com and click “Create a Resource”.
Under New, choose "AI + Machine Learning" -> Face.
Create a new face resource by providing the required details.
Once the resource is created, you need to get the key and URL (EndPoint).
Note down the endpoint and key; we will use them in Microsoft Flow.
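Outside of Flow, you can exercise the same Detect operation directly against the Face API v1.0 REST endpoint. A minimal sketch follows; the endpoint, key, and image URL are placeholders you must replace with the values noted above, and the attribute list matches the columns we store later:

```python
import json
import urllib.request

# Placeholders – replace with the endpoint and key noted down above.
ENDPOINT = "https://<your-region>.api.cognitive.microsoft.com"
KEY = "<your-face-api-key>"

def build_detect_request(image_url):
    """Builds the HTTP request for the Face API v1.0 'Detect' operation."""
    url = (ENDPOINT + "/face/v1.0/detect"
           "?returnFaceAttributes=gender,glasses,smile")
    body = json.dumps({"url": image_url}).encode("utf-8")
    headers = {
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/json",
    }
    return urllib.request.Request(url, data=body, headers=headers, method="POST")

if __name__ == "__main__":
    req = build_detect_request("https://www.sitename.com/image1.jpg")
    # Uncomment to actually call the API (requires a valid key and endpoint):
    # with urllib.request.urlopen(req) as resp:
    #     print(json.load(resp))
```

The "Detect Faces" action in Flow issues essentially this request for you once the connection is configured.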
On-Premises Data Gateway
As you know, Power BI can connect to on-premises data using the on-premises data gateway. This gateway is not only for Power BI; it also serves Logic Apps, Azure Analysis Services, Microsoft Flow, and Power Apps. You can use the same data gateway to connect to on-premises data within Microsoft Flow.
On-premises SQL Server
You need to create two tables for this scenario.
Table 1 – It should hold an ImagePath column. Example – https://www.sitename.com/image1.jpg
Table 2 – To store the API result. Use the structure below.
CREATE TABLE [dbo].[APIFaces](
    [id] [INT] IDENTITY(1,1) NOT NULL,
    [ImagePath] [NVARCHAR](MAX) NULL,
    [Gender] [NCHAR](10) NULL,
    [Glasses] [NVARCHAR](50) NULL,
    [Smile] [FLOAT] NULL,
    CONSTRAINT [PK_APIFaces] PRIMARY KEY CLUSTERED ([id] ASC)
) ON [PRIMARY]
GO
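As a quick sanity check of this structure, here is a self-contained sketch using Python's built-in sqlite3 module as a stand-in for SQL Server (the column types are translated to SQLite equivalents, and the row values are illustrative, not a real API result):

```python
import sqlite3

# In-memory stand-in for the on-premises SQL Server database.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE APIFaces (
        id        INTEGER PRIMARY KEY AUTOINCREMENT,  -- INT IDENTITY(1,1)
        ImagePath TEXT,                               -- NVARCHAR(MAX)
        Gender    TEXT,                               -- NCHAR(10)
        Glasses   TEXT,                               -- NVARCHAR(50)
        Smile     REAL                                -- FLOAT
    )
""")
# Illustrative row, shaped like what the Flow "Insert row" action will write.
conn.execute(
    "INSERT INTO APIFaces (ImagePath, Gender, Glasses, Smile) VALUES (?, ?, ?, ?)",
    ("https://www.sitename.com/image1.jpg", "male", "NoGlasses", 0.88),
)
row = conn.execute("SELECT id, Gender, Smile FROM APIFaces").fetchone()
print(row)  # → (1, 'male', 0.88)
```

In the flow itself, the "Insert row" action performs this insert through the on-premises data gateway.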
You can create a free account on Microsoft Flow, or, if you have an Office 365 subscription, Flow is included by default as one of its features.
You can learn more about Microsoft flow here.
Follow the below steps.
As I mentioned, we are going to use SQL Server with Face API.
To create any flow, we need to set up a trigger. Here, I am using SQL Server as the trigger. SQL Server offers two different trigger options; I am using the one called "When an item is created".
Once that trigger is added, you need to create and map the connection. When you click the "…" option in the right corner, you will get a form to fill in the details to create a connection to your on-premises SQL Server.
Fill the required details and make sure the connection is created successfully.
If the connection is created successfully, you will see the table list as shown below; otherwise, you will get an error message.
Next, add the Face API connector and choose the "Detect Faces" action.
There, too, you need to create a connection with the Face API key and URL. You can provide any name in the connection name field.
The Face API will ask you to provide the image URL. You can easily choose the ImagePath column from the dynamic content.
Next, add SQL Server and choose the "Insert row" action. This time, you can reuse the connection you created above.
Select the table name; it will load the columns from that table, and you need to map the dynamic content to each field.
Once all the fields are mapped, the flow looks like the one below. Sometimes an "Apply to each" loop will be added automatically, because Detect Faces returns an array (an image can contain multiple faces).
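The dynamic content you map here comes from the Detect Faces response, which is a JSON array of faces. A sketch of that extraction in Python (the response below is a hand-written sample shaped like the documented Detect output, with a placeholder face ID, not a real API result):

```python
# Hand-written sample shaped like a Face API "Detect" response.
detect_response = [
    {
        "faceId": "00000000-0000-0000-0000-000000000000",  # placeholder
        "faceAttributes": {
            "gender": "female",
            "glasses": "ReadingGlasses",
            "smile": 0.88,
        },
    }
]

# Flow's auto-added "Apply to each" iterates this array the same way.
rows = []
for face in detect_response:
    attrs = face["faceAttributes"]
    rows.append({
        "Gender": attrs["gender"],    # maps to the Gender column
        "Glasses": attrs["glasses"],  # maps to the Glasses column
        "Smile": attrs["smile"],      # maps to the Smile column
    })

print(rows[0])
```

Each dictionary here corresponds to one "Insert row" call in the flow.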
The final flow looks like the one below. Save and test the flow.
You can check the flow run history for the status, and verify the result in the SQL Server table.