Monthly Archives: August 2015

By Hariharan Rajendran

Recover & Restore Deleted Power BI Workspace (Step by Step)

One of the most common questions I have encountered recently is how to recover a Power BI workspace, and how to restore a deleted workspace in Power BI.

I thought it would be helpful to write a blog post answering those questions. This content applies to the new Power BI workspaces.

Recover and Restore are two different concepts. Let us try to understand the difference first.

Restore – getting back an object that you deleted

Recover – bringing a live object that has gone into a not-responding or orphaned state back to life

In Power BI, you will most often encounter restore scenarios.

To perform the restore and recover operations, you need admin rights in Power BI.

Restore Operation

Let us consider a scenario: one of your team members is working on a maintenance activity and unfortunately deletes one of the business app workspaces. As an admin, you want to restore it.

Follow the below steps

Step 1: Log in to app.powerbi.com and head over to the Admin portal

Step 2: Go to the Workspaces section

Step 3: Find the workspace in the list and check its State

Step 4: Choose the deleted workspace

Step 5: You can see an option called “Restore” on the top. Click Restore.

Step 6: Add at least one user as an admin and click restore. Now, the workspace will be restored.

The contents of the workspace will also be restored.
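For completeness, the same restore can also be scripted. The sketch below is a hypothetical Python helper built around the Power BI admin REST API call for restoring deleted workspaces; the workspace ID and admin email are placeholder values, and the actual POST (with a bearer token) is left as a comment rather than executed.

```python
# Hypothetical sketch: restoring a deleted workspace through the Power BI
# admin REST API instead of the portal. The workspace ID, admin email, and
# new name below are placeholders, not real values.
import json

API = "https://api.powerbi.com/v1.0/myorg/admin/groups/{group_id}/restore"

def build_restore_request(group_id, admin_email, new_name=None):
    """Build the URL and JSON body for the admin restore call."""
    body = {"emailAddress": admin_email}  # at least one admin is mandatory
    if new_name:
        body["name"] = new_name  # optionally rename the workspace on restore
    return API.format(group_id=group_id), json.dumps(body)

url, payload = build_restore_request("11111111-2222-3333-4444-555555555555",
                                     "admin@contoso.com")
# POST `payload` to `url` with an "Authorization: Bearer <token>" header,
# e.g. using the requests library.
```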

 

Recover Operation

To understand workspace recovery, we need to know the different states of a workspace. Usually, we will see the following states: Active and Deleted.

Active – The workspace is live, and users can access the content in the workspace

Deleted – The workspace is deleted and can be restored if you want. Once restored, the workspace changes back to the Active state.

Other than the above, there is another state called "Orphaned". If a workspace changes to orphaned, you need to recover it; otherwise it will not be visible to users and they can't access it.

Let me explain how a workspace becomes orphaned, but please don't try this at your end.

Whenever you create a workspace, the workspace creator is made an admin of the workspace. It is mandatory that at least one user is an admin of the workspace. You can check the access once the workspace is created.

Sometimes you might encounter the below error message. It means you or one of your admin team members removed the admin of that workspace.

If a workspace doesn't have an admin, it will change to the orphaned state.

In Power BI, you shouldn't be able to leave a workspace without an admin, but there is still an option that lets you remove an admin user from the workspace.

 

I don't know why the product has this option; as per my understanding, this should be fixed. Let me show how to remove the admin users as well. Please don't try this with your production workspaces.

 

Follow the below steps

Step 1: Log in to app.powerbi.com and head over to the Admin portal

Step 2: Go to the Workspaces section

Step 3: Find the workspace in the list and check its State

Step 4: Choose the orphaned workspace

Step 5: Click Recover at the top. The Recover option is available only for orphaned workspaces

Step 6: Add an admin user to the workspace, and it will be recovered.

Thanks

Please send your feedback.

By Hariharan Rajendran

Microsoft Flow – Find Min or Max Value

This post explains the steps you need to perform to find the minimum or maximum value from a list of values, or from a field of a table, in Microsoft Flow.

I am explaining a simple procedure to achieve the result.

Procedure:

  1. Take the first number and store it in a variable
  2. Compare that variable to all the values from the field
  3. Check for the maximum (greater than) value with an If condition

Scenario:

I am taking a Google Sheet which has a column called "Sales". It has some values, and I need to find the maximum value.

Follow the below steps in Microsoft Flow.

Step 1 – Use any trigger

Step 2 – Action – “Get Rows – Google Sheet” and select the appropriate sheet.

Step 3 – Action – “Select”

        From : “Record value”

        Map : “Sales” => Sales

Step 4- Action – “Initialize Variable”

        Name : “MaximumSales”

        Type: “Integer”

        Value: int(first(body('Select'))?['Sales'])

Step 5 – Action – “Apply to each “

         Select an output from previous steps: Records value

        Condition

                i. int(items('Apply_to_each')?['Sales']) "is greater than" MaximumSales

                ii. If Yes

        Set Variable

               Name: MaximumSales

               Value: int(items('Apply_to_each')?['Sales'])

Step 6 – Action – “Compose”

       Inputs : MaximumSales

That's it. We got the maximum value. Stay tuned for a use case with this logic.
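The three-step procedure above maps directly onto plain code. Here is a small Python sketch of the same initialize-then-compare loop, using made-up sample rows in place of the Google Sheet values:

```python
# A plain-Python sketch of the Flow logic above: seed the variable with the
# first value, then loop and keep the larger one. The sample rows stand in
# for the "Sales" column returned by the Get Rows action.
rows = [{"Sales": "120"}, {"Sales": "340"}, {"Sales": "95"}, {"Sales": "210"}]

# Step 4: initialize the variable with the first value (as int)
maximum_sales = int(rows[0]["Sales"])

# Step 5: apply-to-each with a greater-than condition
for row in rows:
    value = int(row["Sales"])
    if value > maximum_sales:   # the "is greater than" condition
        maximum_sales = value   # Set Variable

# Step 6: compose / output the result
print(maximum_sales)
```

For a minimum, the only change is flipping the comparison to "is less than".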

By Hariharan Rajendran

SAMEPERIODLASTYEAR in SSRS Report builder

This article explains how to get the SAMEPERIODLASTYEAR behaviour in SSRS, achieving the same result as in Power BI.

The challenging part is that we don't have a SAMEPERIODLASTYEAR function in SSRS, so we need to look for another function that gives the same result.

SSRS has a "Previous" function, and we are going to use it to achieve the result. Let us get started.

First, let us see the result of SAMEPERIODLASTYEAR in Power BI:

SAMEPERIODLASTYEAR = CALCULATE(SUM(DemoSales[Sales]),
    SAMEPERIODLASTYEAR(DemoSales[SalesDate]))

Let us replicate the same result in SSRS.

First, understand the functionality of the "Previous" function.

It gives the result of the previous row, as in the line below.

We need to tune the Previous function to get the same result as SAMEPERIODLASTYEAR in Power BI.

Follow the below steps.

  1. Create a parent group called Year using the "Year" column
  2. Create a child group called Month using the "Month" column
  3. Create a child group called Day using the "Day" column
  4. Change the Previous function expression as below.

=Previous(Sum(Fields!Sales.Value,"Day"),"Year")

  5. That's it. We achieve the same result as in Power BI.

 

Warning:

The Previous function works only on the data currently displayed. For example, if you create a parameter for year and filter for the year 2019, it will not show any value for the previous year.
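To build intuition, here is a small plain-Python sketch (with made-up sample data) of what the tuned Previous expression computes: for each day it looks up the Sales total of the same day one year earlier, which is what SAMEPERIODLASTYEAR returns in Power BI.

```python
# Plain-Python intuition sketch with made-up sample data. Keys are
# (year, month, day) group totals, mirroring the Year/Month/Day groups
# in the SSRS tablix.
sales = {
    (2018, 1, 5): 100.0,
    (2018, 1, 6): 150.0,
    (2019, 1, 5): 180.0,
    (2019, 1, 6): 200.0,
}

def same_period_last_year(year, month, day):
    # No value when the prior-year row is not in the displayed data,
    # which is exactly the warning above about filtered years.
    return sales.get((year - 1, month, day))

print(same_period_last_year(2019, 1, 5))
```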

By Hariharan Rajendran

Power BI Embedded Service Principal – Postman

The Microsoft Power BI team recently released an update to Power BI Embedded that lets you use a service principal wherever required. In this article, I explain how to add a service principal to a Power BI service workspace, which cannot be done directly in the UI.

A service principal should be added to Power BI using the REST API. Follow the below steps.

First, create a new Power BI workspace.

Add the service principal as a user with admin access. To perform this step, we need to use the Power BI REST API.

I have used Postman to complete this step.

First, I generated the authorization token: POST https://login.microsoftonline.com/common/oauth2/token

Then added the service principal into the Power BI workspace.
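For readers who prefer code over Postman, the two calls can be sketched in Python. This is a hypothetical helper: the tenant ID, client ID, secret, and workspace ID are placeholders, and only the request building is shown; the actual POSTs would be sent with a library such as requests.

```python
# Hypothetical sketch of the two Postman calls. The token request uses the
# client-credentials grant against Azure AD; the second call adds the
# service principal to the workspace as an Admin. All IDs are placeholders.
import json

TOKEN_URL = "https://login.microsoftonline.com/{tenant_id}/oauth2/token"
USERS_URL = "https://api.powerbi.com/v1.0/myorg/groups/{group_id}/users"

def build_token_request(tenant_id, client_id, client_secret):
    """Form fields for the OAuth2 client-credentials token call."""
    form = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": "https://analysis.windows.net/powerbi/api",
    }
    return TOKEN_URL.format(tenant_id=tenant_id), form

def build_add_user_request(group_id, app_object_id):
    """URL and JSON body for adding the service principal as workspace admin."""
    body = {
        "identifier": app_object_id,
        "groupUserAccessRight": "Admin",
        "principalType": "App",   # marks this entry as a service principal
    }
    return USERS_URL.format(group_id=group_id), json.dumps(body)

token_url, token_form = build_token_request("<tenant-id>", "<app-id>", "<app-secret>")
users_url, users_body = build_add_user_request("<workspace-id>", "<sp-object-id>")
```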

 

By Hariharan Rajendran

Dataset Refresh in PowerApps Custom visual

As we know, Power BI can be integrated with PowerApps. Check the below approaches.

  1. In Power Apps, we can use Power BI tiles
  2. In Power BI, we can use Power Apps custom visual.

 

This article explains the latest update to the Power Apps custom visual. Let us take a scenario.

I need to embed PowerApps within a Power BI report. Whenever an item is added through PowerApps, I need to refresh the dataset used by the Power BI report.

For the above scenario, we previously needed Microsoft Flow to achieve the result: in Microsoft Flow, we needed a custom connector to refresh the Power BI dataset.

With the latest update to the PowerApps custom visual, we no longer need that step. I mean, we no longer need Microsoft Flow to refresh the dataset; a refresh method (PowerBIIntegration.Refresh()) is now available on the Power BI integration data source itself.

It is a very good feature for Power BI + PowerApps scenarios.

By Hariharan Rajendran

DAX Group by and Filter

This post explains how DAX can be used for group-by and filter logic. To explain this scenario, I am taking some example data.

Example Data:

I have the following table.

EmpID | EmpDesignation | EmpDepartment | DesigFrom
1     | A              | IT – Finance  | 01/02/2018
1     | SA             | IT – Sales    | 05/08/2018
2     | M              | Marketing     | 05/07/2018
2     | SM             | HR            | 25/08/2018
3     | A              | Sales         | 06/06/2018

 

Requirement

As per the requirement, I need to perform the below actions.

  1. A date slicer should be available to filter From and To dates
  2. Based on the filter, a table visual should display the latest designation.

Expected Result

Filter – Start Date: 01/02/2018 to 31/08/2018

EmpID | EmpDesignation | MostEffective
1     | SA             | 05/08/2018
2     | SM             | 25/08/2018
3     | A              | 06/06/2018

 

To achieve the above result, we need to write some DAX. Use the below DAX script.

MostEffectiveDate = CALCULATE(
    MAX(Employee[DesigFrom]),
    FILTER(Employee, Employee[EmpDesignation] = EARLIER(Employee[EmpDesignation])))

 

Sometimes, you may receive an error due to the EARLIER function. Check the other version of the script, without EARLIER.

MostEffectiveDate =
VAR Desig = MAX(Employee[EmpDesignation])
RETURN
CALCULATE(
    MAX(Employee[DesigFrom]),
    FILTER(Employee, Employee[EmpDesignation] = Desig))
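The same group-by-and-filter idea can be sketched in plain Python (using the sample data above) to see what the DAX computes: within the slicer's date range, keep each employee's row with the latest DesigFrom date.

```python
# Plain-Python sketch of the group-by-and-filter logic: within the date
# slicer's range, keep each employee's row with the maximum DesigFrom,
# mirroring the MAX + FILTER DAX above. Sample rows match the example table.
from datetime import date

rows = [
    (1, "A",  date(2018, 2, 1)),
    (1, "SA", date(2018, 8, 5)),
    (2, "M",  date(2018, 7, 5)),
    (2, "SM", date(2018, 8, 25)),
    (3, "A",  date(2018, 6, 6)),
]

def latest_designation(rows, start, end):
    filtered = [r for r in rows if start <= r[2] <= end]   # the date slicer
    latest = {}
    for emp_id, desig, from_date in filtered:
        if emp_id not in latest or from_date > latest[emp_id][1]:
            latest[emp_id] = (desig, from_date)            # keep the max date
    return latest

# Filter – Start Date: 01/02/2018 to 31/08/2018
result = latest_designation(rows, date(2018, 2, 1), date(2018, 8, 31))
```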

 

By Hariharan Rajendran

Python Script to Event Hub – Power BI Sentiment Analysis ML Function

This article explains a Python script that sends data to an event hub.

As we know, Event Hub is an Azure service designed to capture streaming events from different devices or applications.

The below Python script sends a sentiment string to the event hub.

import logging
import time
import json

from azure.eventhub import EventHubClient, EventData

logger = logging.getLogger("azure")

ADDRESS = "amqps://<eventhubworkspacename>.servicebus.windows.net/<eventhubname>"

# SAS policy and key are not required if they are encoded in the URL
USER = "RootManageSharedAccessKey"
KEY = "<Key>"

try:
    if not ADDRESS:
        raise ValueError("No EventHubs URL supplied.")

    # Create the Event Hubs client and a sender on partition 0
    client = EventHubClient(ADDRESS, debug=False, username=USER, password=KEY)
    sender = client.add_sender(partition="0")
    client.run()

    try:
        start_time = time.time()
        for i in range(1):
            print("Sending message: {}".format(i))
            # Build the event payload as valid JSON
            payload = json.dumps([{"ts": 1550321522, "Message": "good", "TestVal": 7}])
            print(payload)
            sender.send(EventData(payload))
    except:
        raise
    finally:
        end_time = time.time()
        client.stop()
        run_time = end_time - start_time
        logger.info("Runtime: {} seconds".format(run_time))

except KeyboardInterrupt:
    pass

 

 

Applications

The above script sends data to the event hub, which can be used as an input to the Azure ML function in Stream Analytics.

The output of the Stream Analytics query can then feed a Power BI streaming dataset.

 

By Dr. SubraMANI Paramasivam

Microsoft Flow with Face API

This article explains how to handle the Azure Cognitive Services APIs within Microsoft Flow. The Microsoft Flow team has released new connectors for the Azure Cognitive Services APIs, which are in preview now. They include Computer Vision and the Face API.

Each connector has a different set of actions. We can use those actions by passing the proper input to the connections.

To make it clear, I am explaining a scenario with the Face API in Microsoft Flow. I will explain how you can process the "Detect Faces" action and store the result in an on-premises SQL Server table.

Requirements

  1. Face API URL & Key
  2. On-Premises Data Gateway – SQL Server
  3. Microsoft Flow – Free subscription or O365 subscription

Creating the Face API

To create a Face API, you need an Azure Subscription. If you don’t have a subscription, then you can get a free Azure subscription from here.

Visit portal.azure.com and click “Create a Resource”.

Under new, choose "AI + Machine Learning" -> Face

Create a new face resource by providing the required details.

Once the resource is created, you need to get the key and URL (EndPoint).

Note down the endpoint and key; we will use them in Microsoft Flow.

On-Premises Data Gateway

As you know, Power BI can connect to on-premises data using the on-premises data gateway. This gateway is not only for Power BI; it also serves Logic Apps, Azure Analysis Services, Microsoft Flow and Power Apps. You can use the same data gateway to connect to on-premises data within Microsoft Flow.

On-premises SQL Server

You need to create two tables for this scenario.

Table 1 – It should hold the Image Path column. Example – https://www.sitename.com/image1.jpg

Table 2 – To store the API result. Use the below structure.

CREATE TABLE [dbo].[APIFaces](
       [id] [INT] IDENTITY(1,1) NOT NULL,
       [ImagePath] [NVARCHAR](MAX) NULL,
       [Gender] [NCHAR](10) NULL,
       [Glasses] [NVARCHAR](50) NULL,
       [Smile] [FLOAT] NULL,
 CONSTRAINT [PK_APIFaces] PRIMARY KEY CLUSTERED
(
       [id] ASC
) ON [PRIMARY]
)
GO

Microsoft Flow

You can create a free account on Microsoft Flow, or if you have an O365 subscription then you get Flow by default as one of its features.

You can learn more about Microsoft flow here.

Follow the below steps.

As I mentioned, we are going to use SQL Server with Face API.

To create any flow, we need to set up a trigger. Here, I am using SQL Server as the trigger. SQL Server has 2 different trigger options; I am using the trigger called "When an item is created".

Once you have added that trigger, you need to create and map the connection. When you click the "..." option in the right corner, you will get a form to fill in the details to create a connection with your on-premises SQL Server.

Fill the required details and make sure the connection is created successfully.

If the connection is created successfully, you can see the tables list as below; otherwise you will get an error message.

In the next step, add the Face API and choose the "Detect Faces" action.

There, too, you need to create a connection with the Face API key and URL. You can provide any name in the connection name field.

Face API will ask you to provide the image URL.

You can easily choose the ImagePath from the dynamic content.

Next, add SQL Server and choose “Insert Row” action.

This time, you can use the same connection which you created above.

Select the table name. It will load the columns from the table. You need to map the dynamic content on each field.

Once all the fields are mapped, you can see the flow as below. Sometimes, an "Apply to each" loop will be added automatically.

The final flow would look like the below. Save and test the flow.

You can check the flow history for flow status and check the result on SQL Server table.
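As a reference, the HTTP call that the "Detect Faces" action performs can be sketched in Python. This is a hypothetical request builder: the endpoint region and key are placeholders, and returnFaceAttributes asks for the fields the APIFaces table stores; the actual POST would be sent with a library such as requests.

```python
# Hypothetical sketch of the Face API "detect" request the flow performs.
# The endpoint and key are placeholders; the image URL matches the Table 1
# example above.
import json

def build_detect_request(endpoint, key, image_url):
    """URL, headers, and JSON body for the Face API v1.0 detect call."""
    url = (endpoint.rstrip("/")
           + "/face/v1.0/detect?returnFaceAttributes=gender,glasses,smile")
    headers = {
        "Ocp-Apim-Subscription-Key": key,   # the key noted down earlier
        "Content-Type": "application/json",
    }
    body = json.dumps({"url": image_url})   # image to analyse, by URL
    return url, headers, body

url, headers, body = build_detect_request(
    "https://westus.api.cognitive.microsoft.com",
    "<face-api-key>",
    "https://www.sitename.com/image1.jpg")
```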