Power BI Governance & Administration: Keys to a Successful Rollout

Data governance and administration are essential when implementing Power BI. These two critical functions ensure data consistency, security and accessibility. They enable fast, reliable reporting, support data collaboration and sharing, and minimize risks associated with data misuse. 

Many organizations adopt Power BI without a proper data governance or admin strategy in place. Only later do they discover that their reporting lacks control, which in turn leads to a lack of a definitive source of truth for critical business decisions.  

Watch our on-demand webinar to learn more about advanced Power BI settings, controls and methodologies that will help ensure the success of your Power BI rollout. We discuss these important topics:   

  • Role delegation 
  • Workspace administration 
  • Data governance 
  • Content administration 
  • Internal/external sharing 

Put the right controls in place BEFORE you implement Power BI and your rollout will be red carpet worthy. 

Presenters

Todd Schuman
Installation, Upgrade and Performance Tuning Practice Lead
Senturus, Inc.  

Todd has over 20 years of business analytics experience across multiple industries. His expertise lies in helping customers architect analytic environments that are high performing yet simple and easy to use. 

Steve Nahrup
Sr. Microsoft Solutions Architect
Senturus, Inc.

Fluent in the Microsoft stack, Steve has more than a decade of experience overseeing the architecture and design of data warehouse, BI, reporting and dashboard solutions. Curious about new technologies, he is constantly downloading free trials of new platforms and arranging meetings with their product teams to discuss an ongoing relationship: a free license in return for his feedback and thoughts on the current state and future releases.

Questions log

Q: Do the same or similar administrative roles exist when using Power BI Report Server?
A: Power BI Report Server administrative capabilities:

  1. Manages on-premises server infrastructure and security: This includes server installation, configuration and ensuring the security of the on-premises environment.
  2. Handles on-premises data source management: Administrators control and maintain data sources located within the organization’s on-premises network.
  3. Publishes, organizes and maintains on-premises reports: Report authors publish and manage reports exclusively within the on-premises server.
  4. Controls on-premises data access and security: Security administrators define and enforce access controls for reports and data stored locally.

Power BI cloud service (Power BI service) administrative capabilities:

  1.  Manages cloud-based environment, user access and licensing: Service administrators oversee the entire cloud-based Power BI environment, including user management and licensing.
  2. Administers specific cloud workspaces: Workspace administrators have control over individual workspaces within the cloud service, managing content and permissions.
  3. Creates, publishes and maintains cloud-based reports and dashboards: Content creators design, publish and update reports and dashboards directly in the cloud service.
  4. Configures role-level security and data source access in the cloud: Security administrators set up role-level security and manage access to cloud-hosted datasets and reports.

Power BI Report Server handles on-premises server and data management while Power BI Cloud service offers cloud-based administrative capabilities for content creation, security and collaboration.

Q: Can we take advantage of pipelines, if we do the majority of our work in Power BI Desktop?
A: Yes. The development workspace in the pipeline should be the only workspace that content is ever published to directly from Desktop. The ability to publish directly should be disabled for the test and production workspaces.
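
Once direct publishing to the test and production workspaces is turned off, promotion happens through the pipeline itself, either in the service UI or programmatically. Below is a rough sketch, not from the webinar, using the Power BI REST API "Pipelines - Deploy All" operation; the pipeline ID and the get_power_bi_access_token() helper (sketched later in this log) are illustrative placeholders, and the caller needs admin rights on the pipeline:

import requests

def deploy_pipeline_stage(pipeline_id, access_token, source_stage=0):
    # Power BI REST API "Pipelines - Deploy All"
    # source_stage 0 deploys Development -> Test, 1 deploys Test -> Production
    url = f"https://api.powerbi.com/v1.0/myorg/pipelines/{pipeline_id}/deployAll"
    headers = {
        "Authorization": f"Bearer {access_token}",
        "Content-Type": "application/json",
    }
    body = {
        "sourceStageOrder": source_stage,
        "options": {
            "allowCreateArtifact": True,     # create items missing in the target stage
            "allowOverwriteArtifact": True,  # update items that already exist there
        },
    }
    response = requests.post(url, headers=headers, json=body)
    response.raise_for_status()  # a 202 Accepted means the deployment is running asynchronously
    return response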

Q: What is the best way to set up dev, QA and prod (which are three workspaces or three separate tenants) in Power BI?
A: Setting up environments for development (dev), quality assurance (QA) and production (prod) in Power BI is a common practice to ensure a smooth development and deployment process. Both the three workspaces and the three separate tenants approaches have their own advantages and disadvantages. Here’s a comparison to help you decide:

  1. Three Workspaces (dev, QA, prod):

Advantages:

  1. Simplicity: All work is done within the same tenant, making it easier to manage and navigate.
  2. Cost-effective: No need for multiple Power BI Pro licenses across different tenants. 
  3. Easy promotion: Moving content from one workspace to another can be done quickly. 
  4. Unified security: All security settings and user permissions are managed within a single tenant. 

Disadvantages: 

  • Risk of overwrite: There’s a risk of accidentally publishing a report to the wrong workspace. 
  • Shared capacity: If you’re using Power BI Premium, all workspaces will share the same capacity.  
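
If you go the three-workspaces route, one way to keep the environments consistent is to script their creation and naming. This is a rough sketch, not from the webinar, using the Power BI REST API "Groups - Create Group" call; the workspace names and the get_power_bi_access_token() helper (sketched later in this log) are illustrative placeholders:

import requests

def create_workspace(name, access_token):
    # Power BI REST API "Groups - Create Group"; workspaceV2=True creates a new-style workspace
    url = "https://api.powerbi.com/v1.0/myorg/groups?workspaceV2=True"
    headers = {
        "Authorization": f"Bearer {access_token}",
        "Content-Type": "application/json",
    }
    response = requests.post(url, headers=headers, json={"name": name})
    response.raise_for_status()
    return response.json()  # includes the new workspace's id

token = get_power_bi_access_token()  # placeholder helper, see the refresh question below
for stage in ("Dev", "Test", "Prod"):
    create_workspace(f"Sales Analytics [{stage}]", token)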

Q: How does Power BI secure the database access by using dataflows as opposed to direct access to create datasets?
A: 1. Power BI dataflows:
Dataflows are a form of ETL within the Power BI service that allows you to connect to, transform and load data into the Power BI environment.

    • Managed environment: Dataflows are executed within the managed Power BI service environment. This means Microsoft takes care of the underlying infrastructure, ensuring it’s secure and compliant.
    • Stored in Power BI: After the data is ingested using dataflows, it’s stored in Azure Data Lake Storage Gen2, which provides enterprise-level security features, including encryption at rest.
    • Access control: With dataflows, you can set up more granular access controls. You can determine who can create, edit or view a dataflow, independent of the datasets that are created from that dataflow.
    • Scheduled refresh: Dataflows support scheduled refreshes. This means you can set them up to pull in new data at specific intervals, reducing the need for frequent direct connections to the source systems.
    • Gateway: If you’re connecting to on-premises data sources, you’d use the on-premises data gateway. This gateway ensures encrypted and secure data transfer from on-premises sources to Power BI.
  2. Direct Access to create datasets:
    When you directly connect to a data source to create a dataset, you’re often connecting in real-time or near-real-time to the data source.

    • Live connections: Some sources, like SQL Server Analysis Services, support live connections. This means that the data stays in the original source and Power BI queries it in real-time. This can be secure if your source system has robust security mechanisms in place.
    • Direct Query: For some relational databases, Power BI supports DirectQuery mode. In this mode, data isn’t imported into Power BI; instead, Power BI sends queries to the data source when a report is viewed.
    • Gateway: Just like with dataflows, if you’re connecting to on-premises data sources directly, you’d use the on-premises data gateway to ensure encrypted and secure data transfer.
    • Stored credentials: For scheduled refresh scenarios, Power BI needs to store credentials to access the data source. These credentials are encrypted and stored securely.
  3. Comparison:
    • Performance: Dataflows might offer better performance for large datasets since the data is pre-processed and stored in an optimized format in Azure Data Lake Storage Gen2.
    • Flexibility: Direct access might be more suitable for real-time analytics needs or when the data source’s native security and business logic (like in SQL Server Analysis Services) need to be used directly.
    • Security granularity: Dataflows offer an additional layer of security granularity since you can control access to the dataflow separately from the datasets and reports that derive from it.

Both methods have their strengths. The choice of which you use often depends on the needs of the project. It’s also worth noting that security in Power BI isn’t just about how data is ingested; it also involves features like row-level security, auditing and compliance tools that Microsoft provides within the Power BI service. 

Q: How do we refresh the Power BI dataset based on database trigger?
A: Refreshing a Power BI dataset based on a database trigger isn’t supported out of the box. However, you can achieve this by combining several components and services. Here’s a high-level process description using Azure services:

Steps:

  1. Database trigger:
    • Have a trigger set up in your database (like SQL Azure) that detects changes (inserts, updates, deletes, etc.).
    • When the trigger fires, it can write a message to an Azure Service Bus or insert a record in an Azure Queue storage.
  2. Azure Functions:
    • Set up an Azure Function that is triggered by the message in Azure Service Bus or Azure Queue storage.
    • Azure Functions call the Power BI API to initiate a dataset refresh.
  3. Power BI API:
    • To programmatically refresh a dataset in Power BI, you’ll use the “Refresh Dataset” endpoint of the Power BI REST API.
    • You’ll need to have an access token (usually obtained via Azure AD) to authenticate and make calls to the Power BI API.
  4. Power BI service:
    • After the API call is received, the Power BI service will initiate the dataset refresh as requested.

Detailed Steps:

  1. Database trigger:
    • In your SQL database, create a trigger that responds to data changes.
    • Use this trigger to send a message to Azure Service Bus or insert into Azure Queue storage.
  2. Azure Service Bus or Azure Queue storage:
    • Set up either service based on your preference. This will serve as the intermediary to notify the Azure Functions.
  3. Azure Functions:
    • Create a new function that gets triggered by the Azure Service Bus or Azure Queue storage.
    • In this function, write code to call the Power BI API. You’ll need the datasetId of your Power BI dataset and an access token for authentication.

Here’s pseudo-code for the Azure Function:

import requests  # third-party HTTP library used to call the Power BI REST API

def azure_function_triggered_by_service_bus():
    dataset_id = "YOUR_DATASET_ID"
    access_token = get_power_bi_access_token()  # Implement this function to get a token from Azure AD (see sketch below)

    headers = {
        "Authorization": f"Bearer {access_token}",
        "Content-Type": "application/json"
    }

    # Power BI REST API "Refresh Dataset" endpoint
    url = f"https://api.powerbi.com/v1.0/myorg/datasets/{dataset_id}/refreshes"
    response = requests.post(url, headers=headers)

    # The service returns 202 Accepted when the refresh has been queued
    if response.status_code == 202:
        print("Refresh initiated successfully.")
    else:
        print("Failed to initiate refresh:", response.text)

  4. Power BI service:
    • Ensure that your dataset in the Power BI service is set up correctly to allow API-based refreshes.
    • Monitor refreshes in the Power BI service to see if they’re being triggered as expected.
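
The get_power_bi_access_token() call in the pseudo-code above is left as an exercise; here is a minimal sketch of one way it could be implemented, assuming the MSAL Python library and an Azure AD (Entra ID) app registration that has been allowed to use Power BI APIs. The tenant, client and secret values are placeholders:

import msal  # Microsoft Authentication Library for Python

def get_power_bi_access_token():
    # Placeholder values -- replace with your own Azure AD app registration details
    tenant_id = "YOUR_TENANT_ID"
    client_id = "YOUR_CLIENT_ID"
    client_secret = "YOUR_CLIENT_SECRET"

    app = msal.ConfidentialClientApplication(
        client_id,
        authority=f"https://login.microsoftonline.com/{tenant_id}",
        client_credential=client_secret,
    )
    # The .default scope requests whatever Power BI permissions were granted to the app
    result = app.acquire_token_for_client(
        scopes=["https://analysis.windows.net/powerbi/api/.default"]
    )
    if "access_token" not in result:
        raise RuntimeError(f"Token request failed: {result.get('error_description')}")
    return result["access_token"]

Note that the service principal (or user) behind this token also needs permission on the dataset’s workspace for the refresh call to succeed.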

Remember to handle errors and exceptions properly in your Azure Functions to account for any issues that might arise during the process. 

This setup allows for a near-real-time refresh of Power BI datasets based on changes in the database. It’s essential to be mindful of the refresh limits imposed by Power BI service, especially if you’re using Power BI Pro or shared capacities in Power BI Premium. 
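
For the monitoring mentioned in step 4, refresh outcomes can also be checked through the API instead of the portal. A small sketch, assuming the same placeholder token helper, using the Power BI REST API "Get Refresh History" endpoint:

import requests

def get_recent_refreshes(dataset_id, access_token, top=5):
    # Power BI REST API "Get Refresh History" -- returns the latest refresh attempts
    url = f"https://api.powerbi.com/v1.0/myorg/datasets/{dataset_id}/refreshes?$top={top}"
    headers = {"Authorization": f"Bearer {access_token}"}
    response = requests.get(url, headers=headers)
    response.raise_for_status()
    # Each entry includes a status (Completed, Failed, Unknown) and start/end times
    return response.json()["value"]

Each entry’s status can be logged or alerted on, which makes it easier to spot refreshes that run into the capacity limits mentioned above.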

Q: You mentioned using certified Power BI datasets. It looks like the only way to allow users to use a dataset to develop a report in their workspace is to make them a member of the workspace that contains the dataset. How do we keep them from editing content in that workspace?
A: If you want users to have the freedom to create those reports along with the cross-workspace data sets, you may need to make them an admin to really work with those datasets as needed. It’s important to map out the roles, access and sensitivity labels so you can set up row-level security for them if necessary. There are a lot of access functions you can set up to make sure users are only seeing what they should be. But in terms of the role, there isn’t a specific role like member, contributor or admin that will give you exactly what you’re asking for. So you need to work through what people should have access to based on what you want them to do. Then you need to tweak all of the row-level security and object-level security audiences and workspace level permissions so they can only see what they are authorized to. While it is possible, you need to clarify and understand what your requirements are for each user. 

Q: Here is our scenario. If we create a Power BI dataset to build a new report and publish the report to a new workspace and then an app is created using the new report and dataset and the app is shared with a user, will the user need to have access to the original dataset to view the content in the app?
A: No, the user does not need direct access to the original dataset to view the content in the app. When you create an app in Power BI and share it with users, those users get access to the reports, dashboards and underlying datasets that are part of the app. This is one of the primary advantages of using apps in Power BI, to package content and share it without requiring users to have access to the original workspace or individual datasets.

Here’s a breakdown:

  1. Workspace: This is where you, as a content creator or developer, create, modify and manage Power BI content. Only members of the workspace need direct access to its content, including datasets.
  2. App: When you publish an app from a workspace, you’re packaging up the content (reports, dashboards, datasets) into a shareable unit. This app can then be shared with end-users.
  3. End users: When you share the app with end users, they can install and view the app’s content without needing access to the original workspace or any of its datasets. They will see the data in the reports and dashboards, but they won’t be able to modify the original content or see the workspace where the content was developed.
  4. Permissions: The permissions for viewing data and interacting with reports/dashboards in the app are based on the app’s settings and row-level security you’ve set up on the dataset. End users don’t need direct permissions on the original dataset; they inherit the necessary permissions through the app.

After you share an app with a user, they can view its content without any additional permissions on the original dataset or workspace. 

Q: Can we use security to allow Power BI users to open in Excel with Open in Desktop App yet still enforce not saving copies of the sheets locally? This would be to satisfy traditional local Excel users but encourage central storage and sharing of worksheets.
A: Power BI integrates with Excel through the Analyze in Excel and Export features. If you want to allow users to interact with Power BI datasets using Excel but prevent them from saving the data locally, you will need to consider a combination of Power BI and Excel/Office 365 settings. 

Here’s a breakdown of what can and can’t be done: 

  1. Power BI settings:
    • Analyze in Excel: When enabled, this feature allows users to create pivot tables and reports in Excel that directly connect to the Power BI dataset. The data remains in the Power BI service and Excel pulls the data in real-time.
    • Export data: You can restrict users from exporting data from Power BI. This will prevent users from exporting summarized data or detailed data from visuals.
  2. Excel/Office 365 settings:
    • Information Rights Management (IRM): With IRM, you can restrict permissions to Excel files. You can allow users to view files but not print, forward or save them. However, this requires Azure Rights Management (part of Azure Information Protection) to be set up and integrated with Office 365.
    • Data Loss Prevention (DLP): Office 365 has DLP policies that can prevent sensitive data from being saved to unauthorized locations. This can be useful if you have data classifications that shouldn’t be saved outside of approved locations.
  3. Limitations:
    • If users can interact with data in Excel, they can still potentially copy data manually, even if you prevent them from saving the Excel file.
    • Restricting users from saving can be cumbersome and might affect the user experience. Ensure that users understand why such restrictions are in place.
  4. User training and culture:
    • Technical solutions are just one part of the puzzle. It’s equally important to train users and cultivate a culture of central storage and sharing. Explain the benefits of centralized data management, the risks of local data copies (like outdated data or data breaches) and encourage best practices.

While you can use a combination of Power BI and Office 365 settings to restrict users from saving Excel files locally, it’s not foolproof. It’s essential to complement technical measures with user training and awareness to achieve the desired outcome.

Q: Can we nest Power BI workspaces inside one another to help with organization or is that on the roadmap for Microsoft to include that functionality?
A: There’s nothing on Microsoft’s roadmap about nesting workspaces right now.
However, with Premium Per User or Premium capacity licenses, you can get much of the same organizational benefit through cross-workspace utilization of datasets, so content isn’t confined to one single workspace. And if you’re thinking about rearchitecting an existing workspace infrastructure, that’s something we can discuss in a separate conversation; contact us. 

Q: How can I see usage auditing data for my entire tenant?
A: Go into the admin portal of your capacity/tenant; there’s an embedded dashboard that shows you utilization metrics and the metadata being used at the user level. Using them in combination with individual user data will give you good insight into which resources are being used, which are overextended and which are bottlenecked. Microsoft is working on an updated version of the embedded capacity/tenant-level utilization report. 
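
If you also want the raw audit events behind those dashboards (who viewed, exported or shared what, and when), they can be pulled programmatically. This is a hedged sketch, not something shown in the webinar, using the Power BI admin REST API "Get Activity Events" endpoint; it requires a Power BI/Fabric administrator (or an authorized service principal), and the token helper and timestamps are placeholders:

import requests

def get_activity_events(access_token, start, end):
    # start/end are ISO 8601 UTC timestamps and must fall within the same day,
    # e.g. "2023-10-01T00:00:00.000Z" and "2023-10-01T23:59:59.999Z"
    url = (
        "https://api.powerbi.com/v1.0/myorg/admin/activityevents"
        f"?startDateTime='{start}'&endDateTime='{end}'"
    )
    headers = {"Authorization": f"Bearer {access_token}"}
    events = []
    while url:
        response = requests.get(url, headers=headers)
        response.raise_for_status()
        payload = response.json()
        events.extend(payload.get("activityEventEntities", []))
        url = payload.get("continuationUri")  # keep following pages until exhausted
    return events

These are the same events that feed the audit/activity log, so they can be loaded into their own dataset for tenant-wide usage reporting.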

Q: I want to create a Power BI certified dataset that provides access to a data mart that has no sensitive data. I want any user to be able to create their own report. I don’t want any user to be able to edit content in the workspace that contains the dataset. How is this done?
A: While this is a little tricky, this can certainly be accomplished by following these step-by-step instructions:

  1. Create the dataset:
    1. In your Power BI Desktop, connect to your data mart and build your dataset.
    2. Publish the dataset to a dedicated workspace in Power BI service. This workspace will hold the certified dataset.
  2. Certify the dataset:
    Before a dataset can be certified, it needs to be promoted. Only datasets in workspaces in Premium capacity or dedicated cloud capacity can be certified.

    1. In the Power BI service, go to the workspace where you published the dataset.
    2. Click more options (the three dots) next to the dataset, then choose Settings.
    3. In the Endorsement section, first promote the dataset. After promotion, you’ll have the option to certify the dataset. (Only users authorized by a Power BI admin can certify datasets.)
  3. Adjust Workspace permissions:
    To ensure that users can’t edit content in the workspace but can use the dataset, you’ll set permissions accordingly:

    1. In the Power BI service, navigate to the workspace containing your dataset.
    2. Click Access in the workspace menu.
    3. Add users or security groups and assign them the Viewer role. The Viewer role allows users to see content in the workspace but not edit or add anything. They can also connect to and create reports from datasets in the workspace.
  4. Allow users to create their own reports:
    1. After the dataset is certified, users across the organization can discover it in the Datasets hub in Power BI service.
    2. Users can connect to this dataset and create their own reports either directly in Power BI service by choosing Get Data > Power BI datasets or in Power BI Desktop by selecting Get Data > Power Platform > Power BI datasets.
    3. When users create their own reports using this dataset, they’ll be doing so in their My Workspace or any other workspace they have permissions to edit. They won’t be creating or modifying content in the workspace that houses the certified dataset.

By following these steps, you’ll have a certified dataset in a controlled workspace, while still allowing users the flexibility to create their own reports using that dataset.

Machine transcript

0:11
Hello everyone and welcome to today’s SENTURUS webinar on Power BI administration and data governance.

0:18
Keys to a successful rollout. A quick bit on our agenda for today: we’ll do some introductions of today’s presenters, talk a bit about role delegation in Power BI, workspace administration, data governance, content administration, and mechanisms for internal and external sharing.

0:40
We’ll do a quick overview of Senturus as a company and some additional resources we have to offer.

0:45
And as I said, we’ll do some Q&A at the end.

0:50
Our presenters for today: first up we have Todd Schuman.

0:54
Todd is a BI Solutions Architect here at Senturus.

0:58
Todd has over 21 years of business analytics experience across multiple industries.

1:03
He is a Microsoft certified Power BI Data Analyst and Todd’s expertise lies in helping customers design analytic environments that are high performing but also simple and easy to use.

1:16
Todd lives with his wife and his two daughters outside of Washington DC in Virginia, and we also have Steve Nahrup here.

1:23
Steve is our practice lead for the Microsoft Fabric practice here at Senturus.

1:29
Steve is a Microsoft Solutions architect fluent in the entire Microsoft stack.

1:34
Steve has more than a decade of experience overseeing architecture and design of data warehouse, BI reporting and dashboard solutions.

1:43
As for me, I’m Steve Reed Pittman, Director of Enterprise Architecture and Engineering.

1:49
As usual, I’m here to do the intros and kind of keep things moving along.

1:53
But for the most part, you’re going to be hearing from Todd and Steve today.

1:56
Todd will be doing the bulk of the presentation and Steve will drop in for a few of the slides here and there.

2:02
So you’ll be hearing from both of them.

2:05
Before we get into the meat of today’s session, we’re going to run a quick poll.

2:09
Let me go ahead and start this up.

2:12
My poll window disappeared on me.

2:15
So one moment.

2:16
Here we go.

2:17
All right, so today’s poll is how confident are you that your company has properly secured governance across your current Power BI environment?

2:26
And so, do you feel your Power BI environment is currently very secure, moderately secure, fairly unsecure or completely unsecure?

2:36
So the answers are coming in.

2:39
It looks like the bulk of you right now are coming in kind of around the moderately secure level, but there’s a little bit of everything.

2:47
So clearly there is a lot of variation out there, and that isn’t uncommon.

2:54
There’s a lot to look at and a lot to consider when you’re securing your environment and trying to implement good governance.

3:03
So again, it looks like about 2/3 of you feel like your environments are moderately secure, and a handful of you have very well secured environments.

3:14
Some of you think completely insecure, and a few of you are also down in that fairly insecure zone.

3:21
So I’m going to go ahead and stop sharing these.

3:23
And with that, I’m going to be quiet and hand it over to Todd.

3:29
Right.

3:29
Thank you, Steve.

3:30
I did want to go back to the intro slide for a second.

3:32
And I apologize for not being named Steve and having a beard slash goatee.

3:37
I think this could have been the greatest webinar ever if we were all named Steve and had similar facial hair.

3:42
So total missed opportunity.

3:44
But maybe next time.

3:46
Anyway, time to get serious.

3:48
Jumping into the meat today, I want to touch on licensing.

3:54
It’s a whole other beast.

3:55
I’m not going to get into all the specifics today.

3:58
It’s also constantly changing with some of the fabric announcements that have been woven into the discussion.

4:06
These are some links here to two different areas of the Microsoft website where you can kind of get the latest information directly from the horse’s mouth.

4:13
If you need any additional help or have questions, be sure to reach out to Kay in the chat and we could set up a call to discuss.

4:20
I’m going to get into some features today and capabilities that do require specific license types.

4:25
So I’m going to try to call those out just so you’re aware of them.

4:28
But there is a lot of information and different flavors of the pricing and licensing out there.

4:34
So use these links and hopefully you can kind of get a little more insight. Now, on to those administrator types.

4:42
There are essentially 3 main ones that you need to be aware of.

4:45
There is the Microsoft 365 Global Administrator.

4:49
This is the very top level.

4:51
It allows you to create users, security groups, assign additional licenses.

4:56
It also gives you full control over the lesser platforms like Power Platform and Power BI.

5:02
The next level below that is the Power Platform Administrator.

5:05
This gives you all the admin capabilities around Dynamics 365, Powerapps, Microsoft Flow, as well as full control over Power BI.

5:14
And then the bottom one which we’re going to be mostly focused on today is the Power BI administrator, which is now known as Fabric Administrator.

5:22
This is going to give you access to Power BI Desktop, the Power BI service.

5:27
Power BI Mobile, and allow you to set up tenant settings, but it does not give you those upper levels like Power Platform or Office 365. And again, just a little asterisk here.

5:40
Fabric is changing things almost at a daily level.

5:43
There are lots of constant updates to these capabilities, so as of today this is currently what it is, but I would recommend bookmarking those sites on the prior slide just so you can be aware of any changes that are happening.

5:57
Admin groups are all configured through the Office 365 console.

6:02
To do so, you could just go into your users, select Manage roles, and you are presented with a big old list of types of administrators.

6:09
I kind of highlighted here the three I’m talking about.

6:12
So you can see it’s now called Fabric Administrator, used to be called Power BI.

6:16
There’s also the Global Administrator and the Power Platform Administrator.

6:20
So depending on how you want to allocate those groups, it’s all kind of done through this, this screen here.

6:29
And then one of the most common mistakes people make when rolling out Power BI is just to kind of open things up.

6:33
Let users jump in before they figure out the security and the user groups.

6:38
And this is a big mistake.

6:40
It’s very important to determine the admin groups and your user groups before you roll things out.

6:46
You need to define these global administrators, your Power Platform administrators, your fabric administrators, as well as spending time creating things now like your workspace administrators, your data stewards, your testers, developers, and consumers by various business functions.

7:01
I’m going to touch on a bunch of these user type groups in a couple of slides, but I just wanted to say, getting ahead of this and spending some time up front to figure out how you want to secure and roll out Power BI is going to make a huge difference when you let users in, and if you have this already set up in place, it’s going to make your rollout much more successful.

7:21
Outside of manually granting groups and user access, there’s something called Privileged Identity Management.

7:27
This is a time based, approval based, role activated tool that allows you to grant temporary access to workspaces and data sets.

7:36
It really helps reduce the risk of excessive, unnecessary or misused access.

7:40
It allows you to define start and end dates so you don’t forget to come back later and remove somebody who should have been removed as an administrator.

7:49
So again, this is a newer feature.

7:51
It is limited by a few prerequisites.

7:55
Specifically, you need a Power BI Pro license and you have to have Entra ID P2.

8:04
The good news is, if you have a Microsoft 365 E5 subscription, these are both included so you’ll be able to do this right away.

8:09
And just for reference, Microsoft Entra ID used to be known as Azure AD.

8:16
For some reason, Microsoft decided to make things even more complicated than they normally are by now renaming Azure AD to Entra ID.

8:23
So if you see that you know going forward, just know that it’s essentially just Azure AD renamed.

8:30
But if you do have those two capabilities, or the E5 license overall, you’ll be able to use these Privileged Identity Management capabilities, which are really nice.

8:40
Going to jump into some workspace administration.

8:48
Power BI workspaces typically contain various objects like dashboards, reports, Excel workbooks, datasets and dataflows.

8:56
You probably have some experience with these currently.

9:00
Just to clarify a couple things, this is a little bit different than the My Workspace area. When I refer to My Workspace,

9:06
I’m not referring to the crusty old desk

9:08
I sit at every day, surrounded by old coffee cups and empty bags of Funyuns.

9:13
I’m talking about a folder in the Power BI service that is visible to just myself.

9:18
There’s no outside access, just I’m the only user who can kind of see in here.

9:23
So there’s no collaboration.

9:24
It’s not something you really need to worry about securing.

9:26
It’s already kind of secured just for you and you alone.

9:31
Outside of that, we have public workspaces that are made to share your dashboards, workbooks, data sources, etc.

9:37
It doesn’t mean that you should open these up for users to go wild.

9:40
Workspace proliferation is the number one issue that we run into on failed Power BI rollouts.

9:46
By not locking down the ability to create new workspaces, you’re letting users create duplicates and conflicting information that’s extremely difficult to undo once it’s begun.

9:56
Even if you are OK now with your rollout, if you don’t put some restrictions in place today,

10:00
things are going to get out of hand before you know it.

10:03
The good news is that it’s very easy to prevent this with just a couple clicks in the Power BI service.

10:11
As an administrator, you have access to the admin portal and there’s an area on your tenant settings around workspace settings.

10:18
What you want to do is you want to prevent the entire organization from being able to create these workspaces, and instead you want to delegate these to a specific group I mentioned.

10:28
You know, earlier on the slide about creating groups and roles?

10:30
Something called like a workspace administrator would be a good idea, obviously.

10:34
Call it whatever you want.

10:35
So a group of people who are responsible for deciding when and where to create additional workspaces, versus just letting people go and put out, you know, Todd’s reports, Bob’s reports, Steve’s reports, and just having all these folders and workspaces floating around with duplicate dashboards, data sets, etc.

10:53
You want to make sure that’s not happening.

10:55
So by creating this workspace administrative group and making them the only ones who have the ability to do this, you’re going to prevent that from happening.

11:04
Steve is going to go over some workspace access and permissions issues, so I’m going to turn it over to him for this slide.

11:12
Hey guys, so when it comes to creating access or allowing access to end users for reports, we’re going to get into this in a little bit.

11:25
But they’ve really made it so that you don’t have to provide every single user direct access to the workspace, and we’ll get into apps very soon.

11:37
But like Todd mentioned before, you really don’t want to set yourself, put yourself in a position where you have to retroactively do things.

11:45
And so when it comes to assigning these roles, Contributor, Member, Admin, instead of doing it at a user level, we highly, highly recommend that you create Azure Security Groups and then you assign each user to a specific role/security group.

12:09
And then you are able to go in and just one time set up the user permissions in the workspace or the app like we’ll see in a second.

12:20
But Admins should be very rare.

12:25
Those are the only people that should be actually creating, publishing and, you know, deleting reports. And then also members:

12:37
those are the ones that should be able to see everything in the workspace. And contributors,

12:43
I recommend really not giving them direct access to the workspace, but to the workspace app.

12:48
So definitely the contributor is the most prevalent or common, members second most.

12:55
And then the admin is again the rarest permission that you should provide.

13:04
Thanks Steve.

13:06
Up next is deployment pipelines.

13:08
So if you are old like me, you probably at some point had a BI tool that lived on premise somewhere and you had three or more environments dev, you know, test/QA, production, maybe additional ones.

13:23
But since we don’t have separate environments in Power BI SaaS, we can simulate that same approach using workspaces that are set up and secured for various business units, with a dev, test and prod workspace for each.

13:38
You can manually create these workspaces as an administrator and move content around as needed and secure them, you know, one by one.

13:46
Or if you have a Pro license and a PPU or a premium embedded capacity, you can use this deployment pipelines capability.

13:55
And these are really handy.

13:56
I think you can define parameters in your content to take advantage of the different servers, database names, schemas, etc.

14:04
And then map them to the specific deployment pipeline area.

14:09
So if you have a data source that points to SQL Dev, SQL QA and SQL Prod, you can have a parameter in your content.

14:17
You define that in the settings of these and once you define them, you can just click a button and say deploy to test, deploy to production and then update your apps.

14:25
And it makes it very seamless and easy to kind of move stuff between those different areas.

14:30
You also can define the user groups.

14:31
So as I mentioned earlier, you know, having like a tester group, a developer group, and then obviously your business user groups assigned into the appropriate buckets here, and spend the time up front to dedicate these to the groups and everything.

14:44
And then once it’s all set up, it’s super easy to just click a button and deploy things when they’re ready.

14:48
So very cool feature.

14:52
It is again tied to a couple elevated license capability type things.

14:57
But if you can, if you can get this or even just for a user or two who’s going to do your administration, I think it’s worth the cost.

15:06
One thing to note really quick about that one is because you have three different workspaces, you shouldn’t necessarily feel obligated to grant everybody access to all three.

15:17
More times than not, it’s a handful of people that have direct access to test and dev and then the end business users should only have access to production most likely.

15:29
So that’s it.

15:31
Yeah, exactly.

15:32
Thank you.

15:34
I’m going to jump into some data governance, talk about some self-service.

15:38
So we can’t have self-service without data governance.

15:42
And this has never been more important than today, as more and more people are moving away from traditional IT based controls to self-service.

15:50
So how do we do this?

15:52
We have to push users to a single centralized data set that has been certified by your data steward group as a trusted, single version of the truth.

16:03
And to help support this early on, we can also pull in and upload Excel files and centralize them in Power BI to get the Excel community to help buy in.

16:14
The single version of the Truth is a term that’s been around for years.

16:18
In the past, you know, IT would publish out models and then users would build reports and dashboards against them and they typically had a high level of confidence in the data they were using.

16:30
The drawback to this was, you know, it was a slow process.

16:33
It usually required lots of back and forth between the business users and the IT.

16:38
Sometimes those, you know, details were lost in translation and it was just slow to kind of get those updates as you needed them.

16:46
Fast forward to today, users can now connect directly to databases.

16:50
They can create their own calculations and filters, publish them right out to Power BI service.

16:56
The downside to this is that not all users really understand or grasp all the complexities of the database or best modeling practices and this can lead to incorrect data and incorrect calculations as well as duplication as all users are kind of just going on their own and just kind of using their own data sets.

17:16
So somehow we have to find a common ground.

17:19
And the good news is that with Power BI it does give us controls to do so with a few clicks.

17:24
The first thing we want to turn on is the sharing of datasets across workspaces.

17:29
If you aren’t sharing datasets across workspaces, it’s going to lead to having multiple copies of the same dataset spread across multiple workspaces, and users are going to be constantly publishing new datasets because they can’t find one that was already available because the sharing wasn’t there.

17:45
And by sharing a dataset that is promoted and or certified across workspaces, you just can then connect directly to that in Power BI Desktop or in the Power BI service and run and create content right off of that.

17:58
It helps eliminate the duplication that would normally occur and it also removes the need to allow users to directly connect to databases as the data sets are already exposed in the fields and calculations they need.

18:07
And then finally, the other benefit would be you know that you can schedule your data refreshes directly on the server and that can help keep your data fresh or as fresh as you need it to be using scheduling.

18:19
I mentioned endorsements earlier and there are two endorsement statuses to be aware of.

18:24
The first is promoted. As a data set owner,

18:27
you can promote a data set within your organization.

18:30
This allows other users to see that there’s a new data set available.

18:34
A quick heads up on that: just because a data set is promoted doesn’t necessarily mean it’s fully trusted or vetted.

18:40
So you need to be careful with that. The next level of trust is called certified.

18:49
Certifying a data set is something that should be assigned to a custom security group as well.

18:53
There are settings in the admin console to do this.

18:57
So don’t let your you know user base go and be able to certify things on their own.

19:03
You want to have a group like a data stewards group.

19:06
You want to make sure that your users are not able to do this as the certified tag is going to lose credibility if anybody can just go and certify something.

19:13
So a trusted group of data stewards or whatever you want to call it, people who really know the data, who have spent some time testing the numbers and validating that the calculations and everything look correct, is going to be the best option here.

19:25
And then lastly, one other benefit of endorsements is that by certifying it, it’s going to cause it to bubble to the top of the list.

19:32
So when you just say connect to, you know, in this screenshot here, Get Data > Power BI datasets, by having it certified it’s going to bubble to the top of the list when they search for stuff.

19:44
So it’s very easy to find them, and promoted datasets get put underneath them as well.

19:51
So certified first, then promoted, then everything else.

19:54
So these two tags can help users identify what they want to connect to and use for building reports, and hopefully eliminate the need for them to go out and do it on their own or kind of recreate something that’s already been done.

20:10
Good old Excel. BI tools have been trying to replace Excel for years, but it’s still here and it’s not going anywhere.

20:19
And like most organizations, especially in the beginning of a new BI tool rollout, there’s going to be a user base that’s still heavily reliant on Excel.

20:29
So the good news is that Microsoft makes both these tools and they do work well together.

20:33
Excel workbooks can be directly uploaded into the Power BI service, and you can then use the Analyze in Excel features, which help bring us back to this centralized approach where everyone is using the same workbooks collaboratively and people are getting away from these local files that live on their desktops, where they’re just kind of going rogue and maintaining their own spreadsheets.

20:54
So we want to kind of get to the centralized concept here.

20:58
I would recommend, as far as this setting goes, making this sort of Excel process a one-way street.

21:04
You don’t want users to upload Excel and do some work and then export it back out, because then you’re leading to more duplication.

21:11
You’re again no longer in sync with this sort of centralized view.

21:15
So obviously special circumstances may apply where people need to do the alternative and, you know, export stuff out of here.

21:22
But as a general rule of thumb, I prefer, as far as all security goes, to kind of lock things down maybe more than you would like, and then poke holes and make small exceptions for specific cases that need to kind of break that overall general rule.

21:36
But it’s always easier to kind of lock something down and let things out slowly than the alternative.

21:44
And then the last sort of topic in this area is data flows.

21:48
These can also be centralized and published to the Power BI service.

21:52
There’s several advantages to doing so.

21:54
Number one, you have the ability to publish out the Power Query transformations to the workspaces, and this allows it to be collaborative so the users who have access to it can go in.

22:06
And if there needs to be changes to the transformations of the data, it doesn’t have to just be you in your local version on your desktop.

22:13
Other people can do it as well, assuming they know what they’re doing.

22:16
So it’s a much better way to kind of manage that.

22:19
The nice advantage to this is to have a lower level of control on the data that’s being refreshed.

22:25
So for example, if you have in your data set a product dimension, most likely that’s not changing daily.

22:31
You may have new products throughout the year, but for the most part that’s pretty static, which is different than your sales fact table.

22:39
So by publishing data flows to the Power BI service, you have more control over the frequency of when tables are getting updated.

22:46
So you could say, you know, update the products table once a week or once a month, and update the sales table, you know, every morning after your ETL, or even throughout the day if that’s something that you’re doing.

22:55
It just gives you a lower level of granularity and control.

22:58
Another thing I forgot to mention was that, you know, this also can help remove access to the database directly.

23:03
So if there’s concerns about letting users into your database, you can kind of remove that by, you know, pushing these data flows into the Power BI service, letting them interact with them directly there.

23:14
So you don’t have to worry about granting database access at that level.

23:21
OK, Content administration.

23:23
There are some areas and content that we want to review and lock down for a better user experience.

23:31
The first one is custom visualizations.

23:34
So if you’ve looked around on the web, you may have seen there’s multiple vendors that provide free or sometimes paid visualizations and they do some really cool things.

23:45
I would just offer some caution that sometimes, you know, there are dangers in these external visualizations.

23:52
So it makes sense to kind of lock down these sort of third party external custom visualizations and then you can review them, you know, and add them on a case by case basis down the road.

24:03
Again, there’s lots of settings and things we can lock down in the admin portal, but there is one that I would recommend you do day one, and that is just turn on the block uncertified visuals setting.

24:17
And what I mean by certified visuals: there is always like a blue starburst.

24:22
I’m not sure if you can see it in the screenshot in the bottom right, but there’s this little sort of blue star next to the different visualizations.

24:30
When you’re looking at them in the store, that indicates that it’s Power BI certified, and that essentially guarantees that there’s not going to be any external calls from Power BI.

24:40
Anything that doesn’t have that, I believe it should be blocked.

24:43
Again, there may be exceptions to that, but you want to make sure that these are Power BI certified.

24:47
So by sliding that toggle on, it’s going to make sure that there’s no external, no uncertified visualizations in there.

24:56
And then furthermore, you can actually customize, add and restrict visuals in the admin portal.

25:02
And doing so will add them to the list of visualizations that are available when people are building reports in the Power BI service, which is different than the desktop.

25:12
So obviously you can build dashboards and reports in both.

25:15
But when you’re working in the Power BI service, it’s going to be a synchronized view.

25:20
So everyone’s going to have, you know, the same custom visualizations available built in, whereas when you’re working in the desktop, there’s not as much control there because it’s not tied to the service.

25:32
So by doing so, you’re kind of giving the certified version to all your users who are going to the web.

25:42
Row level security.

25:43
This is, you know, kind of a generic topic, but it’s important here because it’s another great way to roll out data sets to a large audience as it allows you to present the same data to multiple end users.

25:55
They’re only going to see the data that they’re allowed to see.

25:57
So this eliminates the need to have multiple versions of the same data that maybe, you know, are hard coded to only show specific values.

26:04
So you just create, you know, like an Americas version of the data, you know, a Europe version of the data, something like that.

26:10
That’s just unnecessary.

26:12
One single data set with row level security should be able to fulfill all the different security requirements you have.

26:17
So this capability is baked into the desktop tool and there are multiple ways to accomplish this.

26:24
You can hard code like a user group to a filter such as you know if I had a group called like North American Sales and I had a column in my data that was you know based on region, I can hard code that to be north and then depending on your requirements you know that may be good enough like replicate that for North, South, East, West, something like that.

26:45
That works well enough in some cases, especially for a small number of user groups.

26:50
If you’re looking at a much more large scale type of security, you know, where you need to create 100 or 1000 plus different, you know, ways to secure your data at the row level, it’s probably not going to work.

27:02
So in those cases we use something more like a dynamic row-level security option.

27:06
We can grab information from the user at runtime so that we know who they are when they’re logging in and grab their e-mail address or their other Microsoft attributes and use those to kind of secure it.

27:18
So you can use like a security table that you know has mappings between, you know, different users to user e-mail addresses and tie that back to the main data so that it’s only presenting the data that they should see.

27:29
So again, depending on what your needs are, one or both, those might be the option for you.

27:36
And another nice thing is that it’s very easy to test these.

27:38
There’s a View As feature in the desktop tool where you can impersonate the various groups and roles and confirm that the data is coming back the way you want.

27:47
So before you publish it out make sure you do some testing and use that capability to ensure that everything is working correctly.

27:58
So as I mentioned before, publishing and utilizing workspace apps has become kind of our definite go-to when it comes to rolling out end user reports, primarily because, in association with the row level security, they’ve created what is known as audiences.

28:21
It’s an additional feature that allows you to segment and really slice the report so that only specific user subsets or groups are able to view them.

28:33
So Power BI’s done a great job in terms of providing a whole host of ways in which you can kind of secure and curate the, you know, end state user reports so that only the right people can see them.

28:52
Unlike row level security, where you’re slicing the data based on the user, with audiences it’s at the report level.

29:00
So you’re able to kind of present a single end user report that looks like a single dashboard, but you’re able to hide or show different reports based on the security group or, I guess, silo or department,

29:21
however you decide as an organization to kind of separate or slice up your end users. And these settings,

29:32
this is only done once, right when you create the workspace app.

29:37
So if you were to really roll out a single deployment pipeline, more times than not, you would only create a workspace app for the production app or production workspace.

29:51
So you wouldn’t necessarily have to create, duplicate, or replicate one for dev, test, and production.

29:59
That’s just an awful lot to maintain.

30:01
So that’s what we recommend.

30:04
It’s not mandatory, but that’s just our best practice recommendation.

30:10
Yeah, good point Steve.

30:11
Thank you.

30:12
Yeah, the audience capabilities that they’ve added are really nice too and very user friendly as far as setting up to make sure that, you know, users are seeing the right slice of their data.

30:20
So if you haven’t looked at apps, I would recommend checking them out.

30:26
Last topic of the day is just some overall guidance on internal and external sharing.

30:33
Again, this is a personal preference, but there are lots of features in Power BI that are enabled by default that I believe should be disabled, at least in the beginning.

30:43
You know, obviously there may be a need, case by case, to enable these as they come up, but it’s much easier to, you know, make exceptions to one of these rules, as I said earlier, than to take something away from everybody after it’s already been rolled out.

30:57
So things like publishing to the web, embed codes, exporting things out of Power BI, guest user access, you know, business to business, letting guests share and collaborate, copying and pasting visuals, printing reports and dashboard, all these things are by default allowed out-of-the-box with Power BI.

31:18
You know, I’ve kind of harped on a couple topics over the last few slides, but you know, one of them being, you know, we want to kind of keep things in Power BI we don’t want to duplicate.

31:26
And all these things do is they kind of take stuff out, and they’re static snapshots, they’re no longer in sync.

31:30
So.

31:31
And people like to dump things to PDF and print things out, and that’s always been something people have done.

31:36
But I think it’s better to kind of take it away and wait for people to ask, and make exceptions for specific user groups who need to do that, versus just letting everyone do these things.

31:46
So just my two cents on these topics, I’m going to turn it back over to Steve to close out with some final thoughts.

31:56
So at the end of the day, this entire process, whenever you’re rolling out an environment or you know Power BI reporting infrastructure, this is not a one time event.

32:09
It’s really meant to be an iterative process and kind of a feedback loop.

32:14
So you always want to set it up and set expectations so that people know that you as an organization are always in a position to implement and roll out the monthly updates that Power BI does consistently.

32:32
And you guys are able to access and utilize all the newest, latest and greatest features.

32:40
So you really want to get feedback from the end users in terms of, you know, how they’re doing with it and whether you need to make some updates or changes.

32:49
And so just thinking about it from a process standpoint rather than a one time event really helps.

32:59
So whether that’s setting up a shared document in terms of getting feedback or even utilizing the comments feature which is a little bit more advanced.

33:08
But that’s something that I recommend just in terms of making sure that it’s set up for the vast majority of your end users to get the most out of their Power BI experience.

33:27
SRP, you want to take it away.

33:32
Sure thing.

33:32
Thanks Steve.

33:34
Just a little bit of wrap up stuff here before we get into the Q&A.

33:39
In terms of additional resources, we have tons and tons of free resources on our website.

33:45
We always encourage you to go there and check that out.

33:47
We’ve got tips and tricks, we’ve got blogs, we’ve got demos, we’ve got presentation decks and recordings of prior webinars.

33:56
Hundreds of free resources on our website at senturus.com/resources, so please check that out.

34:02
We also have a few upcoming events we want to make you aware of. Next week,

34:07
We’ve got Microsoft Fabric and you, so if you’re interested in the latest and greatest about Microsoft Fabric, join us next Thursday.

34:15
We’ve also got a Chat with Pat on publishing to and using the Power BI service on October 18th, and another Chat with Pat for those of you who are hybrid Power BI and Cognos shops.

34:29
We’ve got a chat with Pat on Cognos 12 coming up in November and you can register for any or all of those at senturus.com/events.

34:39
A little bit about us as a company here at Senturus.

34:44
We provide a full spectrum of analytic services and enablement.

34:48
We also offer proprietary software to help accelerate bimodal BI and migrations.

34:55
We shine in hybrid BI environments.

34:57
We’ve been in this business for a long time.

35:00
No matter how big or how small your project, we can provide the flexibility and the knowledge to help you get the job done right.

35:09
As I said, we’ve been in this business for a long time.

35:12
We’ve been focused exclusively on business analytics for over 20 years.

35:17
Lots of clients, lots of projects, lots of experience.

35:21
We’ve got a team that’s big enough to meet your needs, but also small enough to provide personal attention.

35:27
So do reach out to us.

35:29
We’re always happy to help and eager to help you with your business analytics environments.

35:35
With that, we’re going to jump into a little bit of Q&A here.

35:38
I know Steve already answered some questions in the Q&A panel in real time.

35:44
So I’m just going to kind of jump around here and not really necessarily in the order that things were presented, but there are a couple of questions here about audiences.

35:57
Let me find my Wow.

35:58
A bunch of questions are coming in now.

36:00
So the first question that caught my eye was whether audiences are similar to the bursting capability and other tools like Cognos.

36:09
I don’t know if one of you can address that.

36:12
I’m way out of my league here with Power BI stuff, so let’s see if our experts here have anything to say on that.

36:23
So the question is whether audiences in Power BI are similar to the bursting capability in other reporting tools.

36:34
So Steve, I know you’re not a Cognos guy, so bursting in Cognos essentially enables you to send out a customized report that’s using row-level security based on a group or a user role.

36:47
So that, say, district managers get a bigger set of data in their sales report than, for example, a city manager.

37:02
Yeah.

37:02
So in a way, yes, you can set it up so that you know a specific role or position can see a certain number of reports.

37:14
But just to be clear, it’s not similar in the sense of setting up an outbound kind of e-mail report.

37:24
It’s just setting it up so that only a specific role can see, you know, five out of the 10 reports in a workspace rather than all 10 reports.

37:36
And again, it’s only at the report level. You can use it in combination with row-level security, but the audiences are simply, you know, hiding or showing certain reports for specific groups that you define within the app itself.

37:59
Yeah, OK, so it’s more about report visibility and not about exactly what data is in those reports.

38:06
Yeah, the bursting is going to give you almost like row-level slices of reports delivered directly to your inbox, usually, or to a local file, where this is going to be more of a higher-level control over which pages of a dashboard you see that are relevant to your role.

38:22
So it’s a little bit similar, but it doesn’t go as far or get as detailed as a burst would. Cool.

38:33
All right.

38:34
We’re going to go on to another question Sabrina asks here.

38:38
Let me just share this out.

38:39
So what would you recommend for a production workspace with many reports?

38:43
Should they all be used by the same audience or would you create a new audience based on the type of report?

38:50
That’s probably your arena, Steve, I think. Yeah, so audiences again, to give an example.

39:00
So in the production workspace you would create the audiences for, maybe like the last example, specific roles like account managers, or even at the department level: finance, marketing, etc.

39:19
So that when a user logs in to the app or goes to view the app, if they are only supposed to see financial reports, you can hide every report except the financial reports, and even an Excel file.

39:34
And then, you know, a different user, such as someone in marketing, is only able to see marketing reports and not see any finance-related reports.

39:48
So it really is just at the report level.

39:52
You’re not filtering the data going into the reports; you’re just simply hitting the show/hide icon at the report level.

40:04
I don’t know if that clarified things, but yeah.

40:08
Let’s take a look at some of these other questions here. Kayla asked, with regard to sharing workspaces:

40:24
Is there a way to prevent sensitive data sets from being shared to unauthorized users?

40:35
Yes. So this is where you would actually utilize the endorsements and sensitivity labels.

40:45
And in the settings of each data set, you’re able to explicitly say whether or not you want the end users to be able to export the data, reshare the data set internally within your organization, or just basically set it up as read-only.

41:08
But that’s kind of the in-between setting; you can configure it so that, if you want people to see it, they can see it.

41:21
But if you only want them to read it and not actually share it externally or even within the organization, you can prevent that as well.

41:30
So that’s how you kind of lock it down and prevent people from being able to export sensitive data, even into Excel.

41:37
So that’s within the data set itself.

41:41
And I actually took out a slide.

41:42
There are capabilities in Power BI to integrate with Microsoft Purview, which has the ability to create custom labels like private or confidential, additional labels outside of the promoted or certified endorsements that I mentioned in the slide deck.

41:59
I didn’t want to get into the whole Purview topic, but if that’s something you’re interested in as well, there are additional capabilities that you can use to do so.
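
As a side note for admins who want to check this kind of thing programmatically rather than clicking through each data set’s settings, the Power BI admin REST API can list who holds which access right on a given data set. The Python sketch below is a minimal illustration under a few assumptions: the Azure AD access token is acquired separately (for example via MSAL), the data set ID is a placeholder, and the endpoint and field names reflect the admin API as we understand it rather than a method covered in the session.

import requests

ACCESS_TOKEN = "<azure-ad-access-token>"   # assumption: obtained separately, e.g. via MSAL
DATASET_ID = "<dataset-guid>"              # placeholder: the sensitive data set to audit

headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# Admin API: list every user and the access right they hold on the data set.
response = requests.get(
    f"https://api.powerbi.com/v1.0/myorg/admin/datasets/{DATASET_ID}/users",
    headers=headers,
)
response.raise_for_status()

for user in response.json().get("value", []):
    right = user.get("datasetUserAccessRight", "")
    # Anything beyond plain Read (rights that include Write or Reshare)
    # is worth reviewing on a data set you consider sensitive.
    if right != "Read":
        print(f"Review: {user.get('identifier')} has {right}")

Running something like this on a schedule gives you a simple check that reshare or edit access has not crept onto data sets you meant to lock down.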

42:12
Cool.

42:13
Thanks guys.

42:15
Another question here from Sabrina.

42:18
Are imports using an on-prem gateway still secure when you’re publishing to the web?

42:25
We’re using embed code.

42:26
I’m not sure I understand that part of the question.

42:29
But just in terms of security, and I think this is also your camp, Steve: are those still secure?

42:37
So publishing to the web and using the embed code, when the sources are going through an on-premises gateway, it’s still secure, but you need to make sure that you are publishing to the web inside a secure portal, if that’s what you want to do, or embedding it in an existing organizational portal.

43:01
So you just want to make sure that you’re not publishing it to a public forum, essentially, and you don’t want to rely heavily on just the security settings applied to an on-premises gateway to handle the security on the other end, from an end-user point of view.

43:28
So you just want to make sure that you’re not embedding it or publishing it to an unsecured portal at the end of the day.

43:41
Great.

43:42
Thanks Steve.

43:44
Let’s see here. Brian asks, does Power BI have a subscription component where it can deliver content via e-mail?

43:52
So this is a little bit like how bursting is often used in Cognos.

43:57
So I think in the past there wasn’t such a feature, but there may be now.

44:03
So there is a new component within Fabric that they just recently released known as Data Activator.

44:16
You can utilize that to set up a recurring e-mail report if you want.

44:23
It depends on how elaborate you want it to get, but end users can also go in and hit Subscribe in the toolbar at the top of the app.

44:36
That will allow them to receive data-triggered alert e-mails, either on a schedule or if a certain metric goes above or below a predefined number.

44:51
So there are multiple ways to do it, but at the end of the day without doing anything, the end users can still go in and hit subscribe.

45:01
So there are varying levels of complexity.

45:05
But yeah, the Subscribe feature or button is probably the path of least resistance.

45:14
Great.

45:14
So it’s nice that that capability is in there.

45:18
Got a couple of questions here from Penny.

45:22
Quick short one: to turn on sharing of data sets or to disable some of the features, do you have to be an Office 365/MS365 admin, or a Fabric or Power BI admin?

45:34
So, any or all of those things?

45:35
Yeah.

45:36
Power BI admin is, you know, a subset of the Power Platform in Office 365.

45:41
So all three of them can do it, but all you need is the Power BI/Fabric administrator role; almost all the settings I mentioned today are Fabric administrator-type permissions that you can control at that level.

45:55
Obviously you can do it at the higher levels as well, since they kind of waterfall upwards.

45:59
But yeah, none of those higher-level roles is needed to do those things.

46:04
OK, so the minimum, Todd, you said, is the Power BI or Fabric administrator role, correct?

46:09
OK, all right.

46:12
Penny also says there seem to be many things that can only be done in Power BI Desktop.

46:19
Our data set in the Power BI service is import mode and is only refreshed daily.

46:24
Contributors cannot use Power Query Editor.

46:26
Is that because it’s using import mode?

46:31
They are really doing their best to move everything that can be done in Desktop to the online Power BI service experience.

46:45
But it’s really a permissions thing, and so for anything that you can do in Desktop, yes, you are correct.

47:00
And in terms of Desktop only being import-mode accessible: when creating any new reports in the service, you actually are kind of capped, or held, to utilizing existing DirectQuery/live connections to existing data sets.

47:25
So it’s kind of a yes-and-no answer, but to be clear, the import mode feature is only accessible via the Desktop version.

47:44
Thank you guys.

47:46
A couple of questions about apps.

47:49
So Paul asks, is it limited to one app per workspace or can you have multiple apps in the workspace?

47:58
So workspace apps are held to a 1 to 1 workspace to app ratio.

48:03
So yeah, you can only create one app per workspace.

48:08
Yeah right.

48:11
And then Benjamin has an app related question also.

48:14
So he says we separate data sets and reports in different workspaces.

48:20
Do you still recommend apps to control access to the data sets?

48:26
I do.

48:27
I think if you’re utilizing workspaces to segment the data sets, it’s still beneficial to maybe even replicate or create a new front-end report within a different workspace that you’re only going to use for the end users.

48:56
So that you’re able to still use the segmentation into workspaces for different data.

49:03
But at the end of the day you’re still able to use a workspace to create the end-user reports, and kind of use the data set workspaces as the back end and that report workspace as the front end.

49:20
So you’re able to leverage the benefits that come with both. Well, sorry, small asterisk: cross-workspace utilization of data sets can only be done via the Premium Per User or Premium capacity licenses, just as an FYI.

49:49
So I recommend that approach if you have a Premium Per User or a Premium capacity license.

49:55
If not, then you’re correct.

50:00
I wouldn’t necessarily recommend doing all of that work if you only have Pro licenses.

50:09
Sorry for the confusion.

50:10
That was a good question.
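
One small, hedged illustration of that setup: on Premium, a report published to a front-end workspace can point at a shared data set that lives in a separate back-end workspace, and that rebinding can even be scripted through the Power BI REST API. The Python sketch below uses placeholder GUIDs and assumes an Azure AD access token with the appropriate Power BI scopes is already in hand; treat it as one possible pattern rather than the specific process the presenters described.

import requests

ACCESS_TOKEN = "<azure-ad-access-token>"             # assumption: acquired separately
REPORT_WORKSPACE_ID = "<front-end-workspace-guid>"   # placeholder
REPORT_ID = "<report-guid>"                          # placeholder
SHARED_DATASET_ID = "<back-end-dataset-guid>"        # placeholder: data set in the other workspace

headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# Rebind the report so it reads from the shared data set in the back-end workspace.
response = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{REPORT_WORKSPACE_ID}"
    f"/reports/{REPORT_ID}/Rebind",
    headers=headers,
    json={"datasetId": SHARED_DATASET_ID},
)
response.raise_for_status()
print("Report rebound to the shared data set.")

Scripting the rebind mainly pays off when you have many front-end reports to repoint after reorganizing workspaces; for a handful of reports, doing it in the service is just as easy.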

50:14
Thanks, Steve.

50:16
Brian has a question here that’s near and dear to the hearts of many administrators.

50:22
As for the periodic auditing component, are there reports or queries against Power BI metadata that provide insight into how the environment is actually being utilized?

50:33
Is any of that readily available to admins?

50:40
The metadata of… sorry. Yeah, I think what Brian is asking is, is there a way to see what the relative utilization of different pieces of content is in your Power BI environment?

50:53
Like, can you tell,

50:53
oh, this query has been executed 100 times in the last month or so, for auditing purposes?

51:02
There is a workspace utilization report that I highly recommend using.

51:11
It’s called something like a usage metrics report, and it actually gives you end-user, row-level transparency into which users are opening up and viewing certain reports and certain pages of reports.

51:34
If there’s a bottleneck, in terms of a long-running query that’s kind of bottlenecking the entire workspace or capacity, you’re able to get some insight into that.

51:46
And like I mentioned at the end, in terms of it being an ongoing, iterative process and not a one-time event, I think utilizing that usage metrics report at the workspace level is very, very important, and I think it’s best practice to do that.
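
For anyone who wants to go beyond the built-in usage metrics report, the tenant-wide activity log can also be pulled programmatically and sliced however you like. The Python sketch below is a minimal, assumption-laden example: it grabs one UTC day of events and counts report views, the access token and dates are placeholders you would supply, and it is a starting point rather than the reporting approach described in this webinar.

from collections import Counter

import requests

ACCESS_TOKEN = "<azure-ad-access-token>"   # assumption: token with Power BI admin API permissions

headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# The activity events endpoint covers at most one UTC day per call,
# with the timestamps wrapped in single quotes.
url = (
    "https://api.powerbi.com/v1.0/myorg/admin/activityevents"
    "?startDateTime='2023-10-01T00:00:00Z'"
    "&endDateTime='2023-10-01T23:59:59Z'"
)

views = Counter()
while url:
    response = requests.get(url, headers=headers)
    response.raise_for_status()
    body = response.json()
    for event in body.get("activityEventEntities", []):
        if event.get("Activity") == "ViewReport":
            views[event.get("ReportName", "unknown")] += 1
    # Results are paged; keep following the continuation link until it runs out.
    url = body.get("continuationUri")

for report_name, count in views.most_common(10):
    print(f"{report_name}: {count} views")

Counting other activity types, such as shares, exports or data set refreshes, follows the same pattern; only the filter on the event name changes.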

52:11
Thanks, Steve.

52:14
All right, we’ve got more and more questions coming here.

52:16
So, another question from Brian: can you nest workspaces inside of one another to help with organization, and/or is that on the road map?

52:26
I know you sometimes have insight into what’s coming on the Microsoft side, Steve, but can you nest workspaces today, and/or do you know if that may be possible in the future?

52:38
So there’s nothing on the road map specifically about nesting the workspaces right now.

52:46
But like I had mentioned a little bit earlier, if you have Premium Per User or Premium capacity licenses, you are able to kind of set up that nesting-like functionality and cross-workspace utilization of data sets outside of one single workspace.

53:09
And so you can kind of do it yourself, but if you’re thinking about rearchitecting an existing workspace infrastructure, that’s something that we can have a separate conversation about if you’d like.

53:30
Cool.

53:30
Doug asks kind of a follow-on question to your earlier comments about auditing, Steve.

53:36
He says, when you answered the question about usage auditing, it seemed to pertain to a single workspace. Is it possible to see that kind of data for an entire tenant, or is that workspace specific?

53:50
So it’s workspace specific in terms of the utilization report that I was referring to previously.

53:58
But there is also another one; they need to release a newer version of it.

54:03
But if you go into the admin portal of your capacity/tenant, there’s an embedded dashboard that actually shows you similar metrics in terms of utilization and the metadata being used at the user level.

54:25
So I think just using both of them in combination will give you very good insight into, you know, which resources are being used, which are overextended, which ones are bottlenecked, etc.

54:43
So I know they are actually working on an updated version of that embedded capacity/tenant-level utilization report, though.

54:59
Great.

55:00
Thanks, Steve.

55:02
Let’s see what else.

55:03
We’re getting low on time here, but we’ll get to a couple more questions now.

55:07
And I’m going to take another Doug question here.

55:11
I feel like we’re playing stump the expert with you, Steve.

55:13
So far you’re doing great.

55:16
So we’ve got... you mentioned using certified data sets.

55:20
It looks like the only way to allow a user to use a data set to develop a report in their workspace is to make them a member of the workspace that contains the actual data set.

55:34
How then would you keep them from editing content in that space?

55:39
So if you want them to actually have the freedom to create those reports along with the cross-workspace data sets, you’re kind of in a position where you need to make them an admin in order to really work with those data sets as needed.

56:03
But at the same time, I would utilize, or really map out, the roles, the access and the sensitivity labels so that you can set up row-level security specifically for them if necessary.

56:19
So there are a lot of, you know, pulleys and levers that you can kind of set up to make sure that people are only seeing what they should be.

56:25
But in terms of the role, there isn’t, you know, a specific role like Member, Contributor or Admin that will give you exactly what you’re thinking of, right?

56:39
Like right out of the gate.

56:42
So you kind of need to really work through what certain people should have access to and what you want them to do.

56:48
And then you need to kind of tweak all of the row-level security, the object-level security, audiences and workspace-level permissions so that they can only see what you want them to see, but also so that they can build what you want them to build.

57:10
So it is possible, it’s just you need to really clarify and understand what your requirements are for that user.

57:23
And that can also kind of play into deployment pipelines too.

57:25
You know, you could have a higher level of access to publish and create content in a development folder that’s not visible to end users, and go through the testing process of promoting it upwards when it’s, you know, verified.

57:38
But yeah, there’s a lot of kind of complexity to that type of question.
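
One concrete piece of that puzzle, the workspace-level permissions, can also be managed through the REST API instead of the portal. The Python sketch below is a hypothetical example that adds a user as a Viewer so they can consume content without editing it; the workspace ID, the e-mail address and the access token are all placeholders, and it illustrates the general API pattern rather than a recommendation from the presenters.

import requests

ACCESS_TOKEN = "<azure-ad-access-token>"   # assumption: acquired separately
WORKSPACE_ID = "<workspace-guid>"          # placeholder
USER_EMAIL = "someone@example.com"         # placeholder

headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# Add the user to the workspace with read-only (Viewer) access.
response = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}/users",
    headers=headers,
    json={
        "identifier": USER_EMAIL,
        "principalType": "User",
        "groupUserAccessRight": "Viewer",  # Contributor, Member or Admin would add edit rights
    },
)
response.raise_for_status()
print(f"{USER_EMAIL} added to the workspace as Viewer.")

Swapping the access right in that payload to Contributor, Member or Admin grants progressively more edit rights, which is why, as noted above, it pays to map out requirements before handing those roles out.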

57:46
Good question.

57:46
All right, thanks guys.

57:48
So we’re at the top of the hour.

57:50
So I’m going to wrap things up here.

57:52
If we didn’t get to the question that you posted in the Q&A panel, we’ll follow up with you after the webinar.

57:58
We typically will also post a collection of the answers to the questions we addressed here.

58:04
So you can keep an eye out for that.

58:07
And with that, I want to thank Todd and Steve for being here today and presenting and answering lots of great questions.

58:16
And thank you all for joining us.

58:18
It’s always great to have you here on our webinars.

58:21
You can always reach out to us.

58:23
You can find us at senturus.com, you can e-mail us [email protected].

58:29
And Todd, if you just jump to the next slide, just so people have that info. You can even call us if you feel like picking up the phone.

58:36
We’ve got an 800 number.

58:38
Always happy to chat with you also.

58:40
Kay Knowles has been posting her calendar link into the chat.

58:45
So if you haven’t already, you can go ahead and click on that link. If you’d like to talk to Kay and/or Todd or Steve, we can get some time on the calendar to talk about your Power BI and governance needs.

58:58
So with that, thank you all for joining us today, and we hope to see you again on a future Senturus webinar. Todd, Steve, thank you again.

59:09
Thanks guys.
