Power BI Governance & Administration: Keys to a Successful Rollout

Data governance and administration are essential when implementing Power BI. These two critical functions ensure data consistency, security and accessibility. They enable fast, reliable reporting, support data collaboration and sharing, and minimize risks associated with data misuse. 

Many organizations adopt Power BI without a proper data governance or admin strategy in place. Only later do they discover that their reporting lacks control, leaving them without a definitive source of truth for critical business decisions.

Watch our on-demand webinar to learn more about advanced Power BI settings, controls and methodologies that will help ensure the success of your Power BI rollout. We discuss these important topics:   

  • Role delegation 
  • Workspace administration 
  • Data governance 
  • Content administration 
  • Internal/external sharing 

Put the right controls in place BEFORE you implement Power BI and your rollout will be red carpet worthy.


Todd Schuman
Installation, Upgrade and Performance Tuning Practice Lead
Senturus, Inc.  

Todd has over 20 years of business analytics experience across multiple industries. His expertise lies in helping customers architect analytic environments that are high performing yet simple and easy to use.

Steve Nahrup
Sr. Microsoft Solutions Architect
Senturus, Inc.

Fluent in the Microsoft stack, Steve has more than a decade of experience overseeing the architecture and design of data warehouse, BI, reporting and dashboard solutions. Curious about new technologies, he is constantly downloading free trials of new platforms and meeting with their product teams, providing ongoing feedback on the current state and future releases in exchange for a free license.

Machine transcript

Hello everyone and welcome to today’s Senturus webinar on Power BI administration and data governance.

Keys to a successful rollout. A quick bit on our agenda for today: we’ll do some introductions of today’s presenters, talk a bit about role delegation in Power BI, workspace administration, data governance, content administration, and mechanisms for internal and external sharing.

We’ll do a quick overview of Senturus as a company and some additional resources we have to offer.

And as I said, we’ll do some Q&A at the end.

Our presenters for today: first up we have Todd Schuman.

Todd is a BI Solutions Architect here at Senturus.

Todd has over 21 years of business analytics experience across multiple industries.

He is a Microsoft certified Power BI Data Analyst and Todd’s expertise lies in helping customers design analytic environments that are high performing but also simple and easy to use.

Todd lives with his wife and his two daughters outside of Washington, DC in Virginia. And we also have Steve Nahrup here.

Steve is our practice lead for the Microsoft Fabric practice here at Senturus.

Steve is a Microsoft Solutions architect fluent in the entire Microsoft stack.

Steve has more than a decade of experience overseeing architecture and design of data warehouse, BI reporting and dashboard solutions.

As for me, I’m Steve Reed Pittman, Director of Enterprise Architecture and Engineering.

As usual, I’m here to do the intros and kind of keep things moving along.

But for the most part, you’re going to be hearing from Todd and Steve today.

Todd will be doing the bulk of the presentation and Steve will drop in for a few of the slides here and there.

So you’ll be hearing from both of them.

Before we get into the meat of today’s session, we’re going to run a quick poll.

Let me go ahead and start this up.

My poll window disappeared on me.

So one moment.

Here we go.

All right, so today’s poll is how confident are you that your company has properly secured governance across your current Power BI environment?

Do you feel your Power BI environment is currently very secure, moderately secure, fairly unsecure or completely unsecure?

So the answers are coming in.

It looks like the bulk of you right now are coming in kind of around the moderately secure level, but there’s a little bit of everything.

So clearly there is a lot of variation out there and that isn’t uncommon there.

There’s a lot to look at and a lot to consider when you’re securing your environment and trying to implement good governance.

So again, it looks like about 2/3 of you feel like your environments are moderately secure, and a handful of you have very well secured environments.

Some of you are at completely insecure, and a few of you are down in that fairly insecure zone.

So I’m going to go ahead and stop sharing these.

And with that, I’m going to be quiet and hand it over to Todd.


Thank you, Steve.

I did want to go back to the intro slide for a second.

And I apologize for not being named Steve and having a beard slash goatee.

I think this could have been the greatest webinar ever if we were all named Steve and had similar facial hair.

So total missed opportunity.

But maybe next time.

Anyway, time to get serious.

Jumping into the meat today, I want to touch on licensing.

It’s a whole other beast.

I’m not going to get into all the specifics today.

It’s also constantly changing with some of the Fabric announcements that have been woven into the discussion.

There are some links here to two different areas of the Microsoft website where you can get the latest information directly from the horse’s mouth.

If you need any additional help or have questions, be sure to reach out to Kay in the chat and we could set up a call to discuss.

I’m going to get into some features today and capabilities that do require specific license types.

So I’m going to try to call those out just so you’re aware of them.

But there is a lot of information and different flavors of the pricing and licensing out there.

So use these links and hopefully you can get a little more insight.

Now, on to those administrator types. There are essentially three main ones that you need to be aware of.

There is the Microsoft 365 Global Administrator.

This is the very top level.

It allows you to create users, security groups, assign additional licenses.

It also gives you full control over the lesser platforms like Power Platform and Power BI.

The next level below that is the Power Platform Administrator.

This gives you all the admin capabilities around Dynamics 365, Power Apps, Microsoft Flow, as well as full control over Power BI.

And then the bottom one which we’re going to be mostly focused on today is the Power BI administrator, which is now known as Fabric Administrator.

This is going to give you access to Power BI Desktop, the Power BI service and Power BI Mobile, and it allows you to set up tenant settings.

But it does not give you those upper levels like Power Platform and Office 365. And again, just a little asterisk here.

Fabric is changing things almost at a daily level.

There are lots of constant updates to these capabilities, so as of today this is currently what it is, but I would recommend bookmarking those sites on the prior slide just so you can be aware of any changes that are happening.

Admin groups are all configured through the Office 365 admin console.

To do so, you could just go into your users, select Manage roles, and you are presented with a big old list of types of administrators.

I kind of highlighted here the three I’m talking about.

So you can see it’s now called Fabric Administrator, used to be called Power BI.

There’s also the Global Administrator and the Power Platform Administrator.

So depending on how you want to allocate those groups, it’s all kind of done through this, this screen here.

And then one of the most common mistakes people make when rolling out Power BI is just to kind of open things up.

Let users jump in before they figure out the security and the user groups.

And this is a big mistake.

It’s very important to determine the admin groups and your user groups before you roll things out.

You need to define these global administrators, your Power Platform administrators, your fabric administrators, as well as spending time creating things now like your workspace administrators, your data stewards, your testers, developers, and consumers by various business functions.

I’m going to touch on a bunch of these user type groups in a couple of slides, but I just wanted to say, getting ahead of this and spending some time up front to figure out how you want to secure and roll out Power BI is going to make a huge difference when you let users in, and if you have this already set up in place, it’s going to make your rollout much more successful.

Outside of manually granting groups and user access, there’s something called Privileged Identity Management.

This is a time based, approval based, role activated tool that allows you to grant temporary access to workspaces and data sets.

It really helps reduce the risk for excessive or unnecessary misused access.

It allows you to define start and end dates so you don’t forget to come back later and remove somebody who should have been removed as an administrator.

So again, this is a newer feature.

It is limited by a few prerequisites.

Specifically, you need a Power BI Pro license and you have to have Entra ID P2.

The good news is that if you have a Microsoft 365 E5 subscription, these are both included, so you’ll be able to do this right away.

And just for reference, Microsoft Entra ID used to be known as Azure AD.

For some reason, Microsoft decided to make things even more complicated than they normally are by now renaming Azure AD to Entra ID.

So if you see that going forward, just know that it’s essentially Azure AD renamed.

But if you do have those two capabilities, or the E5 license overall, you’ll be able to use these Privileged Identity Management capabilities, which are really nice.
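PIM itself is configured in Entra ID, but the core time-boxed idea — a role grant with an explicit start and end date that simply stops working after the window closes — can be sketched in a few lines of Python. The names here are hypothetical, purely for illustration:

```python
from datetime import datetime, timedelta, timezone

def grant_temporary_access(grants, user, role, days):
    """Record a role grant with an explicit start and end date."""
    now = datetime.now(timezone.utc)
    grants[user] = {"role": role, "start": now, "end": now + timedelta(days=days)}

def active_role(grants, user, at=None):
    """Return the user's role only while the grant window covers `at`, else None."""
    at = at or datetime.now(timezone.utc)
    g = grants.get(user)
    if g and g["start"] <= at <= g["end"]:
        return g["role"]
    return None

grants = {}
grant_temporary_access(grants, "todd@example.com", "Fabric Administrator", days=7)
print(active_role(grants, "todd@example.com"))  # inside the 7-day window
```

The point of the sketch: nobody has to remember to revoke anything, because the expiry is part of the grant itself.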

Going to jump into some workspace administration.

Power BI workspaces typically contain various objects like dashboards, reports, Excel workbooks, datasets and dataflows.

You probably have some experience with these currently.

Just to clarify a couple things, this is a little bit different than the My Workspace area.

When I refer to My Workspace, I’m not referring to the crusty old desk I sit at every day, surrounded by old coffee cups and empty bags of Funyuns.

I’m talking about a folder in the Power BI service that is visible to just myself.

There’s no outside access, just I’m the only user who can kind of see in here.

So there’s no collaboration.

It’s not something you really need to worry about securing.

It’s already kind of secured just for you and you alone.

Outside of that, we have public workspaces that are made to share your dashboards, workbooks, data sources, etc.

It doesn’t mean that you should open these up for users to go wild.

Workspace proliferation is the number one issue that we run into on failed Power BI rollouts.

By not locking down the ability to create new workspaces, you’re letting users create duplicates and conflicting information that’s extremely difficult to undo once it’s begun.

Even if you are OK with your rollout now, if you don’t put some restrictions in place today, things are going to get out of hand before you know it.

The good news is that it’s very easy to prevent this with just a couple clicks in the Power BI service.

As an administrator, you have access to the admin portal and there’s an area on your tenant settings around workspace settings.

What you want to do is prevent the entire organization from being able to create these workspaces, and instead delegate that to a specific group.

I mentioned, you know, earlier on the slide about creating groups and roles, that something called a workspace administrator group would be a good idea, obviously.

Call it whatever you want.

This is a group of people who are responsible for deciding when and where to create additional workspaces, versus just letting people go and put out, you know, Todd’s reports, Bob’s reports, Steve’s reports, and just having all these folders and workspaces floating around with duplicate dashboards, datasets, etc.

You want to make sure that’s not happening.

So by creating this workspace administrator group and making them the only ones who have the ability to do this, you’re going to prevent that from happening.
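The gating logic behind that tenant setting is simple enough to sketch. The real control lives in the admin portal (tenant settings for workspace creation); this Python sketch, with hypothetical group names, just illustrates the membership check:

```python
# Hypothetical security group delegated the right to create workspaces.
WORKSPACE_CREATORS = {"workspace-admins"}

def can_create_workspace(user_groups):
    """Only members of the delegated security group may create workspaces."""
    return bool(WORKSPACE_CREATORS & set(user_groups))

print(can_create_workspace(["workspace-admins", "finance"]))  # True
print(can_create_workspace(["finance"]))                      # False
```

Everyone else still consumes content; they just can’t mint new workspaces, which is exactly what stops the proliferation problem.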

Steve is going to go over some workspace access and permissions issues, so I’m going to turn it over to him for this slide.

Hey guys. So when it comes to allowing access to end users for reports, we’re going to get into this in a little bit.

But they’ve really made it so that you don’t have to provide every single user direct access to the workspace, and we’ll get into apps very soon.

But like Todd mentioned before, you really don’t want to put yourself in a position where you have to retroactively do things.

And so when it comes to assigning these roles, Contributor, Member, Admin, instead of doing it at a user level, we highly, highly recommend that you create Azure Security Groups and then you assign each user to a specific role/security group.

And then you are able to go in and just one time set up the user permissions in the workspace or the app like we’ll see in a second.

But Admins should be very rare.

Those are the only people that should actually be creating, publishing and, you know, deleting reports.

Then there are also members: those are the ones that should be able to see everything in the workspace.

And contributors, I recommend really not giving them direct access to the workspace, but to the workspace app.

So definitely the contributor is the most prevalent or common, members second most.

And then the admin is again the rarest permission that you should provide.

Thanks Steve.

Up next is deployment pipelines.

So if you are old like me, you probably at some point had a BI tool that lived on premise somewhere and you had three or more environments dev, you know, test/QA, production, maybe additional ones.

But since we don’t have separate environments in Power BI’s SaaS model, we can simulate that same approach using workspaces that are set up and secured for various business units, with a dev, test and prod workspace for each.

You can manually create these workspaces as an administrator, move content around as needed, and secure them, you know, one by one.

Or if you have a Pro license and PPU or a Premium or Embedded capacity, you can use this deployment pipelines capability.

And these are really handy.

I think you can define parameters in your content to take advantage of the different servers, database names, schemas, etc.

And then map them to the specific deployment pipeline area.

So if you have a data source that points to SQL Dev, SQL QA and SQL Prod, you can have a parameter in your content.

You define that in the settings of these, and once you define them, you can just click a button and say deploy to test, deploy to production, and then update your apps.

And it makes it very seamless and easy to kind of move stuff between those different areas.

You also can define the user groups.

So, as I mentioned earlier, have your tester group, your developer group, and then obviously your business user groups assigned into the appropriate buckets here, and spend the time up front to dedicate these to the groups.

And then once it’s all set up, it’s super easy to just click a button and deploy things when they’re ready.

So very cool feature.

It is again tied to a couple elevated license capability type things.

But if you can get this, even just for a user or two who are going to do your administration, I think it’s worth the cost.
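The parameter mapping Todd describes — one parameterized data source, resolved to a different server per pipeline stage — can be illustrated with a small sketch. Server and database names here are hypothetical stand-ins for the parameter rules you’d define in a pipeline’s settings:

```python
# Hypothetical per-stage parameter values, like a pipeline's data source rules.
STAGE_PARAMS = {
    "dev":  {"server": "SQLDEV",  "database": "sales_dev"},
    "test": {"server": "SQLQA",   "database": "sales_qa"},
    "prod": {"server": "SQLPROD", "database": "sales"},
}

def connection_string(stage):
    """Resolve the parameterized data source for a given pipeline stage."""
    p = STAGE_PARAMS[stage]
    return f"Server={p['server']};Database={p['database']}"

for stage in ("dev", "test", "prod"):
    print(stage, "->", connection_string(stage))
```

The content itself never changes between stages; only the parameter values do, which is what makes the one-click deploy between workspaces seamless.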

One thing to note really quick about that one is because you have three different workspaces, you shouldn’t necessarily feel obligated to grant everybody access to all three.

More times than not, it’s a handful of people that have direct access to test and dev and then the end business users should only have access to production most likely.

So that’s it.

Yeah, exactly.

Thank you.

I’m going to jump into some data governance, talk about some self-service.

So we can’t have self-service without data governance.

And this has never been more important than today, as more and more people are moving away from traditional IT-based controls to self-service.

So how do we do this?

We have to push users to a single centralized data set that has been certified by your data steward group as a trusted, single version of the truth.

And to help support this early on, we can also pull in and upload Excel files and centralize them in Power BI to get the Excel community to help buy in.

The single version of the truth is a term that’s been around for years.

In the past, you know, IT would publish out models and then users would build reports and dashboards against them and they typically had a high level of confidence in the data they were using.

The drawback to this was, you know, it was a slow process.

It usually required lots of back and forth between the business users and the IT.

Sometimes those, you know, details were lost in translation and it was just slow to kind of get those updates as you needed them.

Fast forward to today: users can now connect directly to databases.

They can create their own calculations and filters, publish them right out to Power BI service.

The downside to this is that not all users really understand or grasp all the complexities of the database or best modeling practices and this can lead to incorrect data and incorrect calculations as well as duplication as all users are kind of just going on their own and just kind of using their own data sets.

So somehow we have to find a common ground.

And the good news is that with Power BI it does give us controls to do so with a few clicks.

The first thing we want to turn on is the sharing of datasets across workspaces.

If you aren’t sharing datasets across workspaces, it’s going to lead to multiple copies of the same dataset spread across multiple workspaces, and users are going to be constantly publishing new datasets because they can’t find one that was already available.

And by sharing a dataset that is promoted and/or certified across workspaces, users can then connect directly to that in Power BI Desktop or in the Power BI service and create content right off of that.

It helps eliminate the duplication that would normally occur and it also removes the need to allow users to directly connect to databases as the data sets are already exposed in the fields and calculations they need.

And then finally, the other benefit is that you can schedule your data refreshes directly in the service, and that can help keep your data as fresh as you need it to be.

I mentioned endorsements earlier and there are two endorsement statuses to be aware of.

The first is promoted.

As a dataset owner, you can promote a dataset within your organization.

This allows other users to see that there’s a new data set available.

A quick heads up on that: just because a dataset is promoted doesn’t necessarily mean it’s fully trusted or vetted.

So you need to be careful with that, and that next level of trust is called certified.

Certifying a dataset is something that should be assigned to a custom security group as well.

There are settings in the admin console to do this.

So don’t let your, you know, user base be able to certify things on their own.

You want to have a group like a data stewards group.

You want to make sure that your users are not able to do this as the certified tag is going to lose credibility if anybody can just go and certify something.

So a trusted group of data stewards, or whatever you want to call it — people who really know the data, who have spent some time testing the numbers and validating that the calculations and everything look correct — is going to be the best option here.

And then lastly, one other benefit of endorsements is that by certifying it, it’s going to cause it to bubble to the top of the list.

So in this screenshot here, when you go to Get data > Power BI datasets, having it certified means it’s going to bubble to the top of the list when people search for stuff.

So it’s very easy to find them, and additionally, you know, promoted datasets get put underneath them as well.

So certified first, then promoted, then everything else.

So these two tags can help users identify what they want to connect to and use for building reports, and hopefully eliminate the need for them to go out and do it on their own or recreate something that’s already been done.
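That "certified first, then promoted, then everything else" ordering is easy to picture as a sort key. A small sketch, with hypothetical dataset names, of how endorsement drives what users see first:

```python
# Rank endorsements: certified outranks promoted outranks unendorsed.
ENDORSEMENT_RANK = {"Certified": 0, "Promoted": 1, None: 2}

def sort_datasets(datasets):
    """Certified first, then promoted, then everything else (alphabetical within)."""
    return sorted(datasets, key=lambda d: (ENDORSEMENT_RANK[d["endorsement"]], d["name"]))

datasets = [
    {"name": "Sales Scratchpad", "endorsement": None},
    {"name": "Finance Model",    "endorsement": "Promoted"},
    {"name": "Sales Model",      "endorsement": "Certified"},
]
print([d["name"] for d in sort_datasets(datasets)])
# ['Sales Model', 'Finance Model', 'Sales Scratchpad']
```

That ordering is why endorsement matters beyond the badge itself: the vetted dataset is the first one a report builder encounters.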

Good old Excel. BI tools have been trying to replace Excel for years, but it’s still here and it’s not going anywhere.

And like most organizations, especially in the beginning of a new BI tool rollout, there’s going to be a user base that’s still heavily reliant on Excel.

So the good news is that Microsoft makes both these tools and they do work well together.

Excel workbooks can be directly uploaded into the Power BI service, and you can then use the Analyze in Excel feature, which helps bring us back to this centralized approach where everyone is using the same workbooks collaboratively, and people get away from these local files that live on their desktops where they’re just kind of going rogue and maintaining their own spreadsheets.

So we want to kind of get to the centralized concept here.

As far as this setting goes, I would recommend making this sort of Excel process a one-way street.

You don’t want users to upload Excel, do some work, and then export it back out, because then you’re leading to more duplication.

You’re again no longer in sync with this sort of centralized view.

So obviously special circumstances may apply where you do the alternative and people, you know, export stuff out of here.

But as a general rule of thumb, with all security I prefer to lock things down maybe more than you would like, and then poke holes and make small exceptions for specific cases that need to break that overall general rule.

But it’s always easier to lock something down and open it up slowly than the alternative.

And then the last topic in this area is dataflows.

These can also be centralized and published to the Power BI service.

There are several advantages to doing so.

Number one, you have the ability to publish out the Power Query transformations to the workspaces, and this allows it to be collaborative, so users who have access to it can go in.

And if there needs to be changes to the transformations of the data, it doesn’t have to just be you in your local version on your desktop.

Other people can do it as well, assuming they know what they’re doing.

So it’s a much better way to kind of manage that.

The nice advantage to this is having a finer level of control over the data that’s being refreshed.

So for example, if you have in your data set a product dimension, most likely that’s not changing daily.

You may have new products throughout the year, but for the most part that’s pretty static, which is different than your sales fact table.

So by publishing dataflows to the Power BI service, you have more control over the frequency of when tables are getting updated.

So you could say, you know, update the products table once a week or once a month, and update the sales table every morning after your ETL, or even throughout the day if that’s something that you’re doing.

It just gives you a finer level of granularity and control.
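A per-table schedule like that boils down to "has this table’s refresh interval elapsed since its last refresh?" A minimal sketch, with hypothetical table names and intervals:

```python
from datetime import datetime, timedelta

# Hypothetical schedule: slow-changing dimensions weekly, fact tables daily.
REFRESH_EVERY = {
    "dim_product": timedelta(weeks=1),
    "fact_sales":  timedelta(days=1),
}

def due_for_refresh(table, last_refreshed, now):
    """True once the table's refresh interval has elapsed."""
    return now - last_refreshed >= REFRESH_EVERY[table]

now = datetime(2024, 6, 10, 6, 0)
print(due_for_refresh("fact_sales",  datetime(2024, 6, 9, 6, 0), now))  # True
print(due_for_refresh("dim_product", datetime(2024, 6, 9, 6, 0), now))  # False
```

The dataflow scheduler applies exactly this kind of per-table cadence, so the static product dimension isn’t needlessly re-pulled every time the sales facts refresh.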

Another thing I forgot to mention is that this can also help remove direct access to the database.

So if there are concerns about letting users into your database, you can remove that need by, you know, pushing these dataflows into the Power BI service and letting users interact with the data directly there.

So you don’t have to worry about granting database access at that level.

OK, Content administration.

There are some areas and content that we want to review and lock down for a better user experience.

The first one is custom visualizations.

So if you’ve looked around on the web, you may have seen there’s multiple vendors that provide free or sometimes paid visualizations and they do some really cool things.

I would just offer some caution that sometimes, you know, there are dangers in these external visualizations.

So it makes sense to kind of lock down these sort of third party external custom visualizations and then you can review them, you know, and add them on a case by case basis down the road.

Again, there are lots of settings and things we can lock down in the admin portal, but there is one that I would recommend you do day one, and that is to turn on the block uncertified visuals setting.

And what I mean by certified visuals: there is always a little blue starburst.

I’m not sure if you can see it in the screenshot in the bottom right, but there’s this little sort of blue star next to the different visualizations.

When you’re looking at them in the store, that indicates that it’s Power BI certified, and that essentially guarantees that there’s not going to be any external calls from Power BI.

Anything that doesn’t have that, I believe, should be blocked.

Again, there may be exceptions to that, but you want to make sure that these are Power BI certified.

So by sliding that toggle on, it’s going to make sure that there are no uncertified visualizations in there.

And then furthermore, you can actually customize, add and restrict visuals in the admin portal.

And doing so will add them to the list of visualizations that are available when people are building reports in the Power BI service, which is different than the desktop.

So obviously you can build dashboards and reports in both.

But when you’re working in the Power BI service, it’s going to be a synchronized view.

So everyone’s going to have, you know, the same custom visualizations available built in, whereas when you’re working in the desktop there’s not as much control, because it’s not tied down to the service.

So by doing this, you’re giving the certified versions to all your users who are going to the web.

Row level security.

This is, you know, kind of a generic topic, but it’s important here because it’s another great way to roll out data sets to a large audience as it allows you to present the same data to multiple end users.

Each user is only going to see the data that they’re allowed to see.

So this eliminates the need to have multiple versions of the same data that maybe, you know, are hard coded to only show specific values.

Otherwise you’d just create, you know, like an Americas version of the data, a Europe version of the data, something like that.

That’s just unnecessary.

One single dataset with row-level security should be able to fulfill all the different security requirements you have.

So this capability is baked into the desktop tool and there are multiple ways to accomplish this.

You can hard code, like, a user group to a filter. For example, if I had a group called North American Sales and I had a column in my data based on region, I could hard code that to be North, and then, depending on your requirements, that may be good enough — replicate that for North, South, East, West, something like that.

That works well enough in some cases, especially for a small number of user groups.

If you’re looking at a much more large-scale type of security, where you need to create 100 or 1,000-plus different ways to secure your data at the row level, that’s probably not going to work.

So in those cases we use something more like a dynamic row-level security option.

We can grab information from the user at runtime so that we know who they are when they’re logging in and grab their e-mail address or their other Microsoft attributes and use those to kind of secure it.

So you can use, like, a security table that has mappings between different users’ e-mail addresses and the data, and tie that back to the main data so that it’s only presenting the data that they should see.

So again, depending on what your needs are, one or both of those might be the option for you.
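In Power BI the dynamic approach is done with a DAX filter against the signed-in user, but the underlying join is easy to sketch. Here is a minimal Python illustration, with a hypothetical security-mapping table, of filtering one shared dataset per user:

```python
# Hypothetical security-mapping table: user e-mail -> regions they may see.
USER_REGIONS = {
    "todd@example.com":  {"North"},
    "steve@example.com": {"North", "South"},
}

def visible_rows(rows, user_email):
    """Dynamic RLS idea: return only rows whose region the user is mapped to."""
    allowed = USER_REGIONS.get(user_email, set())
    return [r for r in rows if r["region"] in allowed]

sales = [
    {"region": "North", "amount": 100},
    {"region": "South", "amount": 200},
    {"region": "West",  "amount": 300},
]
print(len(visible_rows(sales, "todd@example.com")))   # 1
print(len(visible_rows(sales, "steve@example.com")))  # 2
```

One dataset, one report, and each user’s identity at runtime determines the slice — no Americas copy and Europe copy to maintain.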

And another nice thing is that it’s very easy to test these.

There’s a View As feature in the desktop tool, and you can impersonate the various groups and roles and confirm that the data is coming back the way you want.

So before you publish it out make sure you do some testing and use that capability to ensure that everything is working correctly.

So as I mentioned before, publishing and utilizing workspace apps has become kind of our definite go-to when it comes to rolling out end-user reports, primarily because, in association with row-level security, they’ve created what is known as audiences.

It’s an additional feature that allows you to segment and really slice the report so that only specific user subsets or groups are able to view them.

So Power BI’s done a great job in terms of providing a whole host of ways in which you can secure and curate the end-state user reports so that only the right people can see them.

Where row-level security slices the data based on the user, audiences do it at the report level.

So you’re able to present a single end-user report that looks like a single dashboard, but you’re able to hide or show different reports based on the security group or, I guess, silo or department.

However you decide as an organization to separate or slice up your end users, these settings are only done once, right when you create the workspace app.

So if you were to roll out a single deployment pipeline, more times than not you would only create a workspace app for the production workspace.

So you wouldn’t necessarily have to create, duplicate, or replicate one for dev, test, and production.

That’s just an awful lot to maintain.

So that’s what we recommend.

It’s not mandatory, but that’s just our best practice recommendation.

Yeah, good point Steve.

Thank you.

Yeah, the audience capabilities that they’ve added are really nice too, and very user friendly to set up, to make sure that users are seeing the right slice of their data.

So if you haven’t looked at apps, I would recommend checking them out.

Last topic of the day is just some overall guidance on external internal sharing.

Again, this is a personal preference, but there are lots of features in Power BI that are enabled by default that I believe should be disabled, at least in the beginning.

You know, obviously there may be a need, case by case, to enable these as they come up, but it’s much easier to make exceptions to one of these rules, as I said earlier, than to take something away from everybody after it’s already been rolled out.

So things like publishing to the web, embed codes, exporting things out of Power BI, guest user access, you know, business-to-business, letting guests share and collaborate, copying and pasting visuals, printing reports and dashboards: all these things are allowed by default out-of-the-box with Power BI.

You know, I’ve kind of harped on a couple topics over the last few slides, one of them being that we want to keep things in Power BI and we don’t want to duplicate.

And all these things do is they kind of take stuff out and they’re static snapshots, they’re no longer In Sync.


People like to dump things to PDF and print them out; that’s always been something people have done.

But I think it’s better to take that away and wait for people to ask, making exceptions for specific user groups who need it, versus just letting everyone do these things.

So just my two cents on these topics, I’m going to turn it back over to Steve to close out with some final thoughts.

So at the end of the day, whenever you’re rolling out an environment or a Power BI reporting infrastructure, this entire process is not a one-time event.

It’s really meant to be an iterative process and kind of a feedback loop.

So you always want to set it up, and set expectations, so that people know that you as an organization are always in a position to implement and roll out the monthly updates that Power BI releases consistently.

That way users are able to access and utilize all the newest, latest and greatest features.

You also want to get feedback from the end users in terms of how they’re doing with it and whether you need to make updates or changes.

Thinking about it from a process standpoint rather than as a one-time event really helps.

So whether that’s setting up a shared document for collecting feedback, or even utilizing the comments feature, which is a little bit more advanced.

But that’s something that I recommend just in terms of making sure that it’s set up for the vast majority of your end users to get the most out of their Power BI experience.

SRP, you want to take it away.

Sure thing.

Thanks Steve.

Just a little bit of a wrap up stuff here before we get into the Q&A.

In terms of additional resources, we have tons and tons of free resources on our website.

We always encourage you to go there and check that out.

We’ve got tips and tricks, we’ve got blogs, we’ve got demos, we’ve got presentation decks and recordings of prior webinars.

Hundreds of free resources on our website at senturus.com/resources, so please check that out.

We also have a few upcoming events we want to make you aware of.

Next week we’ve got Microsoft Fabric and You, so if you’re interested in the latest and greatest about Microsoft Fabric, join us next Thursday.

We’ve also got a chat with Pat on publishing to and using the Power BI service on October 18th, and for those of you who are hybrid Power BI and Cognos shops, another chat with Pat.

We’ve got a chat with Pat on Cognos 12 coming up in November and you can register for any or all of those at senturus.com/events.

A little bit about us as a company here at Senturus.

We provide a full spectrum of analytic services and enablement.

We also offer proprietary software to help accelerate bimodal BI and migrations.

We shine in hybrid BI environments.

We’ve been in this business for a long time.

No matter how big or how small your project, we can provide the flexibility and the knowledge to help you get the job done right.

As I said, we’ve been in this business for a long time.

We’ve been focused exclusively on business analytics for over 20 years.

Lots of clients, lots of projects, lots of experience.

We’ve got a team that’s big enough to meet your needs, but also small enough to provide personal attention.

So do reach out to us.

We’re always happy to help and eager to help you with your business analytics environments.

With that, we’re going to jump into a little bit of Q&A here.

I know Steve already answered some questions in the Q&A panel in real time.

So I’m just going to kind of jump around here and not really necessarily in the order that things were presented, but there are a couple of questions here about audiences.

Let me find my... wow.

A bunch of questions are coming in now.

So the first question that caught my eye was whether audiences are similar to the bursting capability and other tools like Cognos.

I don’t know if one of you can address that.

I’m way out of my league here with Power BI stuff, so let’s see if our experts have anything to say on that.

So the question is whether audiences in Power BI are similar to the bursting capability in other reporting tools.

Steve, I know you’re not a Cognos guy, so: bursting in Cognos enables you to send out a customized report that uses row-level security based on a group or a user role.

So that, say, district managers get a bigger set of data in their sales report than, for example, a city manager.


So in a way, yes: you can set it up so that a specific role or position can see a certain number of reports.

But just to be clear, it’s not similar in the sense that it’s setting up an outbound e-mail report.

It’s just setting it up so that a specific role can see, say, five out of the 10 reports in a workspace rather than all 10.

And again, it’s only at the report level. You can use it in combination with row-level security, but the audiences are simply hiding or showing certain reports for specific groups that you make up within the app itself.

Yeah, OK, so it’s more about report visibility and not about exactly which data is in those reports.

Yeah, bursting is going to give you almost row-level slices of reports, usually delivered directly to your inbox or to a local file, whereas this is more of a higher-level control over which pages of a dashboard you see that are relevant to your role.

So it’s a little bit similar, but it doesn’t go as far or get as detailed as a burst would. Cool.

All right.

We’re going to go on to another question Sabrina asks here.

Let me just share this out.

So what would you recommend for production workspace with many reports?

Should they all be used by the same audience or would you create a new audience based on the type of report?

That’s probably your arena, Steve. Yeah, so audiences again, to give an example.

In the production workspace you would create the audiences for, like the last example, specific roles such as account managers, or even at the department level: finance, marketing, etc.

So that when a user logs in or goes to view the app, if they are only supposed to see financial reports, you can hide every report except the financial reports, and even an Excel file.

And a different user, such as someone in marketing, is only able to see marketing reports and not any finance-related reports.

So it really is just at the report level.

You’re not filtering the data going into the reports; you’re simply hitting the show/hide icon at the report level.

I don’t know if that clarified things, but yeah, let’s take a look at some of these other questions. Kayla asked, with regard to sharing workspaces:

Is there a way to prevent sensitive data sets from being shared to unauthorized users?

Yes. This is where you would utilize the endorsements and sensitivity labels.

In the settings of each data set, you’re able to explicitly say whether or not you want end users to be able to export the data, reshare the data set internally within your organization, or just have read-only access.

That’s the in-between setting: you can configure it so that if you want people to see it, they can see it.

But if you only want them to read it and not actually share it externally, or even within the organization, you can prevent that as well.

So that’s how you lock it down and prevent people from being able to export sensitive data, even into Excel.

So that’s within the data set itself.

And I actually took out a slide on this: there are capabilities in Power BI to integrate with Microsoft Purview, which has the ability to create custom labels like private or confidential, additional labels beyond the promoted or certified ones I mentioned in the slide deck.

I didn’t want to get into the whole Purview topic, but if that’s something you’re interested in, there are additional capabilities there that you can use.


Thanks guys.

Another question here from Sabrina.

Are imports using an on-prem gateway still secure when you’re publishing to the web? We’re using embed code.

I’m not sure I understand that part of the question.

But just in terms of security and I think this is also your camp, Steve, so are those.

So publishing to the web and using the embed code, with sources utilizing an on-premises gateway: it’s still secure, but you need to make sure that you are publishing inside a secure portal, if that’s what you want to do, or embedding it in an existing organizational portal.

You just want to make sure that you’re not publishing it to a public forum, essentially, and you don’t want to rely solely on the security settings applied to an on-premises gateway to handle security on the other end, from an end-user point of view.

So you just want to make sure that you’re not embedding it or publishing it to an unsecured portal at the end of the day.


Thanks Steve.

Let’s see here. Brian asks, does Power BI have a subscription component where it can deliver content via e-mail?

So this is a little bit like how bursting is often used in Cognos.

So I think in the past there wasn’t a feature but there may be now.

There is a new component within Fabric that they just recently released, known as Data Activator.

You can utilize that to set up a recurring e-mail report if you want.

It depends on how elaborate you want to get, but in addition, the end users can go in and hit Subscribe in the toolbar at the top of the app.

That will allow them to receive data-triggered alert e-mails, either on a schedule or when a certain metric goes above or below a predefined number.

So there are multiple ways to do it, but at the end of the day without doing anything, the end users can still go in and hit subscribe.

So there are varying levels of complexity.

But yeah, the Subscribe feature, or button, is probably the path of least resistance.


So it’s nice to have that capability in there.

Got a couple of questions here from Penny.

A quick, short one: to turn on sharing of data sets or to disable some of these features, do you have to be an Office 365/MS365, Fabric or Power BI admin?

So, all of those can do it.

Power BI admin is a subset of the Power platform in Office 365.

All three can do it, but all you need is the Power BI/Fabric administrator role; almost all the settings I mentioned today are Fabric administrator type permissions that you can control at that level.

Obviously you can do it at the higher levels as well, as the permissions waterfall upwards.

But yeah, nothing beyond that is needed to do those things.

OK, so the minimum, Todd, you said, is the Power BI or Fabric administrator role, correct?

OK, all right.

Penny also says there seem to be many things that can only be done in Power BI Desktop.

Our data set in the Power BI service is in import mode and is only refreshed daily.

Contributors cannot use Power Query Editor.

Is that because it’s using import mode?

They are doing their best to move everything that can be done in Desktop to the online Power BI service experience.

But it’s really a permissions thing, so in terms of anything that you can do in Desktop, you are correct.

In terms of import mode only being accessible in Desktop: when creating any new reports in the service, you are essentially capped, or held, to utilizing existing DirectQuery/live connections to existing data sets.

So it’s kind of a yes-and-no answer, but to be clear, the import mode feature is only accessible via the Desktop version.

Thank you guys.

A couple of questions about apps.

So Paul asks, is it limited to one app per workspace or can you have multiple apps in the workspace?

So workspace apps are held to a one-to-one workspace-to-app ratio.

So yeah, you can only create one app per workspace.

Yeah right.

And then Benjamin has an app related question also.

So he says we separate data sets and reports in different workspaces.

Do you still recommend apps to control access to the data sets?

I do.

I think if you’re utilizing workspaces to segment the data sets, it’s still beneficial to replicate, or create, a new front-end report within a different workspace that you only use for the end users.

That way you’re still able to use the segmentation of workspaces for different data.

But at the end of the day you’re also able to use a dedicated workspace for the end-user reports, using the data workspaces as the back end and the report workspace as the front end.

So you’re able to leverage the benefits of both. Well, sorry, a small asterisk: cross-workspace utilization of data sets can only be done via the Premium Per User or Premium capacity licenses, just as an FYI.

So I recommend that approach if you have a Premium Per User or a Premium capacity license.

If not, then you’re correct.

I wouldn’t necessarily recommend doing all of that work if you have Pro licenses.

Sorry for the confusion.

That was a good question.

Thanks, Steve.

Brian has a question here that’s near and dear to the hearts of many administrators.

As for the periodic auditing component, are there reports or queries against Power BI metadata that provide insight into how the environment is actually being utilized?

Is any of that readily available to admins?

The metadata of... Sorry, yeah, I think what Brian is asking is, is there a way to see the relative utilization of different pieces of content in your Power BI environment?

Like, can you tell, oh, this query has been executed 100 times in the last month, for auditing purposes?

There is a workspace utilization report that I highly recommend using.

It’s called the usage metrics report, and it gives you end-user, row-level transparency into which users are opening up and viewing certain reports and certain report pages.

If there’s a long-running query that’s bottlenecking the entire workspace or capacity, you’re able to get some insight into that.

And like I mentioned at the end, in terms of it being an ongoing iterative process and not a one-time event, I think utilizing that usage metrics report at the workspace level is very important, and I think it’s best practice to do that.

Thanks, Steve.

All right, we’ve got more and more questions coming here.

So another question from Brian: can you nest workspaces inside of one another to help with organization, and/or is that on the road map?

I know you sometimes have insight into what’s coming on the Microsoft side, Steve, but can you nest workspaces today, and/or do you know if that may be possible in the future?

So there’s nothing on the road map specifically about nesting the workspaces right now.

But like I mentioned a little bit earlier, if you have Premium Per User or Premium capacity licenses, you are able to set up that nesting functionality and cross-workspace utilization of data sets outside of one single workspace.

So you can kind of do it yourself, but if you’re thinking about rearchitecting an existing workspace infrastructure, that’s something we can have a separate conversation about if you’d like.


Doug asks kind of a follow-on question to your earlier comments about auditing, Steve.

He says, when you answered the question about usage auditing, it seemed to pertain to a single workspace. Is it possible to see that kind of data for an entire tenant, or is that workspace specific?

So it’s workspace specific in terms of the utilization report that I was referring to previously.

But there is also a tenant-level option. They need to release a newer version of it, but if you go into the admin portal of your capacity/tenant, there’s an embedded dashboard that shows you similar metrics in terms of utilization and the metadata being used at the user level.

So I think using both of them in combination will give you very good insight into which resources are being used, which are overextended, which are bottlenecked, etc.

And I know they are actually working on an updated version of that embedded capacity/tenant-level utilization report.


Thanks, Steve.

Let’s see what else.

We’re getting low on time here, but we’ll get a couple more questions now.

And I’m going to take another Doug question here.

I feel like we’re playing stump the expert with you, Steve.

So far you’re doing great.

So we’ve got, you mentioned using certified data sets.

It looks like the only way to allow a user to use a data set to develop a report in their workspace is to make them a member of the workspace that contains the actual data set.

How then would you keep them from editing content in that space?

So if you want them to have the freedom to create those reports along with the cross-workspace data sets, you’re kind of in a position where you need to make them an admin in order to really work with those data sets as needed.

But at the same time, I would really map out the roles, access and sensitivity labels so that you can set up row-level security specifically for them if necessary.

So there are a lot of, you know, pulleys and levers that you can kind of set up to make sure that people are only seeing what they should be.

But in terms of the role, there isn’t a specific role, like member, contributor or admin, that will give you exactly what you’re thinking of right out of the gate.

So you really need to work through what certain people should have access to and what you want them to do.

And then you need to tweak all of the row-level security, object-level security, audiences and workspace-level permissions so that they can only see what you want them to see, but can also build what you want them to build.

So it is possible, it’s just you need to really clarify and understand what your requirements are for that user.

And that can also play into deployment pipelines.

You could have a higher-level ability to publish and create content in a development folder that’s not visible to end users, and go through the testing process of promoting it upwards once it’s verified.

But yeah, there’s a lot of kind of complexity to that type of question.

Good question.

All right, thanks guys.

So we’re at the top of the hour.

So I’m going to wrap things up here.

If we didn’t get to the question that you posted in the Q&A panel, we’ll follow up with you after the webinar.

We typically will also post a collection of the answers to the questions we addressed here.

So you can keep an eye out for that.

And with that, I want to thank Todd and Steve for being here today and presenting and answering lots of great questions.

And thank you all for joining us.

It’s always great to have you here on our webinars.

You can always reach out to us.

You can find us at senturus.com, you can e-mail us [email protected].

And Todd, if you just jump to the next slide so people have that info: you can even call us if you feel like picking up the phone.

We’ve got an 800 number.

Always happy to chat with you also.

Kay Knowles has been posting her calendar link into the chat.

So if you don’t see that, you can go ahead and click on that link. If you’d like to talk to Kay, and/or Todd or Steve, we can get some time on the calendar to talk about your Power BI and governance needs.

So with that, thank you all for joining us today and we hope to see you again on a future Senturus webinar, Todd, Steve, thank you again.

Thanks guys.

Questions log

Q: Do the same or similar administrative roles exist when using Power BI Report Server?
A: Power BI Report Server administrative capabilities:

  1. Manages on-premises server infrastructure and security: This includes server installation, configuration and ensuring the security of the on-premises environment.
  2. Handles on-premises data source management: Administrators control and maintain data sources located within the organization’s on-premises network.
  3. Publishes, organizes and maintains on-premises reports: Report authors publish and manage reports exclusively within the on-premises server.
  4. Controls on-premises data access and security: Security administrators define and enforce access controls for reports and data stored locally.

Power BI cloud service (Power BI service) administrative capabilities:

  1.  Manages cloud-based environment, user access and licensing: Service administrators oversee the entire cloud-based Power BI environment, including user management and licensing.
  2. Administers specific cloud workspaces: Workspace administrators have control over individual workspaces within the cloud service, managing content and permissions.
  3. Creates, publishes and maintains cloud-based reports and dashboards: Content creators design, publish and update reports and dashboards directly in the cloud service.
  4. Configures role-level security and data source access in the cloud: Security administrators set up role-level security and manage access to cloud-hosted datasets and reports.

Power BI Report Server handles on-premises server and data management while Power BI Cloud service offers cloud-based administrative capabilities for content creation, security and collaboration.

Q: Can we take advantage of pipelines, if we do the majority of our work in Power BI Desktop?
A: Yes. The development workspace in the pipeline should be the only workspace that is ever published to directly from Desktop. The permission to publish directly should be disabled for the test and production workspaces.

Q: What is the best way to set up dev, QA and prod (which are three workspaces or three separate tenants) in Power BI?
A: Setting up environments for development (dev), quality assurance (QA) and production (prod) in Power BI is a common practice to ensure a smooth development and deployment process. Both the three workspaces and the three separate tenants approaches have their own advantages and disadvantages. Here’s a comparison to help you decide:

  1. Three workspaces (dev, QA, prod):

  Pros:

  1. Simplicity: All work is done within the same tenant, making it easier to manage and navigate.
  2. Cost-effective: No need for multiple Power BI Pro licenses across different tenants.
  3. Easy promotion: Moving content from one workspace to another can be done quickly.
  4. Unified security: All security settings and user permissions are managed within a single tenant.

  Cons:

  • Risk of overwrite: There’s a risk of accidentally publishing a report to the wrong workspace.
  • Shared capacity: If you’re using Power BI Premium, all workspaces will share the same capacity.

Q: How does Power BI secure the database access by using dataflows as opposed to direct access to create datasets?
A: 1. Power BI dataflows:
Dataflows are a form of ETL within the Power BI service that allows you to connect to, transform and load data into the Power BI environment.

    • Managed environment: Dataflows are executed within the managed Power BI service environment. This means Microsoft takes care of the underlying infrastructure, ensuring it’s secure and compliant.
    • Stored in Power BI: After the data is ingested using dataflows, it’s stored in Azure Data Lake Storage Gen2, which provides enterprise-level security features, including encryption at rest.
    • Access control: With dataflows, you can set up more granular access controls. You can determine who can create, edit or view a dataflow, independent of the datasets that are created from that dataflow.
    • Scheduled refresh: Dataflows support scheduled refreshes. This means you can set them up to pull in new data at specific intervals, reducing the need for frequent direct connections to the source systems.
    • Gateway: If you’re connecting to on-premises data sources, you’d use the on-premises data gateway. This gateway ensures encrypted and secure data transfer from on-premises sources to Power BI.
  2. Direct access to create datasets:
    When you directly connect to a data source to create a dataset, you’re often connecting in real-time or near-real-time to the data source.

    • Live connections: Some sources, like SQL Server Analysis Services support live connections. This means that the data stays in the original source and Power BI queries it in real-time. This can be secure if your source system has robust security mechanisms in place.
    • Direct Query: For some relational databases, Power BI supports DirectQuery mode. In this mode, data isn’t imported into Power BI; instead, Power BI sends queries to the data source when a report is viewed.
    • Gateway: Just like with dataflows, if you’re connecting to on-premises data sources directly, you’d use the on-premises data gateway to ensure encrypted and secure data transfer.
    • Stored credentials: For scheduled refresh scenarios, Power BI needs to store credentials to access the data source. These credentials are encrypted and stored securely.
  3. Comparison:
    • Performance: Dataflows might offer better performance for large datasets since the data is pre-processed and stored in an optimized format in Azure Data Lake Storage Gen2.
    • Flexibility: Direct access might be more suitable for real-time analytics needs or when the data source’s native security and business logic (like in SQL Server Analysis Services) need to be used directly.
    • Security granularity: Dataflows offer an additional layer of security granularity since you can control access to the dataflow separately from the datasets and reports that derive from it.

Both methods have their strengths. The choice of which you use often depends on the needs of the project. It’s also worth noting that security in Power BI isn’t just about how data is ingested; it also involves features like row-level security, auditing and compliance tools that Microsoft provides within the Power BI service. 

Q: How do we refresh the Power BI dataset based on database trigger?
A: Refreshing a Power BI dataset based on a database trigger isn’t supported out of the box. However, you can achieve this by combining several components and services. Here’s a high-level process description using Azure services:


  1. Database trigger:
    • Have a trigger set up in your database (like SQL Azure) that detects changes (inserts, updates, deletes, etc.).
    • When the trigger fires, it can write a message to an Azure Service Bus or insert a record in an Azure Queue storage.
  2. Azure Functions:
    • Set up Azure Functions so you can get triggered based on the message in the Azure Service Bus or the Azure Queue storage.
    • Azure Functions call the Power BI API to initiate a dataset refresh.
  3. Power BI API:
    • To programmatically refresh a dataset in Power BI, you’ll use the “Refresh Dataset” endpoint in the Power BI REST API.
    • You’ll need to have an access token (usually obtained via Azure AD) to authenticate and make calls to the Power BI API.
  4. Power BI service:
    • After the API call is received, the Power BI service will initiate the dataset refresh as requested.

Detailed Steps:

  1. Database trigger:
    • In your SQL database, create a trigger that responds to data changes.
    • Use this trigger to send a message to Azure Service Bus or insert into Azure Queue storage.
  2. Azure Service Bus or Azure Queue storage:
    • Set up either service based on your preference. This will serve as the intermediary to notify the Azure Functions.
  3. Azure Functions:
    • Create a new function that gets triggered by the Azure Service Bus or Azure Queue storage.
    • In this function, write code to call the Power BI API. You’ll need the datasetId of your Power BI dataset and an access token for authentication.

Here’s pseudo-code for an Azure Function:

import requests

def azure_function_triggered_by_service_bus():
    dataset_id = "YOUR_DATASET_ID"
    access_token = get_power_bi_access_token()  # Implement this function to get a token from Azure AD

    headers = {
        "Authorization": f"Bearer {access_token}",
        "Content-Type": "application/json"
    }
    url = f"https://api.powerbi.com/v1.0/myorg/datasets/{dataset_id}/refreshes"

    response = requests.post(url, headers=headers)

    if response.status_code == 202:
        print("Refresh initiated successfully.")
    else:
        print("Failed to initiate refresh:", response.text)
  4. Power BI service:
    • Ensure that your dataset in the Power BI service is set up correctly to allow API-based refreshes.
    • Monitor refreshes in the Power BI service to see if they’re being triggered as expected.

Remember to handle errors and exceptions properly in your Azure Functions to account for any issues that might arise during the process. 
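The get_power_bi_access_token() helper referenced in the pseudo-code is left unimplemented. As a hedged sketch (not a definitive implementation), an app-only token can be acquired via the Azure AD client-credentials flow, assuming you have registered a service principal and granted it access to the Power BI REST API; the tenant ID, client ID and client secret below are placeholders you would supply:

```python
import json
import urllib.parse
import urllib.request

TOKEN_URL = "https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
POWER_BI_SCOPE = "https://analysis.windows.net/powerbi/api/.default"

def get_power_bi_access_token(tenant_id, client_id, client_secret):
    # Azure AD client-credentials flow: exchange the app registration's
    # secret for a bearer token scoped to the Power BI REST API.
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": POWER_BI_SCOPE,
    }).encode()
    request = urllib.request.Request(
        TOKEN_URL.format(tenant_id=tenant_id), data=body
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)["access_token"]
```

The returned token is then passed in the Authorization header of the refresh call. Tokens expire (typically after about an hour), so a long-lived process should re-acquire rather than cache indefinitely.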

This setup allows for a near-real-time refresh of Power BI datasets based on changes in the database. It’s essential to be mindful of the refresh limits imposed by Power BI service, especially if you’re using Power BI Pro or shared capacities in Power BI Premium. 
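To monitor whether the trigger-driven refreshes are actually firing, you can poll the “Get Refresh History” endpoint of the same Power BI REST API. A minimal sketch; the dataset ID and access token are placeholders you would supply:

```python
import json
import urllib.parse
import urllib.request

API_BASE = "https://api.powerbi.com/v1.0/myorg"

def refresh_history_url(dataset_id, top=1):
    # Build the 'Get Refresh History' URL; $top limits how many entries return.
    query = urllib.parse.urlencode({"$top": top})
    return f"{API_BASE}/datasets/{dataset_id}/refreshes?{query}"

def get_latest_refreshes(dataset_id, access_token, top=1):
    # Each returned entry includes a status ("Completed", "Failed", etc.)
    # and start/end times, enough to verify the trigger pipeline end to end.
    request = urllib.request.Request(
        refresh_history_url(dataset_id, top),
        headers={"Authorization": f"Bearer {access_token}"},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)["value"]
```

Checking the most recent entry's status after each trigger is a simple way to confirm refreshes are not silently failing or hitting the service's refresh limits.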

Q: You mentioned using certified Power BI datasets. It looks like the only way to allow users to use a dataset to develop a report in their workspace is to make them a member of the workspace that contains the dataset. How do we keep them from editing content in that workspace?
A: If you want users to have the freedom to create those reports along with the cross-workspace data sets, you may need to make them an admin to really work with those datasets as needed. It’s important to map out the roles, access and sensitivity labels so you can set up row-level security for them if necessary. There are a lot of access functions you can set up to make sure users are only seeing what they should be. But in terms of the role, there isn’t a specific role like member, contributor or admin that will give you exactly what you’re asking for. So you need to work through what people should have access to based on what you want them to do. Then you need to tweak all of the row-level security, object-level security, audiences and workspace-level permissions so they can only see what they are authorized to see. While it is possible, you need to clarify and understand what your requirements are for each user.

Q: Here is our scenario. If we create a Power BI dataset to build a new report and publish the report to a new workspace and then an app is created using the new report and dataset and the app is shared with a user, will the user need to have access to the original dataset to view the content in the app?
A: No, the user does not need direct access to the original dataset to view the content in the app. When you create an app in Power BI and share it with users, those users get access to the reports, dashboards and underlying datasets that are part of the app. This is one of the primary advantages of using apps in Power BI, to package content and share it without requiring users to have access to the original workspace or individual datasets.

Here’s a breakdown:

  1. Workspace: This is where you, as a content creator or developer, create, modify and manage Power BI content. Only members of the workspace need direct access to its content, including datasets.
  2. App: When you publish an app from a workspace, you’re packaging up the content (reports, dashboards, datasets) into a shareable unit. This app can then be shared with end-users.
  3. End users: When you share the app with end users, they can install and view the app’s content without needing access to the original workspace or any of its datasets. They will see the data in the reports and dashboards, but they won’t be able to modify the original content or see the workspace where the content was developed.
  4. Permissions: The permissions for viewing data and interacting with reports/dashboards in the app are based on the app’s settings and row-level security you’ve set up on the dataset. End users don’t need direct permissions on the original dataset; they inherit the necessary permissions through the app.

After you share an app with a user, they can view its content without any additional permissions on the original dataset or workspace. 

Q: Can we use security to allow Power BI users to open in Excel with Open in Desktop App yet still enforce not saving copies of the sheets locally? This would be to satisfy traditional local Excel users but encourage central storage and sharing of worksheets.
A: Power BI integrates with Excel through the Analyze in Excel and Export features. If you want to allow users to interact with Power BI datasets using Excel but prevent them from saving the data locally, you will need to consider a combination of Power BI and Excel/Office 365 settings. 

Here’s a breakdown of what can and can’t be done: 

  1. Power BI settings:
    • Analyze in Excel: When enabled, this feature lets users create pivot tables and reports in Excel that connect directly to the Power BI dataset. The data remains in the Power BI service, and Excel pulls it in real time.
    • Export data: A tenant setting lets you block users from exporting summarized or detailed data from visuals.
  2. Excel/Office 365 settings:
    • Information Rights Management (IRM): With IRM, you can restrict permissions to Excel files. You can allow users to view files but not print, forward or save them. However, this requires Azure Rights Management (part of Azure Information Protection) to be set up and integrated with Office 365.
    • Data Loss Prevention (DLP): Office 365 has DLP policies that can prevent sensitive data from being saved to unauthorized locations. This can be useful if you have data classifications that shouldn’t be saved outside of approved locations.
  3. Limitations:
    • If users can interact with data in Excel, they can still potentially copy data manually, even if you prevent them from saving the Excel file.
    • Restricting users from saving can be cumbersome and might affect the user experience. Ensure that users understand why such restrictions are in place.
  4. User training and culture:
    • Technical solutions are just one part of the puzzle. It’s equally important to train users and cultivate a culture of central storage and sharing. Explain the benefits of centralized data management, the risks of local data copies (like outdated data or data breaches) and encourage best practices.

While you can use a combination of Power BI and Office 365 settings to restrict users from saving Excel files locally, it’s not foolproof. It’s essential to complement technical measures with user training and awareness to achieve the desired outcome.

Q: Can we nest Power BI workspaces inside one another to help with organization or is that on the roadmap for Microsoft to include that functionality?
A: There’s nothing on Microsoft’s roadmap about nesting workspaces right now.
However, if you have Premium Per User or Premium capacity licenses, you can achieve similar organization through cross-workspace use of datasets, so content isn’t confined to a single workspace. If you’re thinking about rearchitecting an existing workspace infrastructure, that’s something we can discuss in a separate conversation; contact us. 

Q: How can I see usage auditing data for my entire tenant?
A: In the admin portal for your capacity/tenant there’s an embedded dashboard that shows utilization metrics and the metadata being used at the user level. Combining those metrics with individual user data gives you good insight into which resources are being used, which are overextended and which are bottlenecked. Microsoft is working on an updated version of this embedded capacity/tenant-level utilization report. 
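Tenant-wide usage can also be pulled programmatically from the admin activity log via the Power BI REST API (`GET /v1.0/myorg/admin/activityevents`), which requires a Power BI admin access token. The API expects `startDateTime` and `endDateTime` as quoted ISO-8601 values within the same UTC day. This sketch only assembles the request URL; token acquisition and the actual HTTP call are omitted:

```python
# Sketch: build the Power BI admin activity-log request URL.
# The REST API requires startDateTime and endDateTime to be quoted
# ISO-8601 values falling within the same UTC day. Authentication
# and the HTTP GET itself (e.g. with 'requests') are omitted here.

BASE = "https://api.powerbi.com/v1.0/myorg/admin/activityevents"

def activity_events_url(day: str) -> str:
    """Return the activity-events URL covering one UTC day (YYYY-MM-DD)."""
    start = f"'{day}T00:00:00Z'"
    end = f"'{day}T23:59:59Z'"
    return f"{BASE}?startDateTime={start}&endDateTime={end}"
```

Responses are paged via a continuation token, so a full export loops one day at a time and follows each page until the token is empty.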

Q: I want to create a Power BI certified dataset that provides access to a data mart that has no sensitive data. I want any user to be able to create their own report. I don’t want any user to be able to edit content in the workspace that contains the dataset. How is this done?
A: While this is a little tricky, it can certainly be accomplished by following these step-by-step instructions:

  1. Create the dataset:
    1. In your Power BI Desktop, connect to your data mart and build your dataset.
    2. Publish the dataset to a dedicated workspace in Power BI service. This workspace will hold the certified dataset.
  2. Certify the dataset:
    Certification must first be enabled by a Power BI admin, who also designates which users or groups are allowed to certify content.

    1. In the Power BI service, go to the workspace where you published the dataset.
    2. Click more options (the three dots) next to the dataset, then choose Settings.
    3. In the Endorsement section, first promote the dataset. After promotion, you’ll have the option to certify the dataset. (Certification is restricted to users your Power BI admin has authorized.)
  3. Adjust Workspace permissions:
    To ensure that users can’t edit content in the workspace but can use the dataset, you’ll set permissions accordingly:

    1. In the Power BI service, navigate to the workspace containing your dataset.
    2. Click Access in the workspace menu.
    3. Add users or security groups and assign them the Viewer role. The Viewer role allows users to see content in the workspace but not edit or add anything. They can also connect to and create reports from datasets in the workspace (you may additionally need to grant them Build permission on the dataset).
  4. Allow users to create their own reports:
    1. After the dataset is certified, users across the organization can discover it in the Datasets hub in Power BI service.
    2. Users can connect to this dataset and create their own reports either directly in Power BI service by choosing Get Data > Power BI datasets or in Power BI Desktop by selecting Get Data > Power Platform > Power BI datasets.
    3. When users create their own reports using this dataset, they’ll be doing so in their My Workspace or any other workspace they have permissions to edit. They won’t be creating or modifying content in the workspace that houses the certified dataset.

By following these steps, you’ll have a certified dataset in a controlled workspace, while still allowing users the flexibility to create their own reports using that dataset.
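Step 3 above can also be scripted through the Power BI REST API (Groups - Add Group User, `POST /v1.0/myorg/groups/{workspaceId}/users`). This sketch just assembles the JSON request body that assigns the Viewer role; the email address is hypothetical and authentication plus the HTTP POST itself are omitted:

```python
# Sketch: build the request body for the "Add Group User" REST call,
# assigning the Viewer role so the user can see (but not edit) the
# workspace. The email is a hypothetical example; auth and the POST
# request are not shown.
import json

VALID_ROLES = {"Admin", "Member", "Contributor", "Viewer"}

def workspace_user_payload(email: str, role: str = "Viewer") -> str:
    """Return the JSON body for POST /v1.0/myorg/groups/{id}/users."""
    if role not in VALID_ROLES:
        raise ValueError(f"unknown workspace role: {role}")
    return json.dumps({
        "emailAddress": email,          # hypothetical user
        "groupUserAccessRight": role,   # Viewer = read-only access
    })
```

Validating the role name before sending the request surfaces typos locally instead of as an API error.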

Connect with Senturus

Sign up to be notified about our upcoming events
