
How to Successfully Implement Self-Service Analytics

September 24, 2020

Agile, governed self-service BI with a focus on Cognos Analytics

Learn what it takes to achieve the powerful reality of agile, governed self-service analytics with any BI tool, and with Cognos Analytics specifically. Expensive self-service BI implementations often serve as nothing more than a simple data extract tool that eventually feeds downstream processes into Excel. Whether you’re running Cognos, Power BI, Tableau or a combination, watch this on-demand webinar to get valuable information for achieving well-adopted self-service that delivers exceptional ROI.

Topics we address about self-service analytics

  • What to expect if self-service analytics lives up to its promise
  • Best practices for achieving the goal of self-service
  • Self-service requirements: discovery vs. gathering
  • The value of the semantic layer/business logic to enabling self-service
  • Pros and cons of the three main types of Cognos architecture
    • IT-driven enterprise model
    • End-user driven model
    • Hybrid model
  • Cognos self-service components
▼ PRESENTER

Pedro Ining
Principal BI Analytics Architect
Senturus, Inc.

Pedro joined Senturus in 2010 and brings over 20 years of BI and data warehousing experience to his role. He’s been instrumental in implementing data warehousing systems from scratch and has experienced the evolution of the BI industry through several iterations of BI products including Cognos, MicroStrategy and Tableau.

▼ TECHNOLOGIES

Cognos Analytics
Cognos Data Modules
Cognos Framework Manager

▼ MACHINE TRANSCRIPT

0:06
Greetings and welcome to this latest installment of the Senturus knowledge series. Today, we’re excited to be presenting on the topic of how to successfully implement self-service analytics.

0:20
Before we get into the presentation today, a few housekeeping items. You can use the GoToWebinar control panel to help make the session interactive. While we have everyone’s microphones muted out of courtesy to our presenter, we encourage you to enter questions through the question pane in the control panel.

0:40
And while we generally try to respond to your questions while the webinar is in progress or at the end, if for some reason we’re unable to reply to a question during the webinar, we’ll cover it in a written response document that we will post on senturus.com.

0:57
And speaking of Senturus.com, head on over there to obtain today’s presentation.

1:02
We always get that question early and often throughout the webinar. You can get it at senturus.com/resources, or you can go to senturus.com, select the Resources tab, and look at the Resource Library. Alternatively, the link has been placed in the GoToWebinar control panel and you can access it there. While you’re there, be sure to bookmark it, as it has tons of valuable content addressing a wide variety of business analytics topics.

1:31
Some introductions. Today, I’m pleased to be joined by my esteemed colleague, Pedro….

1:36
Pedro has been with Senturus for a decade now and brings over 20 years of BI and data warehousing experience to his role here.

1:44
He’s been instrumental in implementing data warehousing systems from scratch and has experienced the evolution of the BI industry through several iterations of BI products, including Cognos, MicroStrategy, and Tableau.

1:57
And he’s a newly minted MCSA on the Microsoft stack, so congratulations on that, Pedro. We have a quick poll here.

2:07
We always like to get some input from our audience. Today’s poll is: what percentage of your Cognos platform is used solely for traditional canned reporting? Pretty obvious choices here; estimate to the best of your ability: 25%, 50%, 75%, 100%, don’t have Cognos, or have absolutely no idea.

2:34
So go ahead and get your answers in there, and we’ll give people a few seconds to mull this over and get their answers in.

2:44
Pretty easy question. We’ve got about two-thirds of you responding so far.

2:48
Give it a few more seconds.

2:53
Seems to have flattened out here, so I’ll close it out and share it. So — sorry, this is a tough one to read back to people — about 45% are using it 75% for traditional canned reporting. Then you can see another 20% or so use it about half for that, 15% or so use it solely for that, and then 9% only 25%. Interesting. So thank you very much for your insights there. And with that, we’re going to get into the core of the presentation, so I will hand the microphone and the floor over to Pedro. Pedro, take it away.

3:39
Thank you, Mike. That was an interesting poll — a majority of existing Cognos implementations are still largely used for canned reporting. So we’ll weave that in. What we’re going to talk about today is self-service BI. It seems like from the dawn of BI, when we started talking about information management and things like that way back in the nineties,

4:04
sometime around that time, self-service has always been kind of a goal. And in the last few years, there’s been explosive growth in new, modern BI tools.

4:15
We have some of them over here: Qlik, Power BI, Tableau.

4:19
And, of course, there’s IBM Cognos Analytics, and Cognos Analytics has been in kind of a unique spot: it’s been seen as a legacy product for a while.

4:27
It’s been around for a long time, and it’s not necessarily associated with modern self-service BI the way Tableau and Power BI are. We’re going to talk about that later in the presentation.

4:39
But we’re going to talk about self-service BI first. We’re gonna kind of pull ourselves away from the technology a little bit.

4:46
Let’s talk about, what is self-service BI? I’m sure a lot of you out there can come up with a definition, and there’s probably a lot of them, a lot of definitions out there. I’m gonna pull one out here on the screen, and we can kinda drill into this.

4:59
So I have a definition here: self-service BI allows business users to access more data sources on their own, model their own data,

5:11
and create reports or dashboard visualizations with very little help from IT.

5:19
The goal, one of the promises, is that it can lead to faster and more agile data analytics compared with traditional BI development.

5:26
So if you take that first part of the definition — allowing users to access more data on their own and model their own data — under traditional BI development, that was pretty much something a lot of tools didn’t really want you to do.

5:40
It was more of an IT report factory: meet with us, we’ll model the data for you, wait a few months, we’ll get back to you.

5:49
So the modern tools have kind of flipped that script a bit and said, you know, we’ll give you the tool, and you can have everything on your own. Do it your way.

5:59
OK, that, supposedly, is going to lead to that faster and more agile data analytics environment compared with the old method. So that’s the promise of self-service BI.

6:11
It’s basically saying, yeah, let’s give the business users direct access to data, and as long as we give them access to data, we’ll get all of our report burden off of IT — we don’t have to do anything anymore.

6:23
Give them access to tables. Give them access to the data warehouse.

6:26
Yeah. That’s the promise.

6:28
Then we’ll have better decision making, it will happen more quickly, and we want it to drive business growth. Ultimately, all BI systems want to be able to drive business growth more quickly, so that we can respond out there. And with the advent of big data, that’s even more important.

6:47
And another promise is that these new modern BI tools, they’re gonna give you more visual, more insightful, more automated analysis.

6:56
You know, the users themselves can create and iterate on their own analytic reports and dashboards, not IT. But then, we have to ask a question.

7:08
Then why do so many self-service BI efforts go wrong? How many of you out there have been in supposed self-service implementations? How many of you have gone out and bought 200 licenses of Tableau? People install it, they don’t really use it — maybe they use it just for a little bit. How many of you out there have tried to put Cognos out there, and it just becomes more of an extract tool than anything else, versus self-service BI? Probably the ultimate self-service BI tool is still Excel, even to this day.

7:39
So what are the issues? What are the most common misconceptions when it comes to self-service BI?

7:45
How about this one? So, IT is going to buy the tool.

7:49
We’re going to install it, you know, we’re going to build it up, and from a platform perspective, we’re done.

7:55
We’re going to make that data available and then give it to the end users. But what inevitably happens is, users will get to a certain point.

8:06
You’re going to hit those roadblocks.

8:09
And they’ve had minimal training, maybe.

8:11
And the modeling, the semantic layer of that, of the tool, only serves up the basic raw data to the point where these users throw up their hands and they can’t use it, OK?

8:21
Another scenario where BI tools are in place: they’re used as an extract tool, and the rest of it gets done in Excel. Another misconception: self-service eliminates the need for IT, or at least minimizes it.

8:35
But ultimately, there’s complex data out there, and it still needs to be modeled. Simply loading up Power BI or Tableau — or Cognos — and pointing it at a data source doesn’t remove the need for IT, or an administrator,

8:51
or somebody working with the business to help model that data in a way that lets the BI tool act like a doorway to that IT-controlled data through a nice semantic layer.

9:07
Now, there are maybe some users out there who know that backend data very well, but they’re probably in the minority. The majority of users are going to need a properly modeled metadata layer to make this stuff work.

9:26
Another misconception.

9:28
We’ve got a brand new BI tool, we’re spending a lot of money on it, so it’s going to be a successful project.

9:33
Ultimately, though, again, your foundational data layer, your foundational data governance, your foundational business processes around self-service analytics have to be strong in order for that tool to work.

9:48
And another common misconception.

9:51
Users will automatically understand how to use the tool. Change management and user training are critical. There are a lot of very smart people out there.

10:00
They’ll get to a certain point, but they’re always going to need help, and ultimately it goes right back to the data — it’s a data issue. So these are some common misconceptions.

10:09
So, how do we focus?

10:12
Where do we focus our efforts to help with implementing a self-service BI platform? Here are some ideas we’re going to walk through. We’re going to walk through some best practices for this area.

10:29
We’re going to talk a little bit more about requirements discovery instead of gathering, and we’re going to focus a little more on Cognos self-service architectures. This is where we get into the tool a bit — it’s more slanted toward Cognos — but I think you could actually extrapolate some of it to whatever tool you’re thinking about implementing.

10:50
And maybe you can give it a second thought if you’re thinking about moving away from Cognos to other tools, because a lot of things have changed in that area.

11:01
OK, so let’s talk first about some best practices for self-service BI. I’m sure there are more — other people have thought about other things — but let’s talk about these first.

11:11
Data governance. Continued use of a very good data governance process for existing data warehouses and sources is very important. Data warehouses are not going to go away simply because users can just take a tool and point it at the source of the data.

11:29
In fact, having a user install a tool and point it at your ERP system’s customer tables is probably a recipe for disaster.

11:39
Data warehouses have matured quite a bit over the years, to the point where all the cleansing and all the QA has been done and people have signed off on things like: this is our governed customer dimension, these are our products, this is what you want to look at. That curation of key enterprise data is going to be a critical component of continued data quality.

12:03
So when you start implementing self-service BI and start installing these tools, having data governance, having a catalog, and having those users understand where the proper data sources are is very important.

12:15
Now, we’ve done other webinars at Senturus. Another webinar on this topic was recently shown; it’s on our website and is called Why Bother with Data Governance? I’d suggest you take a look at that, because it’s going to be a very important piece of your self-service implementation.

12:33
Requirements discovery. We’ll talk about this in a little more detail, but we need requirements gathering techniques that go beyond the typical: what do you need, what kind of reports do you need?

12:47
You know, we need to transform that into more of a discovery process.

12:52
Getting more insight from your meetings allows you to create better self-service.

12:57
We’ll drill on that in a minute here.

12:59
Semantic layers. Semantic metadata — a data layer that does not overburden the users. A lot of effort needs to be made in that area; that’s not going to go away. And again, just dropping the tool on a bunch of tables for users who don’t know what those tables do, or how to join them, is going to lead to a not very successful implementation.

13:24
But at the same time, simply serving up a very nice metadata layer which just has dimensions and maybe a core metric, like sales,

13:33
doesn’t get you to the next step, because of what we end up with in requirements sessions.

13:39
The users say, I need to see sales by customer, by product, by location. You do that, you think you’re done. But we don’t know that the users need to do other things, like a year-to-date or year-over-year comparison, OK?

13:51
They bring up that semantic layer in a nice tool, and they can do sales for the current year — but how do I compare it to last year?

13:59
I don’t know how to do that, so I’m going to go ahead and extract the data, put it out to Excel, and do it that way. OK, so they throw up their hands, they give up, and they do what’s more comfortable.

14:12
So, and also, another aspect of the Semantic layer.

14:17
We need the BI tool to allow users to modify and augment data with local data. All modern BI tools do this now, and Cognos does this — make sure users understand how to do it, because a lot of data sources will not have everything: your ERP systems, your databases, your data warehouses.

14:37
They have a lot of data, but there could be things that users have on their desktop: spreadsheets with demographic data they’ve downloaded from another site, geographic information, census data — things that you can’t put in your data warehouse really quickly.

14:51
And they need the agile capability to integrate that into the semantic layer on their own, without waiting for IT to do it.
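To make that blending step concrete outside any one BI tool, here is a minimal Python/pandas sketch. The file names and columns are made up for illustration; in Cognos, Tableau, or Power BI the same join would be done through the tool’s own upload and relationship features rather than code.

```python
import pandas as pd

# Curated sales from the governed warehouse (hypothetical file and columns)
sales = pd.read_csv("warehouse_sales_extract.csv")          # region, year, sales

# Local data the user keeps on their desktop and can't wait for IT to load
demographics = pd.read_excel("regional_demographics.xlsx")  # region, population

# Blend the two so sales can be analyzed per capita
blended = sales.merge(demographics, on="region", how="left")
blended["sales_per_capita"] = blended["sales"] / blended["population"]

print(blended.head())
```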

14:58
Training — very important, that’s critical. A blanket training approach,

15:06
just putting out a bunch of videos on how to do report writing, may not necessarily satisfy the needs. We’re at a client right now where we’re doing a lot of coaching and co-development sessions.

15:19
It really is an aspect of the BI training, and also: what are their issues with the existing analytical process? So there’s a technical side, and there’s understanding and helping them shift their thinking — culturally shifting the thinking from just telling IT what kind of reports they need to talking about what the goal is, and we’ll talk about that, too. Performance tuning is also very important.

15:47
Obviously, if your system is not up to snuff and reports take a long time to run, people are going to think of it as a failure.

15:58
So we have to make sure that IT has the infrastructure and the platform capabilities to monitor the utilization of BI.

16:05
For all the artifacts, we have to have a very good usage analytics system to analyze who’s using what and who’s not. We’re always focused on who’s using the system, but also: who’s not using the system?

16:18
And you might find out that key business sponsors are not using the system; you can intercept that particular issue and interview those folks to understand the reasons why. Now, that last bullet there — recognize that not all areas are suited for self-service — seems kind of counter to what we’re trying to do.

16:36
But you have to recognize that there are going to be areas in your process and your business where people still need professionally built IT reports — things like regulatory filings, very complicated reports that have to be in a certain format. So a percentage of your system, if you want to think of it that way, is still a little bit of a report factory.

16:59
That’s always going to be there: professionally developed dashboards and reports that have to go out and have to look a specific way.

17:06
So go into your self-service implementation not thinking that everything is going to be 100% self-service. There are going to be areas that are not suited for it. Just realize that.

17:18
So requirements. Let’s talk about that a little bit.

17:22
Discovery versus gathering. We always talk about requirements gathering, you know, in the sense that we end up at our user sessions, and we’re the BI professionals, and they’re the business.

17:36
They’re waiting for us to go ahead and interview them, and they’re eager for a BI system that’ll help them satisfy their needs.

17:42
But we always tend to gather requirements in the sense of: what kind of reports do you need? Can you give me samples of those reports? What kind of measures do you need? Where’s the data you want me to get? Do you need dashboards? And then we kind of get into the specifics of what the tables are.

18:00
This leads us, as BI professionals, into a more comfortable discussion of tables, fields, and reports, and it’s really stemming from the base question: what do you want, or what do you want me to do?

18:16
So we need to morph our requirements gathering techniques into something that will better understand the client’s goals.

18:25
You know, we need to be able to understand a client’s business process or workflow.

18:32
We need to be able to discuss existing pain points.

18:37
Then, as part of this, we can consolidate it with a user story. This is a tool, in a sense, to try to put all this together and not talk so much about the BI tool or the implementation itself.

18:50
So, determining and understanding the client’s goal — if we talk about this first, this discussion is independent of the BI tool’s features.

19:00
Oftentimes, the end user’s goal is in lockstep with their job responsibilities.

19:08
For example, you might have a discussion with a C-level executive. He wants to see a current snapshot of FTE headcount, how it compares to last year, what areas are growing. A staffing manager wants to see headcount by company location and which location is experiencing the most growth.

19:28
These are goals — they’re not talking about fields and reports, they’re telling you what their goals are. A director

19:36
would also like to analyze the current workforce, but they want to see a breakout of exempt versus non-exempt.

19:41
Maybe discover what parts of the organization are experiencing the most attrition.

19:49
So those are some of the goals to the right where we’re trying to get to, but then you start drilling in even more.

19:55
And we start thinking about, you know, a user’s business process and workflow.

20:00
There’s a workflow that usually starts with some sort of triggering event from the left.

20:06
What prompts the user to seek more information to seek a goal?

20:12
That could be anything. Maybe it’s a scheduled regulatory filing, and he has to start doing the filing and has a certain set of tasks to do that. Or a metric comes out of line and he gets a phone call from his management team.

20:25
And he has a series of tasks that he has to do.

20:29
Maybe he has to run five reports. Maybe some of those reports are missing. He comes to you because he’s in the middle of this workflow process.

20:40
And you come to him from a self-service perspective, and we still focus on the tasks and reports. In the workflow, what are the steps that contribute to that decision making?

20:51
And, then, what are the friction points?

20:55
If you actually go through this, you’ll probably find that the current process might be too reliant on canned reports.

21:02
Could self-service analytics help with answering questions that satisfy the tasks, satisfy the sequence of steps, and ultimately get to the goal for the end user?

21:16
As you discuss that current workflow, you’ll see the issues that people are going to bring up.

21:23
So if we get away from “what kind of reports do you need and what should they look like” and start talking through things in this manner, you’ll find out what those issues are.

21:32
I’ll give you some examples over here. Users don’t trust the data — that’s a very common comment when we go into a requirements discovery session. They’ll say, you know, that data warehouse is not really correct, or these reports aren’t correct.

21:49
We want to find out what those data quality issues are, because that’s going to influence your self-service implementation. Are they actually data issues, or maybe just simple report logic errors? We’ve had examples where a user is having issues with data, they run a report and say, look, the numbers are not right; you drill into it, and somebody did a wrong calculation in the report. They’ve been using this report for X number of years, and they know it’s wrong, but they just fix it themselves, OK? It could be something as simple as that. Or it could be data quality issues upstream.

22:21
The quote “users don’t trust the data” can imply a lot of different things.

22:26
And as you go through that workflow, that might be one of the issues that they actually bring up.

22:32
How about another one? It takes a long time to get the answer.

22:36
Could be a simple performance issue.

22:39
That could be resolved with database tuning, but maybe you didn’t know about it until you go through the workflow process. But it could also be, and this is an interesting perspective, a result of a more complicated end user Data prep activity.

22:52
So when they say it takes a long time to get the answer, and you run the reports or do the analytics in the self-service environment and say, well, no, it only takes me a minute to get the answer —

23:02
as you expand upon that, the end user will say, well, yeah, but I have to take that down to my environment, take it down to Excel, run four other reports over here in the BI system, and then I have to integrate those four together. And I have to integrate that with data sources from my demographics file on my desktop before I finally get the answer. OK. And you didn’t know that because you didn’t ask the right questions.

23:28
And that could all be implemented upstream.

23:33
I’m going to focus again on a very simple example: the user comes to you and says, well, I can’t do a simple year-to-date calculation in Cognos. It’s very specific, but it points to a general problem with the semantic layer. We need to put those things, like YTD or same month last year, into the semantic layer. Don’t leave it to the end user. Find out what those calculations are, create them, and embed them into the semantic layer, so users don’t just throw up their hands trying to figure out how to do something they can’t do, OK?
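As a rough illustration of what “put it in the semantic layer” means, here is the same-month-last-year and YTD logic expressed in Python/pandas. The table and values are hypothetical; in Cognos this would live in a Framework Manager calculation or a data module relative-time setting, not user-written code.

```python
import pandas as pd

# Hypothetical monthly sales fact: one row per month, illustrative numbers only
sales = pd.DataFrame({
    "month": pd.period_range("2019-01", "2020-09", freq="M"),
    "sales": range(100, 121),
})

# Pre-build the relative-time comparisons once, centrally, so end users
# never have to derive "same month last year" or year-to-date themselves
sales["same_month_last_year"] = sales["sales"].shift(12)
sales["yoy_change_pct"] = (
    (sales["sales"] - sales["same_month_last_year"]) / sales["same_month_last_year"] * 100
)
sales["ytd_sales"] = sales.groupby(sales["month"].dt.year)["sales"].cumsum()

print(sales.tail())
```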

24:09
Another aspect of it — something in your toolbox that you can use as you try to get requirements — is to create a user story based on your findings. We’ve gone through all that, we’ve done our best practices, we’ve tried to understand a person’s workflow.

24:24
Type it all out in a nice little set of paragraphs over here, and you can actually pull some nice little nuggets out of what you’ve written.

24:31
So for example, in what I’ve written over here — this is again an FTE headcount type of example — Joel is an HR analyst. As you type this out, you see he periodically reviews the current workforce: where it currently stands, how the FTE headcount compares to the same period last year. So right there, I pull that out: how does it compare to the same period last year? Joel’s going to want to do that.

24:56
And that’s a clue to me that I need to provide that type of calculation, that type of processing, in a nice, easy fashion in a self-service metadata layer. He also wants to break it down by organization and location, and see tenure. He needs to figure out tenure, and I’m serving up the FTE

25:15
headcount. How is he going to calculate tenure? I need a start date for the employee, but how can he figure that out? Do I have to make him subtract the current date from the start date and do the average? I could put that in there for him, in the semantic layer. Then you get to the second paragraph. As part of what your findings show in terms of issues: yeah, we gave him a Cognos package, and he sat down with our IT folks, and they showed him how to work the package, how to query the package.

25:44
He knows just enough — he gets to the point where he can extract data for reporting in Excel, because he’s never been able to calculate that change in FTE headcount year over year.

25:56
And he finds it easier just to extract the data to Excel for final reporting. Thirdly, in the next paragraph, we write out that he needs to integrate other datasets into his final reporting.

26:06
So as you put it all together and write it out, a lot of things just pop out that really help you in discovering what your self-service BI implementation needs to be — from the soft side of things, OK.
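The tenure item from that user story is the same pattern: compute it once rather than making Joel subtract dates himself. A minimal sketch, in pandas with a made-up employee table, of the kind of derived field you would bake into the metadata layer:

```python
import pandas as pd

# Hypothetical employee dimension with a start date per employee
employees = pd.DataFrame({
    "employee": ["Joel", "Ana", "Priya"],
    "start_date": pd.to_datetime(["2018-07-15", "2012-03-01", "2020-01-06"]),
})

# Tenure in years, computed once in the semantic layer instead of by each end user
as_of = pd.Timestamp("2020-09-24")
employees["tenure_years"] = ((as_of - employees["start_date"]).dt.days / 365.25).round(1)

print(employees)
```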

26:22
Now, that was focused on being independent of the tool — these aspects of implementation need to be kind of independent from the tool. I’m going to pivot a little bit, and we’re going to talk about Cognos self-service architectures. A lot of this still applies, maybe even to your Tableau and Power BI, but we’re going to drill into what Cognos

26:42
can help you with nowadays in terms of self-service: Cognos 11.1, the latest release, 11.1 release 7.

26:53
There are some key self-service enablers, and Cognos has always been thought of maybe more as a reporting tool.

27:00
The polling results kind of show that it’s mostly used just as a reporting tool, and people don’t realize there are a lot of nice new self-service enablement features in Cognos 11. I’ve put FM packages up there because they’re going to be out there — classic, still useful.

27:20
If you have a legacy Cognos implementation and you move to 11.1, you’ve got hundreds of these packages out there. They can still be used for self-service in different architectures — we’ll go through that a little bit. It’s classic, still useful. But now we’re moving on to a more user-based modeling tool called data modules.

27:40
We’ll show the links later, but we’ve done a lot of webinars on each one of these topics that let you drill into how it helps you. Data modules are the newest modeling tool for end users.

27:52
It’s much like Tableau and Power BI: what’s the first thing Tableau does when you point it at a data source? It asks you to bring in some tables and join them, do all that stuff, right? This is what a data module does for the end user.

28:06
They don’t use FM Modeler to do that, create packages. They use that tool.

28:12
Datasets.

28:14
Power BI and Tableau also have a concept of data extracts — Tableau data extracts; Power BI, I think, even uses the word datasets. It’s the ability for users to extract their own subsets of data from your source system so that they perform much faster in the BI tool, versus running queries back and forth.

28:33
OK, dashboards.

28:36
The visualization tool is getting easier and better with each release. I would still say — and people out there will probably agree with me — Tableau is the best visualization tool out there.

28:47
That’s where it’s made its bread and butter.

28:50
But the Cognos dashboards, especially with the latest set of releases, have gotten to a point where they’re almost on par — not quite, but almost there; definitely getting there. And if you’ve already got a Cognos implementation, not taking a look at that and just diving into moving everything over to Tableau — you’re kind of shooting yourself in the foot from that perspective, depending on what you want to get done. And, of course, reporting — this has always been the workhorse of any analytical tool set, right?

29:21
People are always going to want to write reports. Tableau was never really meant as a report writing tool; Power BI has SSRS, so they’re kind of there in that respect. But Cognos has always been a great reporting tool. It could be as simple as a list report

29:44
to very complicated, highly formatted, pixel-perfect reporting. All these components make up, in a sense, the self-service enablers of the BI platform, and it would behoove you to look at your existing Cognos implementation as something you can start rolling out and letting users use for that. So how do we do that from a Cognos perspective? I’ve got some models here that I’m thinking about from a Cognos perspective.

30:13
Here’s the classic one: the IT-driven enterprise model. This is our classic approach, right? We have source databases.

30:22
Instead of discovering requirements, we gather requirements. We ask them what kind of reports they need.

30:28
We find out where the data sources are, we develop that FM model, we create a package, and maybe that package is very specifically optimized for Cognos report development.

30:38
You create X number of reports for your end users and they’re on their way. Maybe that package itself is optimized for self-service.

30:50
Or maybe you create one Cognos package that’s optimized for professional report

30:54
developers, and you create another Cognos package that’s used for self-service. Yeah.

31:01
It’s all there. You can use an FM package for self-service if you do the things we were talking about: creating built-in calculations, pre-built — you’ve done your homework.

31:11
It’s their classic model. It’s all out there.

31:14
Now, that was, that’s all IT driven. All that stuff is owned by IT.

31:20
Data modules — here’s another approach that’s kind of similar.

31:24
It’s IT-driven with Cognos data modules: you kind of replace the FM modeling tool and package with a model that’s created by IT, but it’s a data module. It’s kind of like your first step in trying to move away from FM.

31:38
If you’ve done the homework, the current feature set of data modules will be able to replace that feature set in FM. That’s a different topic altogether — I’d advise you to take a look at one of our webinars, which compares the two products.

31:53
Again, it’s more IT-centric, but you’ve moved the modeling development into the data module world now, which gives you the benefits of built-in relative time,

32:05
data cleansing, navigation paths, and — one important aspect of it — people can augment this with their own external data.

32:17
Your self-service analytics, folks can literally make a link to this data module, and then integrate other Excel sources into it for their own self-service data modeling components.

32:31
On the other end of the spectrum now is what I’m calling the end-user driven model in Cognos.

32:39
This allows your self-service analytics professionals to really do their own modeling, to do their own semantic layers. I make this akin to Tableau, where the first thing you’re asked to do is point it at a database, a data source.

32:59
We create the data server connection out there in Cognos. We have a bunch of power users who want to do that: they open up a data module, connect to the database, join tables together, and do that, OK? They produce their own data module. So the only thing IT is doing here is making sure those databases are up — and that could be a data warehouse.

33:19
We have clients out there who have a lot of old legacy applications — have you ever heard of PowerBuilder and things like that? ERP systems that were built as custom Visual Basic systems, and they’ve been around for 20 years, and they’ve got hundreds of tables in there, and analysts have been working on those systems. They write manual SQL queries against those particular systems, but they would love to get in there and have a BI tool go against that. Cognos can do that. You can go ahead and create that connection for those particular people.

33:53
So all IT is doing is supporting the platform and the databases. The end user is driving the model; they create their own datasets, another subset of users can take datasets from that data module and integrate them, and then also integrate their own uploaded files.

34:11
I mean, the one drawback of this model, I would say, is data consistency.

34:18
So if you open this model up to folks and they point to the OLTP system, which has the customer tables.

34:26
And there’s a data warehouse with customer tables that have already been cleansed, with business rules put in there.

34:31
And for some reason, this analyst gets access to that and says, this is my customer list, while another analyst uses the data warehouse customer list, and the two don’t match — you’re going to have that issue. So it’s another area where data governance processes have to be there from a foundational perspective in order for this to work.

34:52
Finally, I’ve got this hybrid model, and I think for a lot of Cognos implementations this might be the Goldilocks model, in a sense. The FM packages, like I was saying, have been out there for a while in large, legacy Cognos organizations. This model is a very appealing choice because, over the years, many FM packages have been developed, and they’re very stable.

35:20
And they have great data integrity, and they have the blessing of the central IT organization — they’re IT-maintained, OK?

35:28
So, we could integrate those packages together, and we can still give people the, the capability to do their own data modeling.

35:35
Because you can stick that data module out there, you can bring in tables from the FM packages, and you can extract datasets from FM packages.

35:45
So, for example, you could extract your customer list or your product list from an IT-maintained FM package, which points at the data warehouse, as a dataset, and integrate that into my own data module, which is going against some other system — it could be the OLTP system. I need a cleansed product list or customer dimension, and I’ll do the integration myself.

36:05
OK, and then, I’ll also integrate uploaded files, as well.

36:09
So, that’s some of the models out there that I wanted to bring up. Now, I’m going to go to a little demo.

36:15
I’m going to tab over here to my Cognos instance over here.

36:21
What I want to show you are just some aspects of how a semantic layer helps the implementation. I’ve got Cognos 11.1 release 7 here, so if it looks a little different — if you haven’t gone to release 7 — this is the interface now. We’ve got some cleaned-up icons over here and a new design interface that they’re implementing. I’m going to go here to my content. I have a data module over here.

36:46
Let’s just go ahead and create a dashboard of this thing.

36:50
Pick the blank one over here. OK, here’s my metadata layer.

36:54
I’m doing self-service, and it looks fairly clean. I’ve got my sales, the sales location, and product

37:03
area over here. And under measures, I’ve got my sales over here.

37:10
That’s my total sales. I can come up with a nice summary over here.

37:15
And I can go over here and, for example, filter on year. Pedro, we have a request — can you go full screen on that, so it’s a little better? Full screen — is that better? That’s great. OK, no problem. So then I’ve got current year and, boom, I’ve done my analytics: current year sales.

37:34
What’s the next step that they want? And this is kind of hearkening back to what we were just talking about from a self-service perspective. I’ve basically served up my sales — I thought I’d done a great job from an IT perspective. I’ve served up my measures, I’ve served up my date dimension.

37:52
Could slice by products, by location.

37:55
The next thing a person is going to want to do from a self-service perspective is: I want to compare the current month’s sales to the same month last year.

38:06
So I’ve done all the classes, I’ve looked at this, I’ve got this, I can slice it — but then, how do I do that? You know? And a lot of times people get to that point where: never mind, I’ll download this to Excel, and then I’ll create a dashboard in Excel. So let me go ahead and delete this.

38:25
What I want to expose — and we have other webinars that go into the tips and tricks — this is really an attempt to show why those tips and tricks are so important.

38:35
Under here, I’ve enabled the relative time feature of the data module.

38:43
I mentioned that briefly: we can enable a relative time feature where, by simple drag and drop, here’s my current month’s sales, and here’s my same month last year, OK? As an end user, I didn’t have to calculate that. But how do I visually show that in a nice way?

39:03
As of release 4, I believe — or is it 5?

39:08
Double-check that. There’s a new widget called the KPI widget, and it allows me to do this in a nice graphical manner.

39:17
So I can take my current month’s sales.

39:21
It’s my base value.

39:23
I could take the same month from last year and put it in my target value.

39:30
The comparison is done automatically.

39:33
I can make this widget look.

39:35
Nicer. No.

39:38
Sales.

39:40
Year over year.

39:44
Current month.

39:47
Brush it up a bit. I go over here to my properties, and on the chart here, I don’t want to show the label.

39:57
Maybe that’s what I want to show. OK, I’ve done it in, basically, a couple of drags and a little bit of formatting.

40:04
I think it’s worth pointing out, Pedro — sorry to cut in here.

40:08
Just for the Tableau and Power BI users, or those who haven’t used any of these tools: to make this really useful, you have to create those measures over there on the left — the year-over-year comparisons that everybody wants to do. You have to create those in Cognos, you have to create those in Tableau, you have to create those in Power BI, and kudos to Cognos for actually implementing the ability to generate those things fairly easily. Pedro did a webinar on just this KPI tool, so it’s worth looking at. They’ve all got KPI widgets now, but again, I’d give the nod to Cognos here in terms of being the easiest to do what he just did there.

40:45
Power BI’s is OK — not quite as good, but it’s a widget. And Tableau only offers it on Tableau Server at this point, and it’s more limited. But if you’re really going to implement self-service on any of these, you’ve got to have those comparisons out there, understand what those comparisons are, and make sure users can either create them themselves or you provide them.

41:10
Thank you, Mike. Yes, definitely. So, yeah, we started off kind of independent of the BI tools, and one of the things that came from that process was: we need to be able to do this.

41:21
And then we’re drilling into the tool itself and showing you how you can do that. We’ve talked about architectures.

41:25
And this is one aspect of it — I wanted to show exposing this type of thing. And we’d probably organize this a little better: not just simply exposing the sales metric, not just simply exposing, you know, tax amount, but exposing calculations and metadata that allow them to get to the true business need, to answer the goal.

41:46
That’s what we’re trying to get to over here. And that’s why I wanted to show you this: if we didn’t have that, I still couldn’t do it, right? I’d still have to figure out a calculation, I’d have to do something else. I could make that calculation in the data module, but I’d have to spend time doing it. We’ve done it up there for them, so it’s easy from a drag and drop. The other thing I want to show you, which is kind of beyond the toolset, is being able to add and augment data.

42:12
OK, let me close this over here.

42:17
And, I’m going to bring my window down here. I’m going to actually go ahead and get out of my presentation over here.

42:23
I’ve got a spreadsheet here on my desktop, and this spreadsheet, you know, has good information in here.

42:30
It has my sales by salesperson, and it has a sale goal.

42:36
And it has, there’s the current sales.

42:39
How many of you out there really have very defined Excel spreadsheet processes that you’ve been doing for many, many years?

42:49
And you’re not about to put it into a data warehouse process. But you’ve got a process from A to Z where you get your own sources, you put it all together, and you get an ultimate spreadsheet like this.

42:59
And I want to report on this in the BI tool.

43:03
Well, without wasting so much time figuring out how to reverse engineer your process, I’d like to put this spreadsheet up into the BI tool itself. We might eventually get to reverse engineering the process of how we got here.

43:16
But this is really what I want to report on. There are so many folks out there who are basically doing desktop data prep — and there are a lot of tools out there that do desktop data prep as well.

43:28
But how do I get this into Cognos? I don’t know how many people don’t know this, but it’s a really simple thing. You just go over here.

43:36
You drag it on top of Cognos, and the file gets uploaded and gets read and analyzed.

43:42
OK, and now that spreadsheet is in my Cognos environment.

43:46
So one of the aspects of the semantic data layer was being able to augment the data with external data, and it becomes very simple.

43:57
Now, this spreadsheet is in my content folder, Here it is, Sales Goals, it was uploaded on September 24th, and what I want to do is modify my data module to add that in there. So, I’m going to add a new source.

44:13
Go to my content, and I’m going to add Sales Goals, and OK.

44:19
There’s my sales goals spreadsheet. I’m going to save that.

44:24
I’m going to close this out.

44:27
I’m going to go back into that data module here, and I’m going to create a dashboard on that.

44:34
Again, there it is — I have a new set of data. That took all of several seconds. Once people are trained and understand this, I can report off of this.

44:42
Now let’s go ahead and do a target — a sales-to-goal comparison. OK, so I’m going to go ahead and drag over that KPI widget again.

44:53
All right, I’m gonna go that spreadsheet.

44:56
I’m going to say the base value is the salesperson’s sales.

45:00
And here’s a sale goal.

45:03
OK, and this one actually has an area for quarter, so I want to go for this quarter, put it in the filter, and there I have it.

45:14
I’ve done my analysis. I could do reporting off this in the reporting tool, or I have it here on my dashboard. OK, so I picked two aspects of what I was trying to present and trying to emphasize that.
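For readers following along outside Cognos, the underlying comparison the KPI widget draws is just actuals against a goal. A hypothetical pandas version of that uploaded-spreadsheet analysis (file and column names are made up):

```python
import pandas as pd

# Hypothetical uploaded spreadsheet: goals and actuals by salesperson and quarter
goals = pd.read_excel("sales_goals.xlsx")   # salesperson, quarter, goal, sales

# The same comparison the KPI widget shows: actuals against goal for one quarter
q = goals[goals["quarter"] == "2020-Q3"].copy()
q["attainment_pct"] = q["sales"] / q["goal"] * 100
q["variance"] = q["sales"] - q["goal"]

print(q[["salesperson", "sales", "goal", "attainment_pct", "variance"]])
```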

45:26
If you do this right, if you train people right — and all these BI tools are almost on par; they just do some things a little differently, and some are better than others in certain aspects — this will really help your self-service BI implementation. So let me go back to my presentation really quick here.

45:44
Finish it up.

45:48
And go full screen.

45:50
OK, and I gotta switch, sorry, I got two screens over here, and there we go.

45:54
OK, so, let me summarize everything here, as we finish up the presentation.

46:02
So, self-service BI implementations really require a tight partnership with your business partners.

46:10
I’m trying to emphasize that you can’t just install the tool, serve up some data, and expect it to be successful. You need to involve your business sponsors early in the game.

46:23
That net-new BI tool will not mean automatic success.

46:28
But if you implement it properly, you put in a good foundation, you do the upfront work to really analyze your business sponsors’ goals and workflow processes,

46:39
and you have the technology and the business partnership together, then you will get to that successful level — you’ll get to the point where that implementation will be successful for you.

46:50
OK, so we have some additional resources out there, because I just touched on some things. If you want to drill deeper into the technical aspects, we have webinars on Cognos FM versus data modules.

47:05
We have a blog on those two things, data modules and their new capabilities; also search out there for that KPI webinar, which covers what I showed.

47:14
OK, now, I’ll hand it over to Mike now.

47:27
Mike?

47:32
Had to unmute myself — sorry, Pedro. Please stick around for the Q&A, guys; this is real quick, and do get those questions into the question pane. So, a couple of quick slides about Senturus. We are the authority in business intelligence. We focus exclusively on business intelligence, with a depth of knowledge across the entire BI stack.

47:54
Our clients know us for providing clarity from the chaos of complex business requirements, disparate and ever-changing data sources, and constantly moving targets. We’ve made a name for ourselves because of our strength in bridging the gap between IT and business.

48:09
We deliver solutions that give you access to reliable, analysis, ready data across your organization, so you can quickly and easily get answers at the point of impact, in the form of the decisions you make, and the actions you take.

48:20
As Pedro is showing here, our consultants are leading experts in the field of analytics.

48:24
With years of pragmatic, real-world expertise and experience advancing the state of the art, we’re so confident in our team and our methodology that we back our projects with a 100% money-back guarantee that is unique in the industry.

48:38
And we’ve been at this for quite a while now — coming up on two decades. We have focused exclusively on business intelligence for coming up on 20 years, working across the spectrum from Fortune 500 to mid-market companies, solving business problems across virtually every industry and functional area, including the office of finance, sales and marketing, manufacturing, operations, HR, and IT.

49:02
Our team is large enough to meet all of your business analytics needs, yet small enough to provide personalized attention.

49:10
We invite you to expand your knowledge and explore the resources available to you on the Senturus website at the URL shown here. There are hundreds of resources there, ranging from webinars on all things BI to our fabulous, up to the minute, easily consumable blogs on what’s top of mind, Again, solely focused on business intelligence.

49:31
We’d be remiss if we didn’t bring up our complete offering around BI training. We offer training in the top three BI platforms — Cognos, Power BI and Tableau — which is ideal, particularly for organizations running multiple platforms or those who might be moving from one to another.

49:46
We can provide training in many different modes,

49:48
ranging from tailored group sessions to one-to-one or one-to-a-few mentoring, instructor-led courses, and even self-paced e-learning, so we can easily match to suit the needs of your user community.

50:03
And then finally, before we get to the Q&A: visit senturus.com, where we have hundreds of different free resources, ranging from product reviews to tech tips, and you can also see upcoming events there. We didn’t place any in this webinar, I believe, but if you go over there and take a look at the events, you’ll see whatever webinars and other events we have coming up on our calendar.

50:25
So go over there and make sure you bookmark that. And with that, we come to the Q&A. Pedro, I’m not sure if you had a chance to take a look at the question log at all, but the one here up front is asking about co-development processes. Co-development process — OK, that’s interesting. Actually, that’s a really interesting topic, because at a client where we’re helping with a Cognos implementation right now — and it’s actually been going on for years — there is this concept of co-development.

51:04
Co-development goes beyond just figuring out what reports people need. We sit with the client, we show them reports that we produce, but we also show them how to create their own reports. We show them how we did it.

51:18
We have these sessions ongoing from the start of the project to the end of the project, so that the customer is not just dropped into the Cognos portal, shown how to execute a report, and then told, by the way, you could take some training sessions on how to do modeling, how to use data modules and all that. This company, who we’ve helped along the way, is actually very proactive.

51:44
The development team sits with the client, and the customer really understands — basically the process that we outlined here — and it gets beyond just creating the reports for them: we show them how to do it, show them how to fish, so they can basically do it themselves. That’s what co-development really means.

52:07
That’s going to be a cultural shift, because a lot of people don’t like to do that. A lot of IT just says, here it is, we’re done. It’s a cultural shift in your organization to be able to get to that point.

52:16
Yeah, that’s really the key to it, right? They’re used to being spoon-fed things from IT that take forever to get developed.

52:23
And they just consume that or, more likely than, not, they dump it out to Excel and go do all their work over there.

52:31
So there’s the whole idea of teaching them how to have a conversation with their data. And there’s a lot to that.

52:41
So, there’s a question about Power BI: what is the equivalent of the Cognos data module

52:49
Or, FM package?

52:53
Well, with the other major tools, like Tableau and Power BI — with Power BI, you have Power Query, and Power Query is the data transformation tool that allows you to query different data sources and transform that data. And it’s actually very powerful.

53:15
And I think it’s a fair statement to say that data modules and datasets — the Cognos self-service version of that — arose from the market pressure that Tableau and Power BI put on Cognos to develop self-service tools.

53:34
So there are a lot of similarities there. The Power Query tool is very powerful. It’s also a little trickier, right, because it uses its own language — it uses M. Even though it’s graphical, there’s a lot you can do, but it’s a little more complicated. Whereas Tableau now has relationships; they basically have an abstraction layer now on top of what was originally a single, tabular view of all your data that didn’t allow for sophisticated data modeling.

54:02
And now they have an abstraction layer that allows you to relate different data sources in different ways, and allows for things like multi-fact joins and things of that nature through their data source capabilities. But they still don’t have a lot of transformation capabilities; you can do some of that with Tableau Prep.

54:21
Cognos, I would argue — and I’m answering more than the question here — but Cognos with data modules has some limited transformation capabilities; it doesn’t do a lot in the way of transformation. It’s still very much the idea that the data you get should be largely ready for consumption, and then you do a bit of modeling, change some field names, and add calculations. So hopefully that answers your question.
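As a neutral illustration of the kind of light, consumption-ready shaping being described (rename fields, add a derived column, leave heavy transformation upstream), here is a small pandas sketch; the file and column names are invented, and in practice this work would happen in Power Query, Tableau Prep, or a Cognos data module rather than code.

```python
import pandas as pd

# Raw extract with system-style column names (hypothetical)
raw = pd.read_csv("erp_sales_extract.csv")   # CUST_NM, ORD_DT, SLS_AMT

# Light shaping: friendlier names plus one derived field;
# heavier transformation stays upstream in the warehouse or a prep tool
prepared = (
    raw.rename(columns={"CUST_NM": "customer", "ORD_DT": "order_date", "SLS_AMT": "sales"})
       .assign(order_date=lambda df: pd.to_datetime(df["order_date"]))
       .assign(order_year=lambda df: df["order_date"].dt.year)
)

print(prepared.dtypes)
```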

54:48
Let’s see. What is the recommended Cognos self-service tool for creating reports not dashboards?

54:56
So, those of you who’ve been around legacy Cognos for a while — the Cognos 10 users — you know, right? You had Analysis Studio, Query Studio, and the other studios.

55:10
Well, it’s all been compressed down now. I still call it Report Studio, but Create Report is the report writing tool in the new series of Cognos 11.

55:25
It allows you to go very narrow and also very deep and broad in terms of your technical capabilities. So it’s one place to go for simple reports, simple ad hoc report writing, but if you want to learn more about the tool and dig into how to create highly sophisticated pixel-perfect reports, even manipulating the queries in the back, it’s in the same tool.

55:46
So it’s nice from the perspective of not having to know which studio to go to — it’s the report writing tool in Cognos 11, as opposed to dashboards, for most users

56:00
Who are creating analytical objects independent of the data side of it?

56:04
You’re working in the dashboard creation tool, or you’re creating a report, and that’s all there is to it — it’s very simple now.

56:14
Yeah. And then there’s another sort of question around sources: can cubes, either dynamic cubes or Transformer (i.e. PowerPlay) cubes, be used as a source for data modules?

56:25
As far as I can remember right now —

56:28
No, Mike.

56:29
I know, it didn’t.

56:31
Well, I think you can use them in a package, and then a package can be leveraged into a data module, right? That’s kind of the method. I’m not sure if you can do it directly — I don’t think you can do it directly at this point.

56:43
I know it’s on their slate of things to do, because there’s a lot of legacy customers out there with power cubes.

56:51
Yeah. Yeah.

56:51
So, and I think, right, Mike, you can still instantiate Analysis Studio, or have it resurface itself, in 11? Yes, you definitely can. Yeah. So that’s huge — it will still be around long after you and I are gone. That’s right. Yeah. I will also add that data modules are getting the most attention and development budget out there. So every time there’s a new release, there’s always something they’ve added to data modules.

57:25
So maybe that’ll be in their next release. FM is basically being held steady — it’s not deprecated, but they’re not going to add any more new features to it. So data modules already have features that are above and beyond FM.

57:43
But there are still some gaps where there are certain things that FM can do, that only FM can do, like parameter maps and things like that.

57:51
But the lion’s share of development dollars has gone into data modules.

57:57
Yeah, absolutely, because that’s where the move is — the hot thing now is self-service analytics. So the net of it, I think, is: you can get there.

58:05
You can enable self-service analytics with Cognos or any of the major tools with Power BI or with Tableau.

58:13
Plus or minus.

58:14
you know, certain features or capabilities. And the path you go down, and your relative ease with it, is a function of how you plan for it and how you handle not just the technology but that culture shift and managing it — and that’s where it gets tricky. And that’s what we help our customers with.

58:34
So, unless there are any other questions, I think we covered everything here, and we’re approaching the top of the hour. So if you want to move to the last slide, Pedro.

58:43
First of all, a big thank you to you, Pedro, for a great presentation on a complicated topic, but important one.

58:52
And thank you to our attendees for taking an hour out of your day today to join us.

58:57
We always enjoy seeing you at our presentations. Thanks for your great questions, and we look forward to seeing you at a future Knowledge Series event. If you need help with any analytics needs, whether it be consulting or training, please feel free to reach out to us at senturus.com. If you’re old school and want to actually pick up a phone, we have a toll-free number there, or you can always email us at info@senturus.com. Thank you very much for your time today, and again, we look forward to seeing you at another Senturus event. Thanks, and have a great rest of your day.

59:29
All right, thank you.