How to Successfully Implement Cognos Self-Service


How to Successfully Implement Self-Service Analytics

September 24, 2020

Cognos, Data Prep & Modeling, Deployments & Performance Optimizations, IBM Analytics & Cognos

Agile, Governed Self-Service BI with a Focus on Cognos Analytics

Learn what it takes to achieve the powerful reality of agile, governed self-service analytics with any BI tool, and with Cognos Analytics specifically. Expensive self-service BI implementations often serve as nothing more than a simple data extract tool that eventually feeds downstream processes into Excel. Whether you’re running Cognos, Power BI, Tableau or a combination, watch this on-demand webinar to get valuable information for achieving well-adopted self-service that delivers exceptional ROI.

Topics we address about self-service analytics

  • What to expect if self-service analytics lives up to its promise
  • Best practices for achieving the goal of self-service
  • Self-service requirements: discovery vs. gathering
  • The value of the semantic layer/business logic to enabling self-service
  • Pros and cons of the three main types of Cognos architecture
    • IT-driven enterprise model
    • End-user driven model
    • Hybrid model
  • Cognos self-service components

Pedro Ining
Principal BI Analytics Architect
Senturus, Inc.

Pedro joined Senturus in 2010 and brings over 20 years of BI and data warehousing experience to his role. He’s been instrumental in implementing data warehousing systems from scratch and has experienced the evolution of the BI industry through several iterations of BI products including Cognos, MicroStrategy and Tableau.


Cognos Analytics
Cognos Data Modules
Cognos Framework Manager


Greetings and welcome to this latest installment of the Senturus knowledge series. Today we’re excited to be presenting on the topic of how to successfully implement self-service analytics.

Before we get into the presentation today, a few housekeeping items. You can use the GoToWebinar control panel to help make the session interactive. While we have everyone’s microphones muted out of courtesy to our presenter, we encourage you to enter questions through the question pane in the control panel.

While we’re generally able to respond to your questions while the webinar is in progress, or at the end, if for some reason we’re unable to reply during the webinar, we’ll cover it in a written response document that we will post on our website.

And speaking of which, head on over there to obtain today’s presentation.

We always get that question early and often throughout the webinar. You can go get it at /resources, or go to our website, select the Resources tab, and look at the Resources Library. Alternatively, the link has been placed in the GoToWebinar control panel, and you can access it there. While you’re there, be sure to bookmark it, as it has tons of valuable content addressing a wide variety of business analytics topics.

Some introductions. Today I’m pleased to be joined by my esteemed colleague, Pedro Ining.

Pedro has been with Senturus for a decade now and brings over 20 years of BI and data warehousing experience to his role here.

He’s been instrumental in implementing data warehousing systems from scratch and has experienced the evolution of the BI industry through several iterations of BI products, including Cognos, MicroStrategy and Tableau.

And he’s newly minted as an MCSA on the Microsoft stack, so congratulations on that, Pedro. We have a quick poll here.

We always like to get some input from our audience, and today’s poll is: what percentage of your Cognos platform is used solely for traditional canned reporting? Pretty obvious choices here; estimate to the best of your ability: 25, 50, 75 or 100%, don’t have Cognos, or have absolutely no idea.

So go ahead and get your answers in there, and we’ll give people a few seconds to mull this over.

Pretty easy question. We’ve got about two-thirds of you responding so far.

Give it a few more seconds.

It seems to be flattening out, so I’ll close it out and share. About 45% of you say 75% of your platform is used solely for traditional canned reporting. This is a tough one to read back to people, but another 20% or so say about half, 15% or so say it’s used solely for that, and 9% say only 25%. Interesting. Thank you very much for your insights. And with that, we’re going to get into the core of the presentation, so I will hand the microphone and the floor over to Pedro. Pedro, take it away.

Thank you, Mike. That was an interesting poll: a majority of existing Cognos implementations are still largely used for canned reporting. So we’ll weave that in. But we’re going to talk today about self-service BI. And it seems like from the dawn of BI, when we started talking about information management and things like that, way back in the nineties.

Sometime around then, self-service has always been kind of a goal, and in the last few years there’s been explosive growth in new, modern BI tools.

We have some of them over here: Qlik, Power BI, Tableau.

And, of course, there’s IBM Cognos Analytics. Cognos Analytics has been in a unique spot, where it’s been seen as a legacy tool for a while.

It’s been around for a long time, and it’s not necessarily associated with modern self-service BI the way Tableau and Power BI are. We’re going to talk about that later in the presentation.

But we’re going to talk about self-service BI first. We’re gonna kind of pull ourselves away from the technology a little bit.

Let’s talk about, what is self-service BI? I’m sure a lot of you out there can come up with a definition, and there’s probably a lot of them, a lot of definitions out there. I’m gonna pull one out here on the screen, and we can kinda drill into this.

So I have a definition here: self-service BI allows business users to access more data sources on their own, model their own data,

and create reports or dashboard visualizations with very little help from IT.

One of the promises is that it can lead to faster and more agile data analytics compared with traditional BI development.

So if you take that first part of the definition, allowing users to access more data on their own and model their own data: in traditional BI development, that was pretty much something a lot of tools didn’t really support.

It was more of an IT report factory: meet with us, we’ll model the data for you, wait a few months and we’ll get back to you.

So the modern tools have kind of flipped that script a bit and said, you know, we’ll give you the tool, and you can have everything on your own. Do it your way.

That, supposedly, is going to lead to a faster and more agile data analytics environment compared with the old method. So that’s the promise of self-service BI.

It’s basically saying, let’s give business users direct access to data, and as long as we give them that access, we’ll get all the report burden off of IT; we won’t have to do anything anymore.

Give them access to tables. Give them access to the data warehouse.

Yeah. That’s the promise.

Then we’ll have better decision making, and it will happen more quickly. We want it to drive business growth; ultimately, all BI systems aim to drive business growth more quickly, so we can respond to the market. And with the advent of big data, that’s even more important.

And another promise is that these new modern BI tools, they’re gonna give you more visual, more insightful, more automated analysis.

You know, the users themselves can create and iterate on their own analytic reports and dashboards, not IT. But then, we have to ask a question.

Then why do so many self-service BI efforts go wrong? How many of you out there have been in supposed self-service implementations? How many of you have gone out and bought 200 licenses of Tableau, people install it, and they don’t really use it, or maybe just for a little bit? How many of you have tried to put Cognos out there, and it just becomes more of an extract tool than anything else? Probably the ultimate self-service BI tool is still Excel, even to this day.

So what are the issues? What are the most common misconceptions when it comes to self-service BI?

How about this one: IT is going to buy the tool.

We’re going to install it, build it up, and from a platform perspective, we’re done.

We’ll make the data available and give it to the end users. But what inevitably happens is users get to a certain point.

They’re going to hit those roadblocks.

And they’ve had minimal training, maybe.

And the modeling, the semantic layer of the tool, only serves up the basic raw data, to the point where these users throw up their hands because they can’t use it.

Another scenario where BI tools are in place is as an extract tool: do the rest of it in Excel. Another misconception is that self-service eliminates the need for IT, or at least minimizes it.

But ultimately, there’s complex data out there, and it still needs to be modeled. Simply loading up Power BI, Tableau or Cognos and pointing it at a data source doesn’t remove the need for IT, or an administrator,

or somebody working with the business to help model that data in a way that lets the BI tool act like a doorway to that IT-controlled data, through a nice semantic layer.

Now, there may be some users out there who know that backend data very well, but they’re probably in the minority. The majority of users will need a properly modeled metadata layer to make this work.

Another misconception.

We’ve got a brand new BI tool, we’re spending a lot of money on it, so it’s going to be a successful project.

Ultimately, though, your foundational data layer, your foundational data governance, and your foundational business processes around self-service analytics have to be strong in order for that tool to work.

And another common misconception.

Users will automatically understand how to use the tool. Change management and user training are critical. There are a lot of very smart people out there.

They’ll get to a certain point, but they’re always going to need help, and ultimately it goes right back to the data; it’s a data issue. So these are some common misconceptions.

So where do we focus? Where do we focus our efforts to help with implementing a self-service BI platform? Here are some ideas that we’re going to walk through, some best practices for this area.

We’re going to talk a little bit more about requirements discovery instead of gathering, and we’re going to focus a little more on Cognos self-service architectures. This is where we get into the tool; it’s slanted toward Cognos, but I think you can extrapolate some of it to whatever tool you’re thinking about implementing.

And maybe it will give you second thoughts if you’re thinking about moving away from Cognos to other tools, because a lot of things have changed in that area.

OK, so let’s talk first about some best practices for self-service BI. I’m sure other people have thought of a lot more, but let’s talk about these first.

Data governance. Continued use of a very good data governance process for existing data warehouses and sources is very important. Data warehouses are not going to go away simply because users can take a tool and point it at the source of the data.

In fact, having a user install a tool and point it at your ERP system’s customer tables is probably a recipe for disaster.

Data warehouses have matured quite a bit over the years, to the point where all the cleansing and QA is done and people have signed off on things like: this is our governed customer dimension, these are our products, this is what you want to look at. Relying on that curation of key enterprise data is going to be a critical component of continued data quality.

So when you start implementing self-service BI and installing these tools, having data governance, having a catalog, and having users understand where the proper data sources are is very important.

Now, we’ve done another webinar on this topic recently; it’s on our website and it’s called Why Bother with Data Governance? I’d suggest you take a look at it, because data governance is going to be a very important piece of your self-service implementation.

Requirements discovery. We’ll talk about this in more detail, but you need requirements gathering techniques that go beyond the typical “What do you need? What kind of reports do you need?”

We need to transform that into more of a discovery process.

Getting more insight from your meetings will allow you to create better self-service.

We’ll drill into that in a minute.

Semantic layers. You need a semantic metadata layer that does not overburden the users. A lot of effort needs to go into that area, and that’s not going to go away. Again, dropping the tool on a bunch of tables, for users who don’t know what those tables do or how to join them, is going to lead to a not very successful implementation.

But at the same time, simply serving up a very nice metadata layer that just has dimensions and maybe a core metric, like sales, doesn’t get to the next step. Because what happens in requirements sessions is end users say, “I need to see sales by customer, by product, by location.” You do that, and you think you’re done. But we don’t know that the users need to do other things, like a year-to-date or year-over-year comparison.

They bring up that semantic layer in a nice tool, and they can do sales for the current year, but how do I compare it to last year?

“I don’t know how to do that, so I’m going to extract the data, put it into Excel, and do it that way.” They throw up their hands, give up, and do what’s more comfortable.

Also, another aspect of the semantic layer: we need the BI tool to allow users to modify and augment data with local data. All modern BI tools do this now, and Cognos does too. Making sure users understand how to do this is important, because your data sources will not have everything. Your ERP systems, your databases, your data warehouses have a lot of data, but there could be things that users have on their desktops: spreadsheets with demographic data they’ve downloaded from another site, geographic information, census data. Things that you can’t put in your data warehouses really quickly.

And they need the agile capability to put that in and integrate it into the semantic layer on their own,

and not wait for IT to do that. Training: very important, that’s critical.

A blanket training approach, just putting out a bunch of videos on how to do report writing, may not necessarily satisfy the needs. We’re at a client right now where we’re doing a lot of coaching and co-development sessions.

It’s really an aspect of BI training to also understand what their issues are with the existing analytical process. So there’s a technical side, and there’s helping them culturally shift their thinking, from just telling IT what kind of reports they need to talking about what the goal is. We’ll talk about that, too. Performance tuning is very important.

Obviously, if your system is not up to snuff and reports take a long time to run, people are going to think of it as a failure.

So we have to make sure that IT has the infrastructure and the platform capabilities to monitor the utilization of BI.

For all the artifacts, we have to have a very good usage analytics system to analyze who’s using what and who’s not. We’re always focused on who’s using the system, but also ask who’s not using the system.

You may find that key business sponsors are not using the system, and then you can intercept that particular issue and interview those folks to understand the reasons why. Now, that last bullet there, recognizing that not all areas are suited for self-service, seems kind of counter to what we’re trying to do.

But you have to recognize there are going to be areas in your process and your business where people still need professionally built IT reports: things like regulatory filings, very complicated reports that have to be in a certain format. So a percentage of your system, if you want to think of it that way, is still a little bit of a report factory.

That’s always going to be there: professionally developed dashboards and reports that have to go out and have to look a specific way.

So go into your self-service implementation not thinking that everything is going to be 100% self-service. There are going to be areas that are not suited for it; just realize that.

So requirements. Let’s talk about that a little bit.

Discovery versus gathering. We always talk about requirements gathering. We end up in our user sessions: we’re the BI professionals, and there’s the business.

They’re waiting for us to interview them, and they’re eager for a BI system that’ll help them satisfy their needs.

But we tend to gather requirements in the sense of: what kind of reports do you need? Can you give me samples of those reports? What kind of measures do you need? Where’s the data you want me to get it from? Do you need dashboards? And then we get into the specifics of what the tables are.

This leads us, as BI professionals, into a more comfortable discussion of tables, fields and reports, and it basically stems from the base question: what do you want, or what do you want me to do?

So we need to morph our requirements gathering techniques into something that will better uncover the client’s goals.

You know, we need to be able to understand a client’s business process or workflow.

We need to be able to discuss existing pain points.

Then, as part of this, we can consolidate it into a user story. This is a tool, in a sense, to put all this together and not talk so much about the BI tool implementation itself.

So, determining and understanding the client’s goal. If we talk about this first, the discussion is independent of the BI tool’s features.

Oftentimes, the end user’s goal is in lockstep with their job responsibilities.

For example, you might have a discussion with a C-level executive. He wants to see a current snapshot of FTE headcount: how it compares to last year, what areas are growing. A staffing manager wants to see headcount by company location and which location is experiencing the most growth.

These are goals. They’re not talking about fields and reports; they’re telling you what their goals are. An HR director would also like to analyze the current workforce, but wants to see a breakout of exempt versus non-exempt.

Maybe discover what parts of the organization are experiencing the most attrition.

So those are some of the goals we’re trying to get to. But then you start drilling in even more.

And we start thinking about, you know, a user’s business process and workflow.

There’s a workflow that usually starts with some sort of triggering event from the left.

What prompts the user to seek more information, to pursue a goal?

That could be anything. Maybe it’s a scheduled regulatory filing, and he has a certain set of tasks to complete it. Maybe a metric comes out of line and he gets a phone call from his management team.

And he has a series of tasks that he has to do.

Maybe he has to run five reports, and maybe some of those reports are missing. He comes to you because he’s in the middle of this workflow process.

And you come at it from a self-service perspective, focusing on the tasks and reports in the workflow: what are the steps that contribute to that decision making?

And then, what are the friction points there?

If you actually go through this, you’ll probably find that the current process might be too reliant on canned reports.

Could self-service analytics help with answering questions that satisfy the tasks, satisfy the sequence of steps, and ultimately reach the goal for the end user?

As you discuss that current workflow, you’ll see the issues that people bring up.

So if we get away from “what kind of reports do you need and what do they need to look like,” and start talking through things in this manner, you’ll find out what those issues are.

I’ll give you some examples. Users don’t trust the data: that’s a very common comment when we go into a requirements discovery session. They’ll say, you know, that data warehouse isn’t really correct, or these reports aren’t correct.

I want to find out what those data quality issues are, because that’s going to influence your self-service implementation. Are they actually data issues, or maybe just simple report logic errors? We’ve had examples where a user is having issues with data: they run a report and say, look, the numbers aren’t right. You drill into it, and somebody did a wrong calculation in the report. They’ve been using that report for X number of years, they know it’s wrong, and they just fix it themselves. It could be something as simple as that, or it could be data quality issues upstream.

The quote “users don’t trust the data” can imply a lot of different things.

And as you go through that workflow, that might be one of the issues that they actually bring up.

How about another one? It takes a long time to get the answer.

Could be a simple performance issue.

It could be resolved with database tuning, but maybe you didn’t know about it until you went through the workflow process. But it could also be, and this is an interesting perspective, the result of a more complicated end user data prep activity.

So when they say it takes a long time to get the answer, you run the reports or do the analytics in the self-service environment and say, well, no, it only takes a minute to get the answer.

And as you expand on that, the end user will say: well, yeah, but I have to take that down to my environment, down to Excel. I have to run four other reports over here in the BI system, then I have to integrate those four together, and integrate that with the data from the demographics file on my desktop, before I finally get the answer. And you didn’t know that, because you didn’t ask the right questions.

And all of that could be implemented upstream.
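To make that friction concrete, here is a minimal pandas sketch of the kind of blending the user describes. The report extracts, table names and columns are hypothetical, purely to illustrate the desktop integration work that could be moved upstream into the governed layer.

```python
import pandas as pd

# Hypothetical stand-ins for extracts the user pulls from the BI system;
# in practice these would be CSV exports of canned reports.
sales = pd.DataFrame({"region": ["East", "West"], "sales": [100, 150]})
returns = pd.DataFrame({"region": ["East", "West"], "returns": [5, 8]})

# Hypothetical local demographics file on the user's desktop.
demographics = pd.DataFrame({"region": ["East", "West"], "population": [1200, 900]})

# The manual blending step the user does in Excel: join everything
# on the shared key, then derive the number they actually need.
combined = sales.merge(returns, on="region").merge(demographics, on="region")
combined["sales_per_capita"] = combined["sales"] / combined["population"]
```

If interviews surface this workflow, the joins and the derived measure are candidates to be built once, upstream, rather than repeated on every desktop.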

I’m going to focus again on a very simple example. The user comes to you and says, well, I can’t do a simple year-to-date calculation in Cognos. It’s very specific, but it points to a general problem with the semantic layer. We need to put things like YTD or same month last year into the semantic layer. Don’t leave it to the end user: find out what those calculations are and embed them into the semantic layer, so users don’t throw up their hands trying to figure out how to do something they can’t do.

Another aspect, something in your toolbox you can use as you gather requirements: create a user story based on your findings. We’ve gone through all of that, we’ve applied our best practices, we’ve tried to understand the person’s workflow.

Type it all out in a nice set of paragraphs, and you can actually pull some nice nuggets out of what you’ve written.

So, for example, in what I’ve written over here, again an FTE headcount type of example: John is an HR analyst. As you type this out, you see that he periodically reviews the current workforce: where it currently stands, and how the FTE headcount compares to the same period last year. So right there, I pull that out: how does it compare to the same period last year? John is going to want to do that.

And that’s a clue to me that I need to provide that type of calculation, that type of processing, in a nice, easy fashion in the self-service metadata layer. He also aims to break it down by organization and location, and to see tenure. He needs to figure out tenure: I’m serving up the FTE headcount, but how is he going to calculate tenure? I need to have the employee’s start date, but do I have to make him subtract the current date from the start date and do the average? I could put that in the semantic layer for him. Then you get to the second paragraph, and as part of what your findings show in terms of issues: we gave him a Cognos package, and he sat down with our IT analysts, who showed him how to work the package, how to query it.
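A minimal sketch of that tenure idea, with hypothetical employee data: pre-compute the date arithmetic once, upstream, instead of making the analyst derive it in the front end.

```python
import pandas as pd

# Hypothetical employee records; the source system only stores a start date.
employees = pd.DataFrame({
    "employee": ["Ann", "Bob"],
    "start_date": pd.to_datetime(["2015-03-01", "2019-09-15"]),
})

# Pre-compute tenure in years as of a reporting date, so the analyst never
# has to do the subtract-and-average date math in the front-end tool.
as_of = pd.Timestamp("2020-09-24")
employees["tenure_years"] = (as_of - employees["start_date"]).dt.days / 365.25

# Averages then come for free in any tool that sees this column.
avg_tenure = employees["tenure_years"].mean()
```

In Cognos this calculation would live as a modeled item in the package or data module, not in each user’s report.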

He knows just enough to get to the point where he can extract data for reporting in Excel, because he’s never been able to calculate the change in FTE headcount year over year.

He finds it easier just to extract the data to Excel for final reporting. Thirdly, in the next paragraph, we write out that he needs to integrate other datasets into his final reporting.

So, as you put it all together and write it out, a lot of things just pop out that really help you discover what your self-service BI implementation needs to be, from the soft side of things.

Now, that was focused on aspects of implementation that need to be independent of the tool. I’m going to pivot a little bit and talk about Cognos self-service architectures. A lot of this still applies to Tableau and Power BI, but we’re going to drill into Cognos.

What can Cognos help you with nowadays in terms of self-service? Cognos 11, specifically the latest release, 11.1 release 7.

There are some key self-service enablers here, and Cognos has always maybe been thought of more as a reporting tool.

The polling results kind of show that: it’s mostly used for reporting, and people don’t realize there are a lot of nice new self-service enablement features in Cognos 11. I’ve put FM packages up there because they’re still going to be out there; they’re classic, and still useful.

If you have a legacy Cognos implementation and you’ve moved to 11.1, you’ve got hundreds of these packages out there. They can still be used for self-service in different architectures; we’ll go through that a little bit. And now we’re moving on to a more user-based modeling tool called data modules.

We’ll show the links later, but we’ve done a lot of webinars on each one of these topics that let you drill into how they help you. Data modules are the newest modeling tool for end users.

It’s much like Tableau and Power BI. What’s the first thing Tableau asks when you point it at a data source? It asks you to bring in some tables and join them. That’s what a data module does for the end user.

They don’t use the FM modeler to create packages; they use this tool.


Datasets. Tableau also has the concept of data extracts, the Tableau data extract; Power BI, I think, even uses the word datasets. It’s the ability for users to extract their own subsets of data from your source system so that they perform much faster in the BI tool, versus running queries back and forth.

OK, dashboards.

The visualization tool is getting easier and better with each release. I would still say, and people out there will probably agree with me, that Tableau is the best visualization tool out there.

That’s where it’s made its bread and butter.

But Cognos dashboards, especially with the latest set of releases, have gotten to a point where they’re almost there; definitely getting there. And if you’ve already got a Cognos implementation, not taking a look at that and just diving into moving everything over to Tableau, you’re kind of shortchanging yourself, depending on what you want to get done. And, of course, reporting: this has always been the workhorse of any analytical tool set, right?

People are always going to want to write reports. Tableau was never really meant as a report writing tool; Power BI has SSRS, and they’re kind of there in that respect. But Cognos has always been a great reporting tool. It can be as simple as a list report

to very complicated, highly formatted, pixel-perfect reporting. All these components make up, in a sense, the self-service enablers of the BI platform, and it would behoove you to look at your existing Cognos implementation as something you can start rolling out and letting users use for self-service. So how do we do that from a Cognos perspective? I’ve got some models here that I’m thinking about from a Cognos perspective.

Here’s the classic one, the IT-driven enterprise model. This is our classic approach, right? We have source databases.

Instead of discovering requirements, we gather requirements. We ask users what kind of reports they need.

We find out where the data sources are, we develop the FM model, and we create a package; maybe that package is very specifically optimized for Cognos report development.

You create X number of reports for your end users and they’re on their way. Or maybe that package itself is optimized for self-service.

Or maybe you create one Cognos package that’s optimized for professional report developers, and another Cognos package that’s used for self-service.

It’s all there. You can use an FM package for self-service if you do the things we were talking about: creating built-in, pre-built calculations, doing your homework.

It’s the classic model. It’s all out there.

Now, that’s all IT-driven. All that stuff is owned by IT.

Here’s another approach that’s kind of similar: IT-driven with Cognos data modules. You replace the FM modeling tool and package with a data module that’s created by IT. It’s kind of like your first step in trying to move away from FM.

If you’ve done the homework, the current feature set of data modules may be able to replace that feature set in FM. That’s a different topic altogether; I’d advise you to take a look at one of our webinars, which compares the two products.

Again, it's more IT centric, but you've moved the modeling development into the data module world, which gives you the benefits of built-in relative time,

data cleansing, and navigation paths. And one important aspect is that people can augment this with their own external data.

Your self-service analytics folks can literally link to this data module and then integrate other Excel sources into it for their own self-service data modeling.

On the other end of the spectrum is what I'm calling the end-user-driven model in Cognos.

This allows your self-service analytics professionals to really do their own modeling, to build their own semantic layers. I make this akin to Tableau, where the first thing you're asked to do is point it to a database, a data source.

We create the data server connection out there in Cognos. We have a bunch of power users who want to do that. They open up a data module, connect to the database, join two tables together, and they produce their own data module. So the only thing IT is doing here is making sure those databases are up, and that could be a data warehouse.

We have clients out there who have a lot of old legacy applications. Have you ever heard of PowerBuilder and things like that? ERP systems that were custom built, Visual Basic systems that have been around for 20 years with hundreds of tables, and analysts have been working on those systems for years. They write manual SQL queries against those particular systems. But they would love to have a BI tool go against that. Cognos can do that. You can go ahead and create that connection for these particular people.
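To make that contrast concrete, here's a hedged sketch, using Python's built-in sqlite3 as a stand-in for a legacy ERP database; the table and column names are invented for illustration. This is the kind of rollup those analysts hand-write today, which a governed data server connection lets them build through the BI tool instead:

```python
import sqlite3

# Invented stand-in for a legacy ERP table; a real system would have
# hundreds of tables like this with cryptic names.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ord_hdr (ord_id INTEGER, cust_cd TEXT, ord_amt REAL)")
conn.executemany(
    "INSERT INTO ord_hdr VALUES (?, ?, ?)",
    [(1, "C001", 250.0), (2, "C001", 125.0), (3, "C002", 400.0)],
)

# The kind of manual rollup an analyst runs by hand today; with a data
# server connection, the BI tool generates queries like this for them.
rows = conn.execute(
    "SELECT cust_cd, SUM(ord_amt) FROM ord_hdr GROUP BY cust_cd ORDER BY cust_cd"
).fetchall()
print(rows)  # [('C001', 375.0), ('C002', 400.0)]
```

The point isn't the SQL itself; it's that once the connection exists, end users get this aggregation by drag and drop rather than by writing it.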

So all IT is doing is supporting the platform and the databases. The end user is driving the model. They create their own datasets, another subset of users can take datasets from that data module and integrate them, and then also integrate their own uploaded files.

I mean, the one drawback of this model, I would say, is data consistency.

So if you open this model up to folks, they might point to the OLTP system, which has the customer tables.

And there's a data warehouse with customer tables that have already been cleansed, with business rules put in there.

If for some reason one analyst gets access to the OLTP tables and says, this is my customer list, while another analyst uses the data warehouse customer list, you're going to have a consistency issue. So it's another area where data governance processes have to be there from a foundational perspective in order for this to work.
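As a toy illustration of that consistency risk (the customer names here are invented), compare a raw OLTP customer list against the cleansed warehouse list; the entries only the OLTP side carries are exactly the two-analysts-two-answers problem:

```python
# Invented example data: the OLTP extract still carries duplicates and
# inconsistent spellings that the warehouse has already cleansed away.
oltp_customers = {"ACME Corp", "Acme Corp.", "Globex", "Initech"}
warehouse_customers = {"Acme Corp", "Globex", "Initech"}

# Entries the two "customer lists" disagree on -- the governance red flag.
suspect = sorted(oltp_customers - warehouse_customers)
print(suspect)  # ['ACME Corp', 'Acme Corp.']
```

Two analysts pulling from those two sources will report different customer counts, which is why the governed source needs to be the sanctioned one.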

Finally, I've got this hybrid model, and I think for a lot of Cognos implementations, this might be the Goldilocks model in a sense. The FM packages, like I was saying, have been out there for a while in large, legacy Cognos organizations. For them this model is a very appealing choice because, over the years, many FM packages have been developed, and they're very stable.

And they have great data integrity, and they have the blessing of the central IT organization; they're IT maintained, OK?

So we can integrate those packages and still give people the capability to do their own data modeling,

because you can stick that data module out there, bring in tables from the FM packages, and extract datasets from FM packages.

So, for example, you could extract your customer lists and your product lists from an IT-maintained FM package, which points at a data warehouse, as a dataset, and integrate that into your own data module, which is going against some other system. It could be the OLTP system. But I need a cleansed product list and customer list, as I mentioned, and I'll do the integration myself.

OK, and then, I’ll also integrate uploaded files, as well.

So, that’s some of the models out there that I wanted to bring up. Now, I’m going to go to a little demo.

I'm going to tab over to my Cognos instance here.

What I want to show you is some aspects of how the semantic layer helps the implementation. I've got Cognos 11.1 release 7 here, so if it looks a little different, if you guys haven't gone to release 7, this is the interface now. We've got some cleaned-up icons over here and the new design interface they're implementing. I'm going to go here to my content. I have a data module over here.

Let’s just go ahead and create a dashboard of this thing.

Pick the blank one over here. OK, here’s my metadata layer.

I'm doing self-service, and it looks fairly clean. I've got my sales, the sales location, and product

area over here. And under measures, I've got my sales over here.

That's my total sales and tax. I come up with a nice summary over here.

And I can go over here and, for example, filter on year. Pedro, we have a request: can you go full screen on that, so it's a little easier to see? Full screen, is that better? That's great. OK, no problem. So then I've got current year and boom, I've done my analytics: current year sales.

What's the next step they want? This is harkening back to what we were just talking about from a self-service perspective. I've basically served up my sales. I thought I'd done a great job from an IT perspective: served up my measures, served up my date dimension.

They could slice by products, by location.

The next thing a person is going to ask from a self-service perspective is: I want to compare the current month's sales against the same month last year.

So I've done all the basics, I've looked at this, I can slice it. But then, how do I do that comparison? A lot of times people get to that point and say, never mind, I'll download this to Excel, and then I'll create a dashboard in Excel. So let me go ahead and delete this.

What I want to expose, and we have another webinar that covers the tips and tricks, is really an attempt to show why those tips and tricks are so important.

Under here, I’ve enabled the relative time feature of the data module.

I mentioned that briefly: we can enable a relative time feature where, by simple drag and drop, here's my current month's sales and here's the same month last year, OK? As an end user, I didn't have to calculate that. But how do I visually show that in a nice way?

As of release 4, I believe, or is it 5? Double-check that.

There's a new widget called the KPI widget, and it allows me to do this in a nice graphical manner.

So I can take my current month’s sales.

It’s my base value.

I could take the same month from last year and put it in my target value.

The comparison is done automatically.
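Under the hood, the comparison the relative-time feature hands you for free is simple arithmetic. Here's a hedged Python sketch, with invented figures, of the current month versus same-month-last-year math a user would otherwise have to build by hand:

```python
# Hypothetical monthly sales figures keyed by (year, month); real values
# would come from the data module, not be hard-coded like this.
sales = {
    (2019, 9): 120_000,
    (2020, 9): 138_000,
}

def year_over_year(sales, year, month):
    """Return (current month, same month last year, percent change)."""
    current = sales[(year, month)]
    prior = sales[(year - 1, month)]
    pct_change = (current - prior) / prior * 100
    return current, prior, pct_change

current, prior, pct = year_over_year(sales, 2020, 9)
print(f"Current: {current:,}  Same month last year: {prior:,}  Change: {pct:+.1f}%")
```

The KPI widget is doing exactly this base-versus-target comparison; the value of the semantic layer is that the "same month last year" member already exists for drag and drop.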

I can make this widget look nicer.

I'll label it year over year, current month.

To brush it up, I go over here to my properties, and on the chart here, I don't want to show the label.

Maybe that's what I want to show. OK, I've done it with basically a couple of drags and a little bit of formatting.

I think it's worth pointing out, Pedro, sorry to cut in here.

Just for the Tableau and Power BI users, or those who haven't used any of these tools: to make this really useful, you have to create those measures over there on the left, the year-over-year comparisons, and everybody wants to do that. You have to create those in Cognos, you have to create those in Tableau, you have to create those in Power BI, and kudos to Cognos for actually implementing the ability to generate those things fairly easily. Pedro did a webinar on just the KPI tool, so it's worth looking at that. They've all got KPI widgets now, but again, I'd give the nod to Cognos here in terms of being the easiest to do what he just did there.

Power BI's is OK, not quite as good, but it's a widget. And Tableau only offers it on Tableau Server at this point, and it's more limited. But if you're really going to implement self-service on any of these, you've got to have those comparisons out there, understand what those comparisons are, and make sure users can either create them themselves or that you provide them.

Thank you, Mike. Yes, definitely. So we started off kind of independent of BI tools, and one of the things that came out of that process was: we need to be able to do this.

And then we drilled into the tool itself, showing you how you can do that. We've talked about architectures.

And this is one aspect of it: I wanted to show exposing this type of stuff, and we'd probably organize it a little better. Not just simply exposing the sales metric, not just simply exposing the tax amount, but exposing calculations and metadata that allow users to get to the true business need, to answer the goal.

That's what we're trying to get you to over here. And that's why I wanted to show you this: if we didn't have that, I still couldn't do this, right? I'd still have to figure out a calculation, I'd have to do something else. I could make that calculation in the data module, but I'd spend time doing that. We've done it up there for them; it's easy from a drag-and-drop perspective. The other thing I want to show you, which is kind of beyond the toolset, is being able to augment data.

OK, let me close this over here.

And, I’m going to bring my window down here. I’m going to actually go ahead and get out of my presentation over here.

I’ve got a spreadsheet here on my desktop, and this spreadsheet, you know, has good information in here.

It has my sales by salesperson, and it has a sales goal,

and it has the current sales.

How many of you out there really have very defined Excel spreadsheet processes that you’ve been doing for many, many years?

And you're not about to put it into a data warehouse process. But you've got a process from A to Z, where you pull your own sources and put it all together, and you get an ultimate spreadsheet like this.

And I want to report on this in the BI tool.

Well, without wasting so much time figuring out how to reverse engineer your process, I'd like to put this spreadsheet up into the BI tool itself. We might eventually get to reverse engineering the process of how we got here.

But this is really what I want to report on. There are so many folks out there who are basically doing desktop data prep, and there are a lot of tools out there that do desktop data prep as well.

But how do I get this into Cognos? I don't know how many people know this, but it's a really simple thing. You just go over here.

You drag it on top of Cognos and the file gets uploaded, and it gets read and analyzed.

OK, and now that spreadsheet is in my Cognos environment.

One of the aspects of the semantic data layer was to be able to augment the data with external data, and it becomes very simple.

Now this spreadsheet is in my content folder. Here it is, Sales Goals; it was uploaded on September 24th. What I want to do is modify my data module to add that in there. So I'm going to add a new source.

Go to my content, and I'm going to add Sales Goals, and OK.

There's my sales goals spreadsheet. I'm going to save that.

I’m going to close this out.

I'm gonna go back into that data module here, and I'm going to create a dashboard on that.

Again, I have a new set of data coming along; that took several seconds. Once people are trained and understand this, they can report off this.

Now, let's go ahead and do a sales-to-goal comparison. OK, so I'm gonna drag over that KPI widget again.

All right, I'm gonna go to that spreadsheet.

I'm going to say the base value is the salesperson's sales.

And here's the sales goal.

OK, and this one actually has an area for quarter, so I want to go for this quarter, put it on the filter, and there I have it.
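What the KPI widget is doing with the uploaded spreadsheet amounts to this, sketched in Python with invented rows and column names, not the demo file's actual headers:

```python
# Hypothetical rows, as if read from the uploaded sales-goals spreadsheet.
rows = [
    {"salesperson": "Alice", "quarter": "Q3", "sales": 95_000, "goal": 100_000},
    {"salesperson": "Bob",   "quarter": "Q3", "sales": 112_000, "goal": 100_000},
    {"salesperson": "Alice", "quarter": "Q2", "sales": 88_000, "goal": 90_000},
]

def quarter_kpi(rows, quarter):
    """Total sales vs. goal for one quarter: the KPI widget's base and target."""
    subset = [r for r in rows if r["quarter"] == quarter]
    base = sum(r["sales"] for r in subset)    # base value: actual sales
    target = sum(r["goal"] for r in subset)   # target value: the goal
    return base, target, base / target * 100

base, target, attainment = quarter_kpi(rows, "Q3")
print(f"Q3 sales {base:,} vs goal {target:,}")
```

The quarter filter on the dashboard corresponds to the subset step; base and target map straight onto the widget's two drop zones.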

I've done my analysis. I could do reporting off this in the reporting tool, or I have it here on my dashboard. OK, so I picked two aspects of what I was trying to present, to emphasize the point.

If you do this right and you train people right, this will really help your self-service BI implementation. All these BI tools are almost on par; they just do some things a little differently, and some are better than others at certain aspects. So let me go back to my presentation really quick here.

Finish it up.

And go full screen.

OK, and I gotta switch, sorry, I got two screens over here, and there we go.

OK, so, let me summarize everything here, as we finish up the presentation.

So, self-service BI implementations really require a tight partnership with your business partners.

I try to emphasize that you just can't install the tool, serve up some data, and expect users to be successful. You need to involve your business sponsors early in the game.

A fancy new BI tool will not mean automatic success.

But if you implement it properly, you put in a good foundation, and you do the upfront work to really analyze your business sponsors' goals and workflow processes.

If you have the technology and the business partnership together, then you'll get to that point where the implementation will be successful for you.

OK, so we have some additional resources out there, because I just touched on some things. If you want to drill deeper into the technical aspects, we have webinars on Cognos FM versus data modules.

We have a blog on those two things, data modules and new capabilities. Also search out there for that KPI webinar, which covers what I showed.

OK, now, I’ll hand it over to Mike now.


Had to unmute myself. Sorry, Pedro. Please stick around for the Q&A, guys; this is real quick, and do get those questions into the question pane. So, a couple of quick slides about Senturus. We are the authority in business intelligence. We focus exclusively on business intelligence, with a depth of knowledge across the entire BI stack.

Our clients know us for providing clarity from the chaos of complex business requirements, disparate and ever-changing data sources, and constantly moving targets. We've made a name for ourselves because of our strength in bridging the gap between IT and business.

We deliver solutions that give you access to reliable, analysis-ready data across your organization, so you can quickly and easily get answers at the point of impact, in the form of the decisions you make and the actions you take.

As Pedro is showing here, our consultants are leading experts in the field of analytics.

With years of pragmatic, real-world expertise and experience advancing the state of the art, we're so confident in our team and our methodology that we back our projects with a 100% money-back guarantee that is unique in the industry.

And we've been at this for quite a while now, coming up on two decades. We have focused exclusively on business intelligence for nearly 20 years, working across the spectrum from Fortune 500 to mid-market companies, solving business problems across virtually every industry and functional area, including the office of finance, sales and marketing, manufacturing, operations, HR, and IT.

Our team is large enough to meet all of your business analytics needs, yet small enough to provide personalized attention.

We invite you to expand your knowledge and explore the resources available to you on the Senturus website at the URL shown here. There are hundreds of resources there, ranging from webinars on all things BI to our up-to-the-minute, easily consumable blogs on what's top of mind. Again, we're solely focused on business intelligence.

We'd be remiss if we didn't bring up our complete offering around BI training. We offer training in the top three BI platforms: Cognos, Power BI, and Tableau. We're ideal, particularly, for organizations running multiple platforms or those who might be moving from one to another.

We can provide training in many different modes,

ranging from tailored group sessions to one-to-one or one-to-a-few mentoring, instructor-led courses, and even self-paced e-learning, so we can easily match the needs of your user community.

And then, finally, before we get to the Q&A, visit our website, where we have hundreds of different free resources ranging from product reviews to tech tips, and you can also see upcoming events there. We didn't list any in this webinar, I believe, but if you go over there and take a look at the events, you'll see whatever webinars and other events we have coming up on our calendar.

So go over there and make sure you bookmark that. And with that, we come to the Q&A. Pedro, I'm not sure if you had a chance to take a look at the question log at all, but the one here up front is asking about co-development processes. Co-development process, OK, that's interesting. Actually, that's a really interesting topic, because at a client we're helping with a Cognos implementation right now, and in implementations going on for years, there is this concept of co-development.

And co-development goes beyond just figuring out what reports people need. We sit with the client and show them reports that we produce, but we also show them how to create their own reports. We show them how we did it.

We have these sessions ongoing from the start of the project to the end, so the customer isn't just dropped into the Cognos portal, shown how to execute a report, and then told, by the way, you could take some training sessions on how to do modeling, how to use data modules and all that. This company, who we've helped along the way, is actually very proactive.

Their development team sits with the client, and the customer really understands the process we outlined here. It goes beyond just creating the reports for them; you show them how to do it. Show them how to fish. That's what co-development really means.

That's gonna be a cultural shift, because a lot of people don't like to do that. A lot of IT just says, here it is, we're done. It's a cultural shift in your organization to be able to get to that point.

Yeah, that's really the key to it, right? They're used to being spoon-fed things from IT that take forever to get developed.

And they just consume that, or more likely than not, they dump it out to Excel and go do all their work over there.

So there's the whole idea of teaching them how to have a conversation with their data, and there's a lot to that.

So, there's a question about Power BI: what is the equivalent of the Cognos data module?

Or, FM package?

Well, with the other major tools like Tableau and Power BI: in Power BI you have Power Query, and Power Query is the data transformation tool that allows you to query different data sources and transform that data. It's actually very powerful.

And I think it's a fair statement to say that data modules and datasets, the Cognos self-service version of that, arose from the market pressure that Tableau and Power BI put on Cognos to develop self-service tools.

So there are a lot of similarities there. The Power Query tool is very powerful. It's also a little trickier, right, because it uses its own language, M. Even though it's graphical, there's a lot you can do, but it's a little more complicated. Whereas Tableau now has relationships, basically an abstraction layer between you and what was originally a single tabular view of all your data that didn't allow for sophisticated data modeling.

Now they have an abstraction layer that allows you to relate different data sources in different ways and allow for things like multi-fact joins through their data source capabilities. But they still don't have a lot of transformation capabilities. You can do some of that with Tableau Prep.

I'm answering more than the question here, but I would argue Cognos with data modules has some limited transformation capabilities. It doesn't do a lot in the way of transformation; it's still very much the idea that the data you get should be largely ready for consumption, aside from modeling, changing some field names, and adding calculations. So hopefully that answers your question.
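The "light" shaping a data module is comfortable with, renaming fields and adding calculations rather than heavy ETL, looks roughly like this sketch (the raw field names are invented):

```python
# Invented raw field names standing in for a source table's cryptic columns.
raw = [{"SLS_AMT": 500.0, "TAX_AMT": 40.0}]

def light_transform(rows):
    """Rename fields and add a derived calculation: the kind of light,
    consumption-oriented shaping a data module is comfortable with."""
    out = []
    for r in rows:
        out.append({
            "sales_amount": r["SLS_AMT"],              # friendly rename
            "tax_amount": r["TAX_AMT"],
            "net_sales": r["SLS_AMT"] - r["TAX_AMT"],  # added calculation
        })
    return out

print(light_transform(raw))
```

Anything heavier than this, multi-step cleansing, pivoting, merging messy sources, is where Power Query or a dedicated prep tool earns its keep.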

Let’s see. What is the recommended Cognos self-service tool for creating reports not dashboards?

So, those of you legacy users who have been around for a while, from the Cognos 10 days, you know, right? You had Analysis Studio, you had Query Studio, Report Studio.

Well, it's all been compressed down now. I still call it Report Studio, but Create Report is the report writing tool in the new series of Cognos 11.

It allows you to go very narrow and also very deep and broad in terms of your technical capabilities. So it's one place to go for simple, ad hoc reports, but if you want to learn more about the tool and dig into how to create highly sophisticated, pixel-perfect reports, even manipulating the queries in the back, it's the same tool.

So it's nice from the perspective of a user not knowing which tool to go to: it's the report writing tool, formerly called Report Studio, in Cognos 11, instead of dashboards, for most users

who are creating analytical objects independent of the data side of it.

You're either working in the dashboard creation tool or you're creating a report, and that's all there is to it. It's very simple now.

Yeah. And then there's another question around sources: can cubes, either dynamic cubes or Transformer, i.e., PowerPlay cubes, be used as a source for data modules?

What's the latest? Not that I can remember right now.

No, Mike.

I know, it didn't.

Well, I think you can use them in a package, and then a package can be leveraged into a data module, right? That's kind of the method. I'm not sure if you can do it directly; I don't think you can do it directly at this point.

I know it's on their slate of things to do, because there are a lot of legacy customers out there with PowerCubes.

Yeah. Yeah.

And I think, right, Mike, you can still instantiate Analysis Studio, or have it resurface itself, in 11? Yes, you definitely can. Yeah, PowerCubes will still be around long after you and I are gone. That's right. I will also add that data modules are getting the most attention and development budget out there. Every time there's a new release, there's always something they've added to data modules.

So maybe that'll be in their next release. FM is basically being held steady; it's not deprecated, but they're not going to add any more new features to it. Data modules already have features that are above and beyond FM.

But there are still some gaps where there are certain things that FM can do, that only FM can do, like parameter maps and things like that.

But the lion’s share of development dollars has gone into data modules.

Yeah, absolutely, because that's where the movement is right now; the hot thing now is self-service analytics. So the net of it, I think, is: you can get there.

You can enable self-service analytics with Cognos or any of the major tools with Power BI or with Tableau.

Plus or minus certain features or capabilities.

The path you go down, and your relative ease with it, is a function of how you plan for it, and how you handle not just the technology but the culture shift, and managing that. That's where it gets tricky, and that's what we help our customers with.

So, unless there are any other questions, I think we covered everything here, and we’re approaching the top of the hour. So if you want to move to the last slide, Pedro.

First of all, a big thank you to you, Pedro, for a great presentation on a complicated but important topic.

And thank you to our attendees for taking an hour out of your day today to join us.

We always enjoy seeing you at our presentations. Thanks for your great questions, and we look forward to seeing you at a future knowledge series event. If you need help with any analytics needs, whether it be consulting or training, please feel free to reach out to us; if you're old school and actually pick up a phone, we have a number there, or you can always email us. Thank you very much for your time today, and again, we look forward to seeing you at another Senturus event. Thanks, and have a great rest of your day.

All right, thank you.