Cognos Data Modules New Capabilities


New Capabilities with Cognos Data Modules

September 12, 2019

Cognos, Data Prep & Modeling

Demos of Features that Boost Self-Service Analytics

If you aren’t all that familiar with the data modules capability in Cognos Analytics, you are doing yourself and your data a disservice. Allowing you to connect to and combine the world of your data, data modeling in Cognos Analytics opens the door to self-service analytics and a world of deeper insights. With software release 11.1, Cognos includes some major—dare we say huuuuge—improvements to its data modeling capabilities: new features and capability tweaks that boost organizations’ self-service content creation.

Watch this on-demand webinar to find out what’s to love about data modules. Our in-house Cognos expert, and one of our most sought-after Cognos instructors, September Clementin shares her perspectives; discusses intent-driven modeling, calculations, and data security; and demos uploading files, creating custom tables, unions, joins, and content organization.


Cognos Analytics


September Clementin
Cognos Consultant & Trainer
Senturus, Inc.

September has over 15 years of expert level experience with the Cognos suite. She has helped 50+ public and private sector clients with large scale business intelligence implementations including dashboard design, interactive performance metrics development and data modeling. She designs business intelligence course curriculum, delivers 20+ advanced BI reporting courses a year, develops BI best practice webinars and provides mentoring services for other BI professionals.


Greetings, everyone, and welcome to the latest installment of the Senturus Knowledge Series. Today, I’m pleased to be presenting to you on the new capabilities of Cognos data modules, which will feature demos of features that help boost self-service analytics in Cognos. First, a couple of housekeeping items before we jump into the main content. The GoToWebinar control panel that you see on your screen can be minimized or restored using the orange arrow. Everyone’s microphones are muted so we can hear the presenters clearly and don’t have any interruptions, but we do allow, and encourage, you to submit questions via the Questions pane.

As shown here in this slide.

One of the first questions we get, and get repeatedly throughout the presentation, is: can I get the presentation slide deck? And the answer to that is an unqualified yes. It will be available on the Resources tab, at the URL posted here, and we also put the link in the GoToWebinar control panel, so you can access it there.

Our agenda today: we’ll do some brief introductions, and then we’ll get into the content, which features data modules, including an overview.

We’ll cover the new features, and we’ll do a demo September, if you could advance the slide, please. After that, we encourage you to stick around.

We will do an overview of Senturus, as well as additional resources, and then we wrap up with Q&A.

So, today, our presenters: I am pleased to be joined by September Clementin, one of our Senior Cognos Consultants and Trainers, who has over 15 years of expert-level experience with the Cognos suite. She’s helped well over 50 public and private sector clients with large-scale BI implementations, including dashboard design, interactive performance metrics development, and data modeling. She is also pivotal and instrumental in designing our business intelligence course curricula, delivers over 20 different advanced BI reporting courses a year, develops BI best practice webinars like this one, and provides mentoring services for other BI professionals. She proclaims that one of her favorite things to do is to tackle the hard problems.

So if you’ve got a tough nut to crack in your organization, September is the one you want in your corner.

My name is Mike Winehouse. I wear a number of hats here, ranging from running our training practice to our Tableau practice, to serving in a product management role for our analytics connector, as well as being an emcee for our webinars.

As those of you who’ve been on many of our webinars can attest, we like to get a finger on the pulse of our attendees.

So, we’ve got many of you here today, and we’d like to understand what you’re using in your organization here. And the first question is, what data sources do you use in Cognos? I’m going to launch the poll and ask you all to select all that apply. Are you using packages?

Are you using data modules? And/or are you using datasets? We’ll give you all a minute to respond.

And I feel a little like an auctioneer here.

We’re at about a 70% response rate.

It looks like we have about 170 people online here, so good sizable audience.

This would be a nice dataset, about 83% of you here.

I’m going to close this out and show you the result set. So, not surprisingly, pretty much everyone is using packages, but a good almost half of you are using modules, which is a little surprising to me. And it’s great that well over a third of you, almost the same number, are using datasets, which also tells me you’re probably using a more recent version of Cognos. Great. So, we’ll try not to completely poll you to death here, but we do have a couple of other polls, and I’ll ask one now. We’re interested in finding out: are you, or anyone in your organization, planning on attending the IBM Data and AI Forum in Miami this October?

So if you could let us know that would be great. We asked the question, because we will actually be there. We encourage you to come by and visit us. We’re trying to get a finger on the pulse of how many people are actually going to be at this event. So we’ll give this just a couple more seconds, and then we’ll get into the content today.

All right, great.

You guys are quick on the draw today. So, I’m going to close this poll out and share it with you all: two thirds of you aren’t going, 13% are, and another 20% or so don’t know.

All right, fabulous, great. Thank you so much for your participation there. So, on the next slide, we’ll get into the meat of the presentation and hand the microphone over to September. September, the floor is yours.

Thank you, Mike. Thank you all for joining today’s IBM Cognos Analytics data modules webinar. While data modules have been available since the very first release of Cognos Analytics, the new features and enhancements in the most recent 11.1 releases really provide organizations with the ability to take self-service content creation to the next level.

Today, we are going to first understand what data modules are, if you’re new to them, then cover some of the new and enhanced capabilities, and we’ll see a demonstration of some of those exciting new features in action. We’re going to upload several local files, learn how to create unions and joins, and experience some of the new data module organization features, all through the web browser.

So, first off, what are data modules in Cognos Analytics? Data modules are data source objects that enable users to perform self-service analytics through the ability to combine data from multiple data sources in a user-friendly interface, empowering a simple, immediate, and shareable data integration solution.

Data modules can be based on a combination of existing data modules, database connections, and local files, including Microsoft Excel spreadsheets and text files that contain comma-, tab-, semicolon-, or pipe-separated values. They can contain datasets or subsets of data from existing packages, as well as live connections to existing relational Framework Manager packages.

Join relationships, cardinality, and join filters can also be easily managed by business users using non-technical terms and cues.

The new features include a redesign of the data module interface, which adds the ability to filter by data source type and by most recent modification timeframe.

There is also intent-driven modeling. This allows users to provide key terms, with smart type-ahead metadata-based suggestions, and combines those with the underlying source metadata to propose tables to include in the data module.

When possible, the proposal is a star or snowflake of tables; if that isn’t appropriate, then a single table, or perhaps a collection of tables, is proposed.

Another exciting enhancement is the ability to create custom tables through a variety of table operations, including creating table views, joins, unions, intersections, and exceptions, and we’re going to see a demonstration of some of these here shortly.
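For readers who think in code, these custom table operations map onto familiar set operations on tables. Here is a rough, purely illustrative sketch in Python with pandas (the DataFrames and column names are made up for this example; this is not a Cognos API):

```python
import pandas as pd

# Hypothetical regional sales extracts, standing in for uploaded Excel files.
americas = pd.DataFrame({"Sales region": ["Americas"], "Revenue": [100.0]})
europe = pd.DataFrame({"Sales region": ["Northern Europe"], "Revenue": [80.0]})

# A union stacks rows from like-structured tables
# (in Cognos terms: Create a Union of Tables).
all_regions = pd.concat([americas, europe], ignore_index=True)

# An intersection keeps only rows present in both tables; an exception keeps
# rows in the first table that are absent from the second.
intersection = americas.merge(europe, how="inner")
exception = americas.merge(europe, how="left", indicator=True)
exception = exception[exception["_merge"] == "left_only"].drop(columns="_merge")

print(len(all_regions))  # 2 rows: one per region
```

A view, by contrast, would simply be a saved selection over one of these tables without copying any rows.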

The new metadata organization and customization options are also a great enhancement. For those of you who might be familiar with Tableau or other business intelligence tools, one of the nice features of some of those tools is the ability to organize your metadata according to your own business needs. Prior to the newest Cognos Analytics releases, this was something that could only be done in Framework Manager.

But now, with data modules, you can organize your data items in folders, which not only makes for a more user-friendly data source if, for instance, you have a large number of data items that users would typically need to scroll through to find desired elements, but is also incredibly helpful in creating and maintaining a single data module that can accommodate varied reporting needs and audiences.

Properties such as usage, aggregation, geographic properties, and default formatting, can also be defined for your data items.

Another great enhancement is the calculation editor. Custom calculations can be created at the data module and now also at the custom table level. So, simple calculations can be created by just selecting the items to include and choosing the calculation type.

When you’re using the calculation editor, you can drag and drop or double-click columns in your data module tree to add them to the expression editor. And to enter a function, you can simply type the first character of the function and choose from a drop-down list of selections.

Data module row-level security is another exciting enhancement. At the moment, this feature is available when your data module sources are database tables, but security can be assigned at the user, group, or role level.

A few tips that will hopefully prove helpful when you are in your data module planning stage, or even if you’re troubleshooting when you get some sort of unexpected results.

Data granularity differences between data module sources may need to be addressed. Granularity challenges, of course, are not specific to data modules; you often have to find solutions to granularity differences when you’re crossing namespaces within a single Cognos FM package in a report.

Same thing in most BI tools. In Tableau, for instance, when you’re using joins or data blends, you might need to create table calculations, level-of-detail calculations, or some other technique to address those differences.

Within the data module, what you might try would be creating table views, altering your measure aggregation settings, or perhaps creating calculations that will address those differences. And, of course, granularity differences can often be addressed by creating unions, joins, or calculations directly in the report.
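As a concrete, hypothetical illustration of the granularity tips above, sketched in pandas rather than in Cognos itself: when one table is at the order detail grain and another is at the order grain, rolling the detail up to a common grain before combining avoids double-counting.

```python
import pandas as pd

# Hypothetical tables at different grains: order details vs. order-level targets.
order_details = pd.DataFrame({
    "Order number": [1001, 1001, 1002],
    "Revenue": [50.0, 30.0, 20.0],
})
order_targets = pd.DataFrame({"Order number": [1001, 1002], "Target": [70.0, 25.0]})

# Roll the detail rows up to the order grain before joining, so the
# order-level Target is not repeated against multiple detail rows.
order_revenue = order_details.groupby("Order number", as_index=False)["Revenue"].sum()
combined = order_revenue.merge(order_targets, on="Order number")

print(combined["Revenue"].tolist())  # [80.0, 20.0]
```

In Cognos, a table view with adjusted aggregation settings plays roughly the role of the `groupby` step here.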

Another tip: data formats must be consistent between fields used for joins, and between all columns when you are creating a union between tables. And, of course, more complex modeling requirements, like complex security requirements, parameter maps, versioning, those sorts of things, may still be best suited for Framework Manager.
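The data format tip can also be illustrated with a small, hypothetical pandas sketch: if the join key arrived as text in one file and as a number in another, the join fails or matches nothing until the types are made consistent. The same idea applies to matching column types across a union.

```python
import pandas as pd

# Hypothetical files where the same key arrived as text in one source
# and as a number in the other -- a common cause of broken join results.
sales = pd.DataFrame({"Order detail code": ["1", "2"], "Revenue": [10.0, 20.0]})
profit = pd.DataFrame({"Order detail code": [1, 2], "Gross profit": [4.0, 7.0]})

# Cast the key to a common type first so the values compare equal.
profit["Order detail code"] = profit["Order detail code"].astype(str)
joined = sales.merge(profit, on="Order detail code")

print(len(joined))  # 2
```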

Let’s go ahead and move into the live demonstration portion of today’s webinar and see some of these exciting new features in action.

So, this is our IBM Cognos Analytics homepage. Today, I will be working with the GO Sales dataset, which comes with every install of Cognos Analytics.

I’ve already created a few Excel files using the dataset, just to demonstrate uploading those local files and bringing them together in a couple of different ways in a data module.

So our use case today is that we receive sales data from our three regions, the Americas, Asia Pacific, and Europe, in separate files. It’s not uncommon that you need to create unions between datasets like this, and we need to create reports that combine data from all regions. So first off, I’m going to upload these files. In the bottom left-hand corner here, I’m clicking New and Upload files.

I’ll select my three files for Americas, Asia Pacific, and Europe, and click Open.

And as those files are being read, you can also click on Details here to see the progress. You can click View details for each of those and see what phase it’s in: discovering, analyzing, or successfully uploaded.

And you can actually expand all of these to see. You can see some of them have already been uploaded. Now, they’ve all been uploaded successfully. So I’ll go ahead and click close.

I can see these in my recent list of items here in the middle of the interface. But by default, your uploaded files will go to your my content folder. And so if I click on my content, I can just verify that those uploaded files are there under my content.

So, next, we’re going to combine these three files, in a data module to enable Reporting across our regions.

Again, bottom left-hand corner, I’m clicking on new, and selecting Data Module.

And by default, here, I’m in My content, but I could also go to Team content if my sources existed there as well.

I’m going to stay here in My content, just Ctrl+click all three of the sources, and click OK.

And so now, here on the left, I can see my three sources. If I click first just on Americas, right here in the middle of the interface, I am looking at my grid view, so I can actually see the columns that exist in that underlying Excel file.

I can confirm that I’m seeing what I think I should be, and I want to combine these into a new custom table.

So, I’m going to Ctrl+click all three of these tables, click the More tool, and select Create New Table.

Now, some of these options are grayed out; Cognos will determine, based on what you’ve selected, which are appropriate. The ones that are grayed out are creating a copy, creating a join, an intersection, or an exception. The two that are available for selection are creating a view and creating a union, and those are the two that make sense based on what we’ve selected.

And we will choose Create a Union of Tables, and click Next.

And now, I’ll just name this new table all regions.

And click Finish.

And before adding on, I just want to confirm that things look good. So I’m going to hide the three source tables here; I don’t want to confuse my end users by having these separate tables alongside All Regions, so I’m just going to choose to hide them. And then, in the upper right-hand corner, I’m clicking the Try it tool, which is just for testing the data module to make sure that things are working correctly.

And it looks very much like the report authoring mode. I have my Insertable Objects pane and my Properties pane, and I can switch between different ways of viewing the page, pretty much the same features that I see when I am in report authoring.

I’m going to expand my all regions and double click on sales region, and just confirm that I do, in fact, have all of those regions. And I do. I see Americas, Asia Pacific, and then my two sales regions that were in that Europe file as well.

I’ll also double-click Product line to add that to the list, and Revenue.

And just as I would in a standard report, I can group my columns.

I could add some summary values, maybe just to do some checking, to make sure that, that union, in fact, did happen correctly and my values are a match. I can even adjust my data format if that’s helpful.

Alright, so if I go to my Data format property, just as I would when authoring, I can choose a format type. Perhaps I want currency, and I don’t want to see any decimal places. All of these sorts of things are available when you are in Try it mode.

Notice, though, in the upper left-hand corner, I do not have a Save icon as I would in report authoring. So this Try it tool is really just for data module testing. If we wanted that Revenue data item, for instance, to already be formatted as currency with zero decimals for end users, that is something that we can build in as a default in the data module. Now, building in those defaults does not alter the underlying data; we’re not rounding or anything like that. It would just be the default way that our Revenue would be formatted in a report, and an end user could, of course, change that and include decimals, and so forth.

So I’m going to go ahead and Close out of this tab.

And let’s add on a bit. I’m going to go ahead and save what we have so far, here in the upper left-hand corner. I’ll do a Save As and save in my Content. I’ll save this as Sales Data module.

So next, we’re going to explore some of the new data module organization features. We can create folders, modify data item sort order, and set default data formats to enhance the end user reporting experience as well.

I’m going to select my All Regions table, click the More tool, and choose Create New Folder.

I’ll create a folder called Measures.

I’ll choose Quantity and Revenue, my only two measures, and just drag those right into the Measures folder.

I’ll create a second folder under All Regions.

And name this dimensions.

And within these folders, you can also create sub folders. So, within my dimensions folder, I will create another folder.

And name it Location.

And I’ll select Sales region and Country, and drag those right into the Location folder.

I’ll add another subfolder to Dimensions, for Product.

And we will choose Product Line, product type, and Product, and drag those right into the product folder.

I’ll create another subfolder, Order.

And choose my order number and order detail code data items and drag those into order.

And I’ll create one last sub folder.

And this will just be miscellaneous.

And I’ll choose Row ID, which is grayed out because by default it is hidden, and Year, and I will drag those to Miscellaneous.

Here we go.

Also, the order in which your folders are displayed can be altered. So I will drag Location here, above Order. Same thing with Product.

I’ll drag that above order, as well.

Then, the last thing that we want to do is set our default data format for our Revenue, so that end users don’t have to change it every time they drag it in. So if you click on Revenue, or really any of your data items, and click your More tool, you’ll see a lot of different options for customizing your data module. You can add filters, create calculations and custom data groupings, and hide or rename items.

I will go down to Data Format and set the default Format type to currency with zero decimals, and click OK.

And I will save, and then, again, click Try It to see what our new organization looks like from an end user perspective.

So now, if I expand All Regions, I have a really clean-looking model here, where I have two folders for my measures and dimensions. I can expand Dimensions, and I know what I’m working with for region and revenue.

So I can go right to Location and double-click my Sales region, then go right to Measures and double-click Revenue.

And notice here, as well, that my Revenue column is formatted as currency with zero decimals.

All right. So, things are looking good.

I’m going to close this Try it tab here, and let’s add on to this.

So, the granularity of our sales data is at the sales transaction level, meaning for every order, there could be one or more products ordered or detailed transactions for that order.

Our current Data module only has Quantity and Revenue data for our order detail.

Let’s say we want to be able to include gross profit data as well, but that exists in a separate Excel file, also at the same grain, the order detail code level.

We’re going to upload that new file and create a join to the new data source to provide that additional detail.

So, here at the top of my data module pane, I’m going to click Add sources and choose Add new sources, and I’ll click on my Upload tool and upload that gross profit data source.

Again, you can always check the status of the upload, as that’s moving along by clicking on your details.

So now, I... Sorry to interrupt, we got a request to just slow down the pace, just a tad. Sure, absolutely.

So now, in our data module pane here on the left-hand side, I can see that I have two separate tables. There’s Gross Profit,

And I can see the columns that exist in that Excel file.

And then underneath that, I have my all regions, which is what we’ve been working with here. So I’ve got two different tables.

Now, as in the last example, where I created a union between three tables, I could, in fact, do a similar thing here.

I could choose my Gross Profit and All Regions tables and say I want to create a new table. And notice now I have the option to create a join, because that makes sense based on what we’ve selected.

I’m going to cancel that. I don’t have to create a brand-new table, though. If I just wanted to create a join between these, and I’m fine with them existing in different tables for my end users, I can just simply create a join.

So for this demo, I’m going to switch over to the Diagram view. You can rearrange this in any layout that makes it easy to figure out what’s going on.

I’ve got my All Regions, and that includes Americas, Asia Pacific, and Europe.

Now I have this new table that I’ve uploaded which is gross profit.

So now, instead of creating a new custom table, in this case I’m going to just create a join. So I’m selecting All Regions, then right-clicking and choosing Create relationship.

Now I have the right and left sides of this join definition.

Here on the left, table one is my All Regions table; on the right, table two is my Gross Profit table.

So I know that what I want to join on, my lowest grain, is order detail code. I can select that here under Gross Profit, and I get a sample of the data that’s inside of it, so I can confirm that it is, in fact, what I want to join on.

And then, under all regions, I’m searching for that same data item. So I’ll expand my dimensions.

Expand order, and, again, select order detail code.

Now, they don’t have to have the same name for this join, but we just need to make sure that, again, following our tips, that our data types are consistent for what we want to join on, and that it is data that does, in fact, go together.

On the left-hand side, again, we’ll see a sample of that data so that we can confirm.

Then we’ll select Match Selected Columns.

If you scroll down a bit here, in the bottom right-hand corner, we can see that we have one matched column.

If we wanted to make a change, we could always remove that match and choose something else. You can also have multiple matches.

So if that’s necessary to get to uniqueness, you can have multiple join clauses. In the bottom left-hand corner, you also have options for relationship type, cardinality, and optimization.

I’m going to change this to a 1 to 1.

In our sample dataset, for every order detail code record, we have a matching gross profit record as well.

And click OK.
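The 1-to-1 cardinality chosen here can be thought of as an assertion about the data. In pandas terms (again a hypothetical sketch, not Cognos), `merge` can enforce the same assumption explicitly:

```python
import pandas as pd

# Hypothetical stand-ins for the All Regions and Gross Profit tables,
# keyed by order detail code at the same grain.
all_regions = pd.DataFrame({"Order detail code": [10, 11], "Revenue": [100.0, 50.0]})
gross_profit = pd.DataFrame({"Order detail code": [10, 11], "Gross profit": [40.0, 15.0]})

# validate="one_to_one" makes the 1-to-1 assumption explicit: pandas raises
# MergeError if either side contains duplicate keys.
joined = all_regions.merge(gross_profit, on="Order detail code", validate="one_to_one")

print(list(joined.columns))
```

If the gross profit file ever arrived with duplicate detail codes, the check would fail loudly instead of silently multiplying rows, which is exactly the kind of unexpected result the earlier tips warn about.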

So we’ve created our join. If we want to review that join, or just figure out what’s going on with it, we can always hover over or select the join object, and we’ll get a description of what that join actually is.

So now, just a bit of cleanup for our data module. I’m going to expand Gross Profit. Row ID is already hidden. I want to also hide my Order detail code, because we already have Order detail code in All Regions, so I will just hide this instance.

For Gross Profit, I’ll go ahead and set the default data format as well, as we did with Revenue, so that when users bring it in, it is also formatted by default as currency with zero decimals.

Then, finally, you can also rearrange the order in which the tables appear.

So I will drag gross profit below my all regions.

And let’s go ahead and save this.

And now, for testing this time, instead of going to try it, we’re going to go ahead and just build out a report with this.

I’m going to create a new report.

I’ll choose the built-in one column template.

So now, again, we’re in report authoring mode. It looks very similar to what we saw when we were trying out our data module, except that you will also see some other items, like, for instance, in the upper left-hand corner, where we can save this report.

So now, on the left-hand side, under Insertable Objects, I’m clicking here to add a data source.

And from My content, I will select the Sales Data Module and click Open.

And I will click the Add button here in the middle of the interface and choose Visualization. I’m going to go with one of our chart objects that will allow me to really showcase all of the different measures that we have in our data module coming from different sources.

So, I will choose charts, and I’ll start this off with just a simple clustered column.

And, I’m going to go ahead and make this a bit bigger, right off the bat. I will change my size and overflow properties to make this 1200 by 600.

I’ll expand that All Regions table, and again, I can see my organization here of dimensions and measures.

Expanding dimensions and location. I will drag in sales region. I want that here on the X axis.

And I will nest below that product line.

And you may get some warnings here about the layout; we’re OK with that in page design.

I will expand my measures and drag in revenue into this series.

Then I’ll also add a combination here.

I’ll scroll up in the Properties pane to the Combinations property and add a secondary axis.

To my primary axis here, to really mix things up and show that these different data sources are working together,

I’m actually going to bring Gross Profit in as a peer of Revenue, as side-by-side bars.

Then I’ll actually select the secondary axis.

And let’s make this, instead, a line chart and drag over Quantity.

And let’s go ahead and run this and see what we have; I’ll run this in HTML.

So we can see we’re able to create a visualization with data from essentially four different data sources. The first three are the result of a union. So down here on our X axis, we have all three of our data sources with region data: Americas, Asia Pacific, and our Europe data.

And we have our product lines nested within that.

Then, we have side-by-side bars of revenue, again, revenue, the result of our union, and then gross profit, the result of the join, and even quantity here on a secondary axis.

So, again, so much exciting new functionality added to data modules. This is just a sample of some favorite features that clients have been requesting, but if you would like to learn more about all the new features, they are covered in detail in our new Data Modules Fundamentals Course here, as well.

And, with that, I’m going to hand things back over here to Mike.

Good stuff, September. Thank you very much. We’re going to ask you one more poll question here. We encourage everybody to please stick around; we do all the Q&A at the end. Submit your questions in the control panel, and we will get to those at the end, with the caveat that if we don’t have enough time to answer all the questions, or if we have to do research on something, we will fill out the question log and post it along with the recording and the slide deck, so you can watch or refer to it later on. So, the last poll we have is: what version of Cognos are you currently running? I’m going to launch that poll and give you all a second to answer. Your choices are a version of 8, 10.x, 11.0, or, if you’ve upgraded, one of the 11.1 versions. Please respond to that.

OK, we’re up about 75% here.

We’ll give it a couple more seconds for everyone to respond.

OK, and I’ll share. So, happy to see you all are on a current version: 11.1-something. We’re hitting close to 85% at least on version 11, and the rest are lagging behind on version 10. Thank you. That’s very helpful information.

All right, so coming back to the slides, I’ve got a couple of offers here for you. Then we’ll do, again, a quick intro on Senturus, and we’ll get to the Q&A, so please stick around for that. The next slide, September.

September alluded to this: if you like what you heard here, we invite you to visit us, where we have a new offering, Self-Service Data Modeling with Data Modules, which will generally be run by September or one of our other professional trainers. You can register for that over there.

That’s a one-day class. And if you need more specific assistance with your Cognos environment, we offer several different options here, ranging from a quick start upgrade, which will help you with the installation and configuration of a dev environment and document that for you, to a full-service upgrade implementation, which will do all of that across your appropriate environments, in other words, QA and production. Or, if you need more specific mentoring or tailored expertise, we have our BI Experts on Demand, which will give you access to our entire team of BI experts who can fill in as needed. You can reach us at the e-mail below; I put our contact information at the end of the deck as well, so you can always reach us. A couple of quick slides in terms of who we are at Senturus. September, if you want to jump ahead one more slide. Our clients know us for providing clarity from the chaos of complex business requirements, myriad disparate data sources, and constantly moving targets. We’ve made a name for ourselves because of our strength at bridging the gap between data and decision makers.

We deliver solutions that give you access to reliable, analysis-ready data across your organization, so you can quickly and easily find answers at the point of impact, in the form of the decisions you make and the actions you take. Our consultants are leading experts in the field of analytics, with years of pragmatic, real-world expertise like September’s,

and experience advancing the state of the art. We’re so confident in our team and the Senturus methodology that we back our projects with a unique 100% money-back guarantee.

We’ve been at this for a while: we have over 1500 clients across 2500 different projects, over nearly two decades at this point. Our projects range from Fortune 500 companies down to the mid-market, and across every line of business, from sales to finance to HR and across the organization. So, if you have an analytics project, we hope that you’ll consider leveraging our expertise on your next project. We also have some great additional resources that we’d like to bring to your attention. First of all, we’ve got a couple of great upcoming events.

The first of those is our very successful comparison of Power BI, Tableau, and Cognos, where we demonstrate and compare data loading, data prep, and the building of dashboard visualizations across those three major platforms. We’re reprising and enhancing that for Thursday, September 26th.

You can register for that, as well as other events, on our website. We’re also doing an exciting new webinar on enterprise security in Tableau versus Power BI, where we’ll demo and compare approaches to key security concerns. Again, you can sign up for that on our website, where we also have a ton of free resources.

There, in addition to the aforementioned upcoming events, you can access our full resource library, which has all of our past webinars: recordings, slide decks and question logs on all kinds of topics, ranging from tips and tricks to BI best practices, et cetera.

And our blog, which contains nice bite-sized morsels from various presenters within Senturus, talking about what’s top of mind and the latest and greatest in the industry.

I’d be remiss if I didn’t mention our extensive training offerings. We provide training in the three top tools: Cognos, Tableau and Power BI. Our ability to be conversant across multiple BI platforms makes us ideal for organizations that have embraced bimodal BI and are running multiple platforms, or those, perhaps, moving from one to another. We offer training in a number of different modalities.

We offer corporate training, self-paced learning, mentoring, and instructor led online courses to help meet whatever your training needs are.

So, thank you for your attention to those. And we will now flip over to the next slide and jump over to the Q&A.

September, I imagine you’ve had a few minutes to look at these, and hopefully flagging those helped out a little bit. Did you have a question that you wanted to lead with, or do you want me to tee one up for you?

I’ve got one that I just saw. There was a question on the number of rows for those Excel uploads.

So, the default size limits for individual users are 100 megabytes for an individual file, and then there’s a 500 megabyte total. Those are all modifiable, so your sys admin can go in and raise those if that’s needed, as well.

So, it’s really going to be organization-specific which groups or folks can upload what file sizes.

Got it. That’s helpful. And then, are you aware of which Cognos capabilities need to be granted for users to be able to use data modules? I know what licensing level they need, but I think there’s a difference between that and the capability.

And so, I’m not sure.

So, yeah, I would say it depends on whether your end users are trained to understand things like joins, and it also depends on what you’re connecting to. If it’s just local files, that’s probably a little bit easier. But you can also connect directly to databases. So if you’ve got end users that are touching databases and then joining those to other things, then quite a bit more training might need to go in to make sure that they understand what those relationships are and what the impact would be.

Right, I understand.

You know, one of the things that might compel organizations to leverage data modules, as was brought up, is that to use Framework Manager they effectively need to be a Cognos Analytics Administrator, which is a much more expensive license than the Cognos Analytics User or the Cognos Analytics Explorer, which, from a licensing perspective, allow you to use data modules and datasets. We had another question about formatting columns.

So, are you able to do things like formatting? This person asked about formatting a column of revenue in dollars. And can you format a date column into other permutations, to enable you to use it for grouping purposes in a report, for example?

Yes, at the least you can format dates; you have a lot of formatting options. And if for some reason you have a really specific formatting need that’s not covered, you can also create your own custom calculations, maybe do some sort of concatenation, and create whatever it is your end users are used to seeing for grouping, if that’s needed as well.
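As an illustrative sketch of the kind of custom grouping calculation she mentions, here is the same idea in plain Python (Cognos expression syntax would differ; the fiscal-quarter label is just a hypothetical example):

```python
from datetime import date

# Build a grouping label from a date by concatenating its parts,
# analogous to a custom calculation added in a data module.
def fiscal_label(d: date) -> str:
    quarter = (d.month - 1) // 3 + 1
    return f"{d.year}-Q{quarter}"

print(fiscal_label(date(2019, 9, 12)))  # 2019-Q3
```

End users can then group or filter on the derived label instead of the raw date column.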

Good. Are you able to create a package from a data module, or leverage an existing package?

Yes, you can leverage existing packages in your data modules; you can connect directly to the Framework Manager package.

And then also join that to some other source as well. You could connect a package and Excel together, or it could just be Framework Manager packages. And then you also have the option, from a Framework Manager package, to create a dataset, some subset of data from that package, and include that in your data module instead, or in addition.

OK, yeah, that’s great. And that sort of speaks to the next question there. This person is asking about joining data sources in a table in data modules.

I think they’re sort of asking if you can join multiple data modules or datasets, or maybe join those to packages. Can you speak to that a little bit?

I would say it depends, because depending on the sources that you’re joining to, you may or may not have control over the data types. And that was one of the gotchas that we talked about earlier, right? You have to make sure that those data types are consistent if it’s the key for a join. If it’s not, if it’s just some other column, then you should be fine.

But if it’s a key for a join, or if you are doing a union, that’s when you have to make sure that those data types are consistent.
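The same gotcha shows up in any join tool, not just Cognos. A minimal pandas sketch (illustrative only; the table and column names are made up) of casting a key to a consistent type before joining:

```python
import pandas as pd

# Two sources where the join key arrives with different data types:
# orders stores customer_id as an integer, crm stores it as a string.
orders = pd.DataFrame({"customer_id": [1, 2, 3], "revenue": [100, 250, 75]})
crm = pd.DataFrame({"customer_id": ["1", "2", "4"], "region": ["West", "East", "South"]})

# Cast the key to one consistent type before joining; without this,
# the merge fails (or, in some tools, silently matches nothing).
crm["customer_id"] = crm["customer_id"].astype(int)

joined = orders.merge(crm, on="customer_id", how="inner")
print(joined)  # rows for customer_id 1 and 2 only
```

Non-key columns can keep whatever types they arrive with; only join and union columns need this treatment.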

Yeah, that makes sense. And then, can you speak at all to how Cognos manages this? Because I think what this person is getting at is, are you pushing processing to the Cognos server?

And I imagine the answer probably is, it depends, right? Because it’s going to try to push processing down to the database where possible.

But when you start, maybe, combining things, it has to pull things in and process locally. Maybe you could talk about that.

Yeah, that’s exactly right. Where possible, that’s what Cognos will do, but there may be use cases where processing will be local. Absolutely.

And then also, depending on the calculations that you use, whether in the module or the report, there can be local processing that happens as a result of that.

Can you speak at all about best practices in terms of aggregations? This person asked about, for example, calculating percentages so that your aggregations work properly, and then further asks if you’re able to move the calculations into different folders. So, can you do some of that organization?

So, again, I will say, it depends.

If you’re creating a calculation, something that is a percent or something like that, that often works great when you’re in some sort of row-level list. But if you want to create group aggregates, then Cognos may try to total those percentages if it’s not clear what to do. So, depending on how you’re planning to use that in a report, you might choose to have some percentages, but also have the actual items that serve as the numerator and denominator available, so that if the end users need to report on it in a slightly different way, or create summaries that are a bit different, they can.

Or if you’re working in a crosstab, and I’m kind of getting into the reporting piece of this, but if you’re in a crosstab and you need to modify the solve order so that the division happens at a certain time, then you could get into trouble if you only have the calculation there, instead of having the numerator and denominator available for your end user.
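To make the numerator/denominator point concrete, a small sketch (plain Python, not Cognos; the return-rate figures are invented) showing that rolling up precomputed percentages gives a different answer than dividing the rolled-up parts:

```python
# Row-level data: each row has units sold and units returned.
rows = [
    {"sold": 10, "returned": 5},   # 50% return rate
    {"sold": 90, "returned": 9},   # 10% return rate
]

# Wrong: aggregate the precomputed percentages, which is roughly what
# happens when a model exposes only the percentage with a default aggregate.
avg_of_pcts = sum(r["returned"] / r["sold"] for r in rows) / len(rows)

# Right: keep numerator and denominator, and divide after rolling up.
total_pct = sum(r["returned"] for r in rows) / sum(r["sold"] for r in rows)

print(round(avg_of_pcts, 3))  # 0.3
print(round(total_pct, 3))    # 0.14
```

Keeping both parts in the module lets report authors choose the correct rollup for each context.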

Hmm, that sounds a little like the problem that Tableau table calculations tackle. Aggregations that depend on how you roll things up are always a tricky thing to do in your data.

We had a question… So, I will say, one last thought about that:

If you have different grains of data that you’re using in your data module, you may need to take the aggregation off. So for revenue, or forecast, or whatever the case is, you would typically leave it at Total. But if your forecast is at a different grain than your sales, then, depending on your use cases, you may need to set the aggregation for forecast to None instead of Total, so that things roll up properly.
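As a hedged illustration of the grain problem (in pandas, not Cognos; the numbers are made up), a monthly forecast joined to daily sales repeats on every daily row, so a Total aggregate over-counts it:

```python
import pandas as pd

# Daily sales joined to a monthly forecast: the single forecast value
# repeats on every daily row after the join.
sales = pd.DataFrame({"month": ["Jan", "Jan", "Jan"], "sales": [10, 20, 30]})
forecast = pd.DataFrame({"month": ["Jan"], "forecast": [70]})

merged = sales.merge(forecast, on="month")

# Summing the repeated forecast (a Total aggregate) triple-counts it:
print(merged["forecast"].sum())    # 210, not 70

# Taking the single monthly value (analogous to turning the
# aggregation off) gives the intended number:
print(merged["forecast"].iloc[0])  # 70
```

This is why mixed-grain measures often need their aggregate property changed from the default.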

Got it.

And then, someone asked if there are improvements to the dataset functionality in the latest Cognos version. And I think, based upon what we’ve researched and have in our environment, the functionality in datasets is largely unchanged as of 11.1.3.

Aside from, I’m sure, some bug fixes and stability improvements, they haven’t really changed the underlying functionality to any significant degree between what you just demoed and what’s out there right now.

Right, that’s right. And I think we could say the same thing for what’s coming in 11.1.4, from what we’ve seen of the roadmap information that’s out there.

Let me see here. Sorry, I’m looking through some of the questions, looking for another one.

This one’s around parameters and dashboards; I’m not sure how relevant that is here. Do you know if, with Planning Analytics, you’ll be able to use data modules on those cubes? I know that’s a little outside this, but do you have any ideas on that?

I’m not sure about that yet.

We can always research it and get back to you, though; I wasn’t sure either. And then, this is a big question, and it depends how far down the rabbit hole you want to go on this one, but this gentleman asked for suggestions on best practices for organizing data sources in Cognos Analytics, and on best practices for data warehouse organization for BI usage. That’s a really big topic.

I mean, in terms of organizing the data sources, I see a lot of organizations have a packages folder, and it’s organized by, say, line of business or organization, and that will vary by organization. I don’t know if you have anything you’d add to that.

I would definitely agree with that. As far as what I’ve seen in the field, it’s largely based on what your end users are used to seeing. So, depending on the organization, they might be fine with seeing several different tables, and knowing, hey, I’m in finance, I’m going to go to the finance table, and I’m not going to cross over into the procurement table. Whereas in other places that might be a little bit too much, and it might be better to have those separated out. So, really, I would say it’s organization-specific.

Yeah, and I think that’s the million-dollar question, right? That’s where Scott here gets into what’s the best practice for your data warehouse organization for BI usage. And really, what we’ve always espoused is the idea of the Kimball structure, with conformed dimensions, so that you’re able to traverse different subject areas leveraging common dimensions. And then you get into things like hierarchy management and slowly changing dimensions. Those are the kinds of big questions that we help our customers with all the time.

So, the high-level answer on best practice for data warehouse organization would be: tackle it from a conformed-dimensions, bus-matrix approach. But that takes a lot of analysis, and it’s important to get it right, so that you’re getting the right answers for your organization, the numbers align, and people adopt the solution. Those are the kinds of meaty questions we tackle all the time.

There are a couple of questions around being able to connect to Excel data sources and automatically or programmatically refresh them. So, what do you know about attaching to Excel files? I think the idea is that this Excel file they’ve dropped in for, say, a dataset is dynamic and gets refreshed. How do we handle that in Cognos, in datasets or modules?

Again, the default way is a manual approach to refreshing.

So, if you have a lot of Excel files, there may be a better way to do that than just uploading those files. But if you right-click, there is a manual refresh. There may be some new features in 11.1 that are addressing that, but I haven’t seen them yet.

Got it. And there’s a question here about OLAP packages being used as a source. And I think what we heard is that that’s not a capability that’s there right now, but it is on the roadmap, although they didn’t give us any sort of timing on it.


So what happens with that is, if you actually connect to a dimensional source, it won’t function as it would in a report, giving drill-up, drill-down capability, and you may encounter some error messages if the report is not really simple. So what you’re typically going to want to do, since you can still use the data, is create a dataset from the dimensional source, which essentially creates relational data, right? And then use that to feed your data module. So again, you get the data, but you don’t have the drill-up, drill-down functionality that you would have if you connected directly to that source in a report.

Got it, thanks. And then there’s a good question about how you do version control and migration to production, so sort of lifecycle management.

I know, again, that’s kind of a big topic, but what capabilities are there? Is that kind of less governed?

Yeah, I’d say I want to see what’s available and new to figure out what’s possible there. But that’s more when you’re getting into governance. For now, I would say Framework Manager is more capable of handling that, but down the road that could change, or maybe is changing.

Got it.

Yeah, and I think that sort of leads to one of the other questions that I saw here. You guys are asking a lot of great questions, by the way. Someone asked what our opinion is on the future of Framework Manager. You want to take a shot at that one?

Well, TBD is the true answer. But I would say, from what I’ve seen working with data modules now, there are so many great capabilities there. But there is still quite a lot of functionality in Framework Manager that is specific to Framework Manager and that doesn’t exist in data modules.

So, it would depend largely on what you’re doing in Framework Manager, as far as creating those packages, just because there is so much functionality that is not yet part of data modules.

So, as it progresses, perhaps there could be more of a 50/50 split or something like that. But with over 20 years behind Framework Manager, I don’t see data modules replacing it very soon.

Yeah, I think that’s spot on. And, of course, this is just our opinion, right?

But I think it’s informed by a lot of interactions with customers. And, you know, I think we do see Cognos responding to what’s out in the market, in terms of Tableau and Power BI being able to pull in flat files. I was with Cognos for a decade, between 2002 and 2012, and customers were asking about bringing in Excel spreadsheets even back then.

So, I think that’s always been something that users want to do, and they want to expand self-service in organizations.

And that, by necessity, entails being able to do a lot of this without fat clients, or without having to use something like Framework Manager. But she’s absolutely right that it’s a 20-year-old tool and it’s very robust, so I don’t see it going anywhere anytime soon. But the use cases covered by data modules and datasets will rapidly expand, I think, and you’ll see more of a Venn diagram with a lot more overlap as they start to enhance that functionality.

Let’s see, anything else you see up there?


Yep, I think those are the key questions that we have here, and we are nearing the top of the hour, so I think we will wrap up.

Well, like I said, we’ll go through the question log, and any questions we didn’t answer, we will seek to answer in the question log and post it along with the deck. The video usually takes a week or two; we have to do a little bit of editing before we put it up on the website, but it will definitely be up there. So, with that, I want to move to the last slide and thank September for an excellent presentation on data modules; that was very helpful. We encourage you all to reach out to us with any questions you might have, any analytics needs, or training needs. You can always reach us at our website, at [email protected], or if you want to pick up the phone, we have a number there. We also encourage you to connect with us on social media via LinkedIn, SlideShare, YouTube, Twitter and Facebook. So, we thank you for joining us today. Appreciate your time and attention, and look forward to seeing you soon on another Senturus webinar. Thanks, everyone. Have a great rest of your day.