Our Migration Assistant programmatically decodes and inventories Cognos content, radically streamlining Cognos migrations and consolidations. Quickly cutting through 1000s of reports and packages, it surfaces needed details about lineage, relationships, usage, filters and parameters.
In this on-demand video, see how the Migration Assistant gives you a serious leg up for:
- Scoping data source changes
- Migrating analytics platforms
- Optimizing your Cognos footprint
Learn how the Migration Assistant eliminates the daunting, time-consuming manual audit process by swiftly organizing your complex, multi-layered Cognos content. See for yourself how you can dramatically accelerate your migration timeline.
In addition to BI consulting, Michael’s team created the Senturus Analytics Connector, which lets Tableau and Power BI use Cognos as a data source. Michael has been designing, delivering and selling analytics solutions for over 20 years. He comes to us from Oracle, IBM and SAP, where he spent over 20 years in different roles acquiring a wealth of hands-on, practical BI and Big Data experience.
Greetings and welcome to the latest installment of our Senturus Knowledge Series.
I’m pleased to be presenting to you on the topic of how you can radically streamline your Cognos migrations and consolidations.
We always get one question early and often throughout the presentation, and it’s: Can I get a copy of the presentation? The answer is an unqualified yes. To obtain the presentation, you can either go to the link shown in the presentation here, or you can follow the link in the chat.
And while you’re there, make sure you bookmark that, as the resource library has hundreds of great resources, including webinars like this, with recordings and their accompanying presentations.
Today’s agenda, after some brief introductions, we’re going to do an overview of the Senturus migration assistant for Cognos.
After that, we’re going to go through three different use cases, specifically for the migration assistant, and then we’ll do a live demo.
Stick around after that for some great additional free resources, and to learn a little bit about Senturus, for those of you who may not be familiar with us. After that, we’ll get into the Q&A. We encourage you to enter questions through the control panel, in the questions section. We mute the microphones out of consideration for our presenters. We’re generally able to answer all the questions during the webinar and get to those at the end.
Anything we’re not able to answer, we’ll post in a written response document along with the recording and the deck on the Senturus website.
So, introductions today.
I’m pleased to be joined by two of my esteemed colleagues, Todd Schuman and Song Gao. Todd has over 19 years of business analytics experience across multiple industries.
He also regularly connects with the IBM product development team and has been at the forefront of the Cognos Analytics upgrade and installation process since it debuted.
He lives with his wife and two daughters outside of Washington, DC in Virginia.
And, Song Gao is our lead developer and responsible for not only developing the Migration Assistant but also the Senturus Analytics Connector which allows Cognos data to be leveraged in Tableau and Power BI data visualizations.
Song will be present here answering questions in the chat, and he hails from New Jersey.
I’m holding up the West Coast, third leg of the team here. My name is Mike Weinhauer.
I’m a director here at Senturus, and among the many things I do, I run a lot of these webinars and today, I’ll be providing the core content.
As usual, we like to get our finger on the pulse of our audience and I’m going to launch a couple of polls here.
The first one being, What are your goals for Cognos in your organization?
And this is multiple choice, choose all that apply. Are you interested in changing your data sources?
So maybe you have a new data warehouse, or you’re moving to different databases, whatever that might be. Are you moving to the cloud? Or are you migrating to an entirely new BI platform, either partially or entirely?
So maybe you’re adopting Tableau or Power BI, and you’re going to coexist or you’re moving wholesale that would qualify any one of those.
Are you just interested in cleaning up your Cognos environment, optimizing it, and reducing your cost of ownership? I’ll close it out there and share the results back.
So you should be able to see those, hopefully.
I’m not seeing it on my screen. But, if somebody can let me know.
It looks like about a third are changing data sources.
A third are moving to the cloud, about half, 52%, are migrating to a new platform, and 58%, almost two thirds, are looking to clean up their environment, with about 20% having some other reason.
All right, So, we’re going to move to the second poll here, which is, how many Cognos reports does your organization have?
Do you have a small number of reports, 0 to 500? Between 501 and 1,000? Between 1,001 and 10,000? Over 10,000? Or do you not know?
Hopefully you guys can see this, but I’ll talk through it anyway. About 10% are between 0 and 500, a quarter of you, 23%, are between 501 and 1,000, and a big chunk, 54%, are between 1,001 and 10,000.
While only about 10%, 9% of you have over 10,000 reports.
So that’s probably not the distribution I would expect.
Well, thank you, as usual, for providing those insights.
So, with that, let’s get into the core of the presentation.
So, the migration challenge, right?
That’s almost a redundant term, in that migrations are inherently very challenging. The good news is that with BI, if it’s done well, the complexity has been hidden from you.
So if we look through the various phases of a BI project or a BI implementation, you go from connect, to prepare, to combine and store, to model, to visualize, and then share out.
This is where you take your source data and move through all these various phases, creating content and ultimately getting it to the end user.
If you’ve done it right, it looks pretty easy. And I’m sure all of you folks on the line here who work in the trenches have had someone hand you an Excel spreadsheet and say, well, this is pretty easy, you should be able to do that, right? And you immediately close your eyes, or roll your eyes, and realize that, with all the source systems back there, everything you’d have to do to get that into that pretty Excel-spreadsheet-looking visualization is a non-trivial task.
So, you have cryptic source systems, you have huge data warehouses with tens, hundreds, thousands of tables, very complex transformations and logic, elaborate data models, and then complex reports and dashboards. So there’s complexity at every stage.
The problem with that, and the bad news is that it’s still there.
And it only surfaces when you need to move it, right?
You have tens to hundreds of packages, tons of data sources, and thousands or, as we saw, tens of thousands of reports or more. You have multi-layered, duplicate, and overlapping reports and models that have just grown over the years, nobody really understands them today, and they’re cumbersome.
So, over eight years of use, the environment becomes complex and messy, and it’s very tricky, right?
So, the end result is that the lift and shift approach, right, just taking what you have and moving it, it’s never really a good idea.
So when you move from one apartment to another, from one home to another, do you just pack everything up into the U-Haul and take it all with you? Maybe? But I really doubt you do.
You have a garage sale, you donate a bunch of things to Goodwill, you throw things away. You go through all your stuff and clean it up, right?
You don’t want to be these guys here, trying to move a little house worth of plywood, or an entire house of drywall, on the front end of a forklift.
It’s not a good idea, and it’s a recipe for failure, or at a minimum, some unhappy users on the other side and a suboptimal system.
So the alternative to that is, the Senturus migration process.
And so migrations are nothing new, right?
As long as there’s been software or BI software, there have been migrations, and we’ve been doing them for the entirety of our 20 plus years in existence.
And we’ve developed an effective methodology to approach migrations and make them successful.
In fact, if you visit the Resources page on Senturus.com, you can find other great webinars that go into more detail on our approach specifically.
Now, I’m not going to go into these steps in particular or delve into those.
Again, you can go see those webinars if you want to, but we’re going to focus, really, on the first three steps here, the Assess, Roadmap, and optimize steps that are helped by the Migration Assistant, that set you up for success.
Again, this is a brief summary of what those steps involve. A lot of those upfront steps are more business process re-engineering steps, many of which can be automated using the Migration Assistant.
And that involves things like elimination, where you’re ignoring or removing unused or broken items, identifying and removing duplicates, and planning and prioritizing high-priority or heavily impacted items.
Then you’re looking to simplify, streamline and consolidate, that, identify similar formats of reports, similar or identical calculations, similar filters, and then generalize, in other words, maybe provide a prompted report that will substitute out and eliminate 10 independent reports that are only subtly different from each other.
Then the fourth step involves automating, avoiding manual work and errors, and the fifth step involves delegation. So you concoct, or come up with, report and dashboard recipes, which reduce the need for unicorns.
Now, the unicorns are these guys that understand both the platform that you’re on and the form you’re going to.
And that’s tricky, right?
You have very different architectures, different nomenclatures and approaches.
And it’s important that you have someone that understands both of those systems, but you need to use those people sparingly, because they’re rare, like unicorns, and they’re hard to find.
So you want to be able to delegate that, and let people who aren’t those unicorns operate efficiently and effectively.
A big thing we see is compression factors of 10 to 20x.
So for example, we had one customer that looked at 5000 reports and they were able to reduce those down to 500.
So, those of you who have looked at this stuff or, again, work in the trenches, I’m sure you’re nodding your head right now: there’s a lot of redundancy or similarity in the environment that can be streamlined, and that’s something that the Migration Assistant can help you with.
So, um, you have to unravel the data. This is a big deal.
You have to go into every single one of the reports individually to manually view lineage and filters and parameters. And, a ton of those reports are often sitting in the My folders, which are hidden from view for most people, right?
They’re there for the individual user, and that makes migration extremely difficult and time consuming, without using something like the Migration assistant.
You don’t have a consolidated view of what data sources and models are still needed, and, which ones are used to help you, again, with that sort of prioritization and, do I even bother bringing those over?
And the Cognos audit data is limited, right? It’s limited in terms of what they give you out of the box, it’s complex to navigate, and it’s hard to tell, for example, which reports are never run. You can tell which reports are run infrequently, but it doesn’t show you which reports are never run.
So these are sort of universal challenges that we saw out in the landscape and with our clients, and they inspired us to create the Migration Assistant.
So, what does it do?
Effectively, it programmatically decodes and inventories all those Cognos reports and packages into a database, which eliminates thousands of manual clicks required to uncover all the metadata, data fields, filters, calculations, etc., and identifies important details such as your data source lineage, relationships, and usage. So this sets you up, again, for success.
How does it work? Well, it pulls the Cognos metadata and breaks it down into smaller pieces. It takes report metadata, such as the pages, the components, the columns within them, the queries, the filters, all those pieces, breaks those down, and pumps them into the database. It does the same thing for the data model, including the namespaces, the query subjects, the query items, and the relationships. And it pulls in the data lineage between the reports, the models, and the database. It also extracts the report execution history and stores all of that in a SQL Server database.
Then it allows you to analyze that Cognos content, either using ad hoc queries or some included prebuilt reports, which we’ll show you in just a little bit, that you can use out of the box.
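To make that “ad hoc queries” idea concrete, here is a minimal sketch in Python against an in-memory SQLite database. The table names, columns, and sample rows are all hypothetical stand-ins; the actual Migration Assistant writes to SQL Server, and its real schema may differ.

```python
import sqlite3

# Hypothetical, simplified version of the inventory the tool builds:
# reports, their queries, and the query items (with expressions).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE reports     (report_id INTEGER PRIMARY KEY, name TEXT, path TEXT);
CREATE TABLE queries     (query_id  INTEGER PRIMARY KEY, report_id INTEGER, name TEXT);
CREATE TABLE query_items (item_id   INTEGER PRIMARY KEY, query_id  INTEGER,
                          name TEXT, expression TEXT);
""")
conn.executemany("INSERT INTO reports VALUES (?, ?, ?)", [
    (1, "Sales Summary",  "Team Content/Sales"),
    (2, "Returns Detail", "My Folders/jdoe"),
])
conn.executemany("INSERT INTO queries VALUES (?, ?, ?)", [
    (10, 1, "Q_Sales"),
    (11, 2, "Q_Returns"),
])
conn.executemany("INSERT INTO query_items VALUES (?, ?, ?, ?)", [
    (100, 10, "Revenue",     "[Sales].[Qty] * [Sales].[Unit Price]"),
    (101, 10, "Order Year",  "[Sales].[Order Year]"),
    (102, 11, "Return Rate", "[Returns].[Count] / [Sales].[Qty]"),
])

# Ad hoc question: which reports contain calculated expressions
# (anything beyond a plain column reference)?
calculated_reports = [name for (name,) in conn.execute("""
    SELECT DISTINCT r.name
    FROM reports r
    JOIN queries q      ON q.report_id = r.report_id
    JOIN query_items qi ON qi.query_id = q.query_id
    WHERE qi.expression LIKE '%*%' OR qi.expression LIKE '%/%'
    ORDER BY r.name
""")]
print(calculated_reports)
```

The same style of query works for filters, data sources, or usage, once everything is sitting in one relational store instead of buried in individual report specifications.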
It’s worth noting that the more well designed and complex the model in Cognos is, the trickier this gets. Framework Manager is a very robust tool, and best practices in data modeling suggest that you build a database layer, then a business layer, then a presentation layer, right? So you’ve automatically got three layers.
Other tools that you might consider migrating to may not support any level of multi-layer modeling, or they may only support one. For example, Tableau has logical tables, but that really only gives you one layer of abstraction. So the more well built your model, and the more complex it is, ironically, the harder it often is to migrate. So having this information at your fingertips is really helpful.
So now, let’s talk about three common scenarios that we see the migration assistant being useful for.
The first one is data source changes, right?
So oftentimes, we’re seeing a lot of movement right now of data sources to the cloud, right?
You’ve got a legacy database sitting in the corner that’s tying up a lot of hardware and power, and you’re patching the OS and stuff like that.
You decide you’re going to move to Snowflake, or at least put that thing up on AWS; that’s a big one.
Or maybe you have a new data warehouse, right? Again, oftentimes on a new system.
So the migration assistant here can help you identify the reports and the models that rely on that system, and it combines both that inventory, as well as usage statistics to help you prioritize and streamline.
It lists the tables and the columns that are used, and this is really important.
And an interesting component here is that it allows you to search for specific keywords in queries and filters, which allows you to do compatibility checking.
So, for example, you might have a function that is used in the legacy database that is different in the new database. And I believe Todd actually has a great example here. Todd, would you care to elaborate on that a little bit?
Yeah, I just wanted to add a little bit of color commentary.
We just recently finished up a project for a large bank that was deciding to maintain their Cognos investment as their primary reporting tool, but they wanted to move from their on-premises Teradata database to a cloud-based Google BigQuery database. So there are a lot of challenges involved in doing this, but basically, we needed to find answers to a couple of questions.
You know, one: which reports were using Teradata as a data source.
Again, I think it was over 15 to 17,000 reports, in both the team content and the my content, the My Folders area.
So, we had to scan through all of those, find, you know, what was the data source being used.
Once we identified those, what were the tables that were being used? Just because a model is pointing to a data source, you could have hundreds or thousands of tables behind that source, but the report may only be using 5 or 10.
So we were able to actually figure out what query subjects were being used, and pull back that information as: here are the tables being used by a specific report.
And then we wanted to finally bring in the audit data, which reports have only been run in the last six months. And that became sort of the target of these reports that we need to move over.
And we can also sync that up with a database migration schedule.
Somebody could say, these 20 or 50 tables are going to get moved over this week; which reports are now able to be moved over?
And we could quickly identify, based on the information that we pulled out, the reports that could be moved over and start running off of Google BigQuery once that was done.
Another nice thing we were able to do is identify specific functions, like string manipulations and date functions, that were not compatible with Google BigQuery. So, using that keyword search in both the queries and filters, we could identify the individual reports, the query that had one of these functions, and actually the query item and expression that needed to be adjusted as well.
So we could quickly find them, change them to the new function, and make sure that, when we converted them over, there weren’t going to be any errors running them.
So it was very easy to get that information with the tool, and it was a huge, huge timesaver.
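A sketch of that kind of compatibility scan might look like the following. The flagged function names here are illustrative examples of Teradata-style functions that typically need rewriting for BigQuery; a real migration would build this list from the target platform’s own documentation, and the sample expressions are invented.

```python
import re

# Illustrative (not exhaustive) list of functions to flag for rewriting.
FLAGGED_FUNCTIONS = ["ZEROIFNULL", "NULLIFZERO", "ADD_MONTHS", "INDEX"]

def find_incompatibilities(expressions):
    """Return (report, function) pairs where a flagged function appears
    in a query item or filter expression."""
    hits = []
    for report, expr in expressions:
        for fn in FLAGGED_FUNCTIONS:
            # Match the function name only when it is used as a call.
            if re.search(rf"\b{fn}\s*\(", expr, re.IGNORECASE):
                hits.append((report, fn))
    return hits

# Sample expressions, as the Migration Assistant might surface them.
sample = [
    ("Daily Sales",   "ZEROIFNULL([Sales].[Returns]) / [Sales].[Qty]"),
    ("Customer List", "TRIM([Cust].[Name])"),
    ("Aging Report",  "ADD_MONTHS([Order].[Date], 3)"),
]
print(find_incompatibilities(sample))
```

Running a scan like this across every extracted expression is what turns the compatibility check from a report-by-report manual hunt into a single pass.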
I mean, when you think about it, to that last point, it helps identify and really plan that migration process, prioritize and streamline, and run things on parallel tracks.
So you had the reports coming online in line with the new data source, really enabling this customer to accurately scope, budget, resource, and manage that data source migration, which ultimately reduces risk, cost, and time, and better ensures a successful data source migration.
So, the second scenario is, you know, the big daddy: migrating analytics platforms.
So, any of you who’ve been through this, this is huge. It could be migrating off one platform onto another in its entirety, or just migrating a portion of it, right? Maybe you’ve gone through a merger and acquisition, or there’s a use case or a department that you need to move.
It might even be migrating between major versions of a platform, right, like when Cognos went from Cognos 8 to Cognos 10, or 10 to 11.
Calling that an upgrade was a bit of a stretch, and you could really look at it as a migration. At a minimum, even if it was an upgrade, you would go through many of those same steps: OK, I’m going to bring on all this new functionality, so maybe there’s a good opportunity here to rationalize this, clean it up, and do a lot of the things that we’ve talked about.
So here, the Migration Assistant can really help with several different items, to help tackle all that greater complexity, cost, time, and risk, and better ensure the success of your migration. You’re able to identify the data sources, so you don’t have to manually detangle report and model details. And you’re able to look at an inventory of the report design elements.
So this allows you to rapidly reproduce different layouts, and you’ll see that in the demonstration. It shows you how to rebuild your data models.
It shows you how to rebuild reports, identifies the most used reports, and shows rarely used and unused reports.
So again, it kind of gives you the recipe for rebuilding models and rebuilding reports. Then it allows you to identify the things that you should move over as a high priority, and things that are either lower priority or maybe should be eliminated.
Todd, I understand that this is your favorite and ideal scenario. Do you care to add any color to this?
Yeah, I actually did a webinar, I think, earlier, this year, beginning of the year, that’s on the knowledge base, if you want to go check it out.
But we did a couple of reports that we converted from Cognos to Power BI live in the demo. This was something that I wish I had 20 years ago; I’ve been a long-time Cognos developer.
I know the pain and the time involved in trying to reverse engineer these Cognos reports. You had to open each report one by one, navigate multiple pages, click on an object, find the properties of the query it uses, go into the query, look at each individual query item, the expressions, the filters, and try to document all this, making these massive Excel spreadsheets. I’ve had to go through hundreds of these kinds of planned migrations and just document this stuff. So it’s a huge time saver to get all this in one screen, one click. And that doesn’t even take into account FM models which, as Mike mentioned earlier, typically have multiple layers, hundreds of tables, lots of complexity in there.
So, if you’re trying to recreate these reports in another tool, you’re going to need someone who knows how to navigate Cognos, knows how to navigate Framework Manager, and then also probably has to be able to build in the new tool. You’re looking at those unicorns Mike referred to earlier, people who have that skill set, and it’s not that many people who do. So with this tool, you can just generate these blueprints, instruction guides: here are the tables, here are the joins, here are the expressions, here’s what you have to build. It makes it so simple to extract that information, present it, and start recreating things very quickly.
In a new platform, whether it’s Power BI, Tableau, or what have you.
Yeah. BI migrations are maybe second only to ERP migrations, right? They’re big and complex and fraught with peril, so this tool really can help streamline that and reduce your risk, cost, and time spent on it.
So, you can accelerate the migration timeline while increasing the accuracy and your overall success rate.
Then, the last scenario that we’re going to go over here is, optimizing your Cognos Footprint.
So, here we see many dedicated Cognos customers where, over time, again, the models get more complex. Reports, and copies or permutations of them, tend to proliferate; projects come and go and die off; mergers and acquisitions take place.
It ends up creating this crazy, cluttered Cognos environment, which is hard to maintain, scares away the users or makes it a little harder, or a lot harder, for them to do their job, and ups your costs.
Because you’ve got to support all that and the hardware required to keep all that stuff online, and it’s just suboptimal.
So the Migration Assistant allows you to identify some of these rarely used and unused reports, identify underused user licenses, and do things like shrink metadata packages safely by removing some of these unreferenced areas.
Now, there are complementary tools here that many of you probably own. And one of the key differences is that we provide some of the insights, but acting on those insights requires either manual intervention, i.e., using your resources or consultants like Senturus, or the use of these complementary tools, which can often implement some of those changes.
So what the Migration Assistant does here is help provide some deeper insights into that Cognos audit data.
And, Todd, I understand that, that you found this helpful, as well.
Yes. So, this is another one of those things that I wish I had a long time ago, having done a lot of the migrations and upgrades.
You know, we wanted to bring over just the most-used content, and sometimes people don’t even have the standard audit turned on, which makes it extremely difficult; we have no idea who’s running what.
Then we had this scenario where we had people who had been auditing the reports, so they could see what’s being run. But, again, the audit database is populated on demand, so if you haven’t run something, no entries get put into that database.
So there’s usually a big gap, or a black box, of reports where we don’t know what’s out there, we don’t know if it’s ever been run, and there’s no real way to get that.
They introduced a tool called the Audit Extension, somewhere back in the 10.2 timeframe, I think, or 11. It’s still out there, but it’s not fully supported and it doesn’t really work great, and you can kind of use that to get the full view of the content store. But this is going to be great, because it allows you to get that full list of every report, sync it up with the audit data, and find out, with outer joins, which ones don’t have any usage at all.
You also can get a look into the My folders area. That’s typically a big area of just junk and things that can get cleaned up.
You know, you’ll see version v1, v2, up to v100 of something that someone built that hasn’t been run in years, or was never run.
So again, there’s just the value and the speed of being able to go through massive amounts of reports, see what’s being run, see what’s never been run, and really target your clean-up efforts. That’s going to make this really valuable to anybody who’s looking to clean up their Cognos environment.
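That outer-join idea can be sketched as follows. The table and column names are hypothetical simplifications of an inventory-plus-audit setup, not the tool’s actual schema:

```python
import sqlite3

# Hypothetical, simplified schema: a full report inventory (extracted from
# the content store) left-joined to Cognos audit data, which only has rows
# for reports that were actually run.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE inventory  (report_id INTEGER PRIMARY KEY, name TEXT, path TEXT);
CREATE TABLE audit_runs (report_id INTEGER, run_time TEXT, user_name TEXT);
""")
conn.executemany("INSERT INTO inventory VALUES (?, ?, ?)", [
    (1, "Sales Summary",  "Team Content/Sales"),
    (2, "Q3 Draft v27",   "My Folders/jdoe"),
    (3, "Returns Detail", "Team Content/Ops"),
])
conn.executemany("INSERT INTO audit_runs VALUES (?, ?, ?)", [
    (1, "2023-05-01", "alice"),
    (1, "2023-05-02", "bob"),
    (3, "2023-04-15", "carol"),
])

# The outer join surfaces reports with no audit rows at all: the
# "never run" set that the audit database alone cannot show you.
never_run = [name for (name,) in conn.execute("""
    SELECT i.name
    FROM inventory i
    LEFT JOIN audit_runs a ON a.report_id = i.report_id
    WHERE a.report_id IS NULL
    ORDER BY i.name
""")]
print(never_run)
```

Because the audit tables only record executions, the full inventory has to come from somewhere else; the left join is what bridges the two.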
Yeah. And you might be looking at multiples of these scenarios, right? Or this could be a subcomponent of migrating the platform.
So, again, what this enables you to do in your Cognos environment is reduce that cost of ownership by facilitating an environment that is clean, current, and compact.
We may also be able to shrink the hardware required and improve performance while reducing cost. And again, it might help you with your adoption, in terms of not scaring your users away and not having metadata that throws errors, reports that don’t run, or content or metadata that’s difficult to find.
So, with that, I’m going to jump over and do a demonstration of the Migration Assistant, or, more specifically, the dashboards that we create from the content of the Migration Assistant.
So let’s jump over to the screen here, and I’m going to maximize the view here.
So again, when you come in and run the Migration Assistant, you choose the packages that you want to run it against. And in this case, we’ve used a lot of the sample data, right?
And we’ve run this against four different packages.
And at any point, you have the ability to focus or filter on one or more of those packages or, for example, paths, right? So you’ve got team content, or maybe My Folders, and things like that. You can drill in on that and focus on it.
The results of that, again, end up in a SQL Server database that you can hit with any tool you want, write SQL queries against, or build your own content on. But we do provide this set of Power BI dashboards out of the box that you can use.
And it consists of four different dashboards, one of them being the overview that you’re looking at here, and then we’ll go into the Reports, Executions, and Relationships components later.
So the top of the overview page here shows you all of the overall statistics for your environment, for the selected packages. I can see that I’ve got eight packages here, 172 namespaces, 223 query subjects, etc., through your query items, data sources, reports and whatnot, and then the pie charts show me used versus unused. So this is pretty interesting: you can see that, wow, in terms of namespaces, query subjects, and query items, I’ve got a lot of stuff that’s unused here. It tells me my environment overall is probably a really good candidate for some of that clean-up.
For example, if I’m migrating a report to a new system, or migrating a data source behind the scenes, you can navigate this based upon whatever your specific needs are.
And all of these dashboards, you’ll notice share the commonality of interactivity.
So I’ve got various visualizations that all interact with each other.
And so, when I click on, for example, the parameter report, all the other tables and the pie charts filter down to just focus on this particular report.
So, if this was a particular report that I wanted to migrate, I can go and look at, for example, the metadata that it refers to: the sales query namespace, the query subjects, and the tables.
And I can see that it was sourced from the Go Data Warehouse and the tables that it uses within the Data Warehouse.
Now, conversely, if I was doing something like what we’re doing, migrating a data source, I can pick a specific data source, like my data warehouse. And then it’s going to filter all of the KPIs and these tabular views, all these other visualizations to show me.
Here’s the metadata that flows from that data source, and all of the reports, along with the usage statistics. So in this example, you can see I have a lot of reports here that really have never been run.
So this, again, is meant to give you an overview of your environment, and allow you to explore the report and execution information, and kind of get an overview of your environment.
Now, if you’re drilling into the reports, you would go over to the Reports page.
Now again, this is sort of the thing that gives you that decoder ring that lets you rebuild reports easily.
So if I look at, for example, Todd’s report here, I can see that that report consists of three queries.
It has three pages, and on those pages it has all these different data containers. So these are your lists and crosstabs and pie charts and maps. And I can see what the container type was and the report query that is referenced there.
I can also see all of the queries that are in that report, what the query items are, and what those expressions are. So this includes any calculations; filters are shown here, and parameters.
So if I go look at Customer Satisfaction Returns instead, that one has one page, two queries, and uses a bunch of combination charts.
And you can see it has some very specific query item expressions in it, and it has some prompts in here that I may need to replicate in the new environment.
So, instead of having to go into that report, look through it, and open up each object in Report Studio: all of you who know Report Studio and have worked with Cognos know how much effort it would take to go in here and identify all those pieces, much less extract them. You would probably sit there with two monitors and a big Excel spreadsheet, and you’d have one open while you’re working in the other environment.
And, you would need someone that knows both of those environments.
A regular person, a non-unicorn person, someone who knows the destination platform, could go in here and reasonably understand that I need to filter this by product survey 7, 6, 6, 7, 7, 6, 6, 1, and then I need to filter on the region.
So this is particularly useful, for example, when you have very cryptic names in your database. So imagine you had JD Edwards, where your tables are named F0005 and F0901.
And those have been covered up in the metadata layer and given friendly names.
It’s helpful for really kind of identifying your friendly column names and what they really are down underneath in that data source.
We also have search capability here. This is just generic Power BI functionality, but it allows you to search through the metadata to look for, again, what Todd talked about: some of those specific functions where you might be doing those compatibility checks.
Alright, looking at executions.
Again, you have all the packages and paths, and you're able to select whichever ones you want to focus on.
We run for a specific date range, and you can filter this based upon your desired visualization time window.
And you can see all of the reports: when they were last run, how many users ran them and how many times they were run.
And as we mentioned before, you can see all the reports that were never run. So, again, good candidates for rationalization or elimination.
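To make that rationalization idea concrete, here is a minimal, purely illustrative Python sketch of how never-run reports can be identified from an execution log. The report names and data structures are hypothetical, not the Migration Assistant's actual schema:

```python
from datetime import date

# Hypothetical report inventory and execution log (names and structure
# are illustrative only, not the Migration Assistant's real schema).
reports = ["Sales Summary", "Customer Satisfaction Returns", "Legacy Inventory"]
executions = {
    "Sales Summary": [("bob", date(2021, 9, 1)), ("sally", date(2021, 9, 3))],
    "Customer Satisfaction Returns": [("bob", date(2021, 8, 15))],
}

def usage_stats(report):
    """Return (run_count, distinct_users, last_run) for one report."""
    runs = executions.get(report, [])
    users = {user for user, _ in runs}
    last_run = max((d for _, d in runs), default=None)
    return len(runs), len(users), last_run

# Reports with no recorded executions: candidates for rationalization.
never_run = [r for r in reports if r not in executions]

print(never_run)                     # ['Legacy Inventory']
print(usage_stats("Sales Summary"))  # (2, 2, datetime.date(2021, 9, 3))
```

The same grouping, applied to the real audit data, is what drives the run counts, user counts and last-run dates shown on the executions dashboard.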
And we can see as I sort of drill into some of these.
I see the users and when they ran them and how many times they ran them.
So I can see that this report may be important to a few specific users. Oh, you know what? I have to go back, I’m sorry I do this every single time.
When I'm looking at a particular report, for example, we also include the lineage, and this is really important. This is what I was referring to when I was talking about JD Edwards in particular.
So I can pick the Product Line column, and I can see how the Product Line column traverses those various layers in the FM model.
So that’s the business friendly layer.
You can see the Business View, and I can see this goes down to the Database View and ultimately comes from this column, PRODUCT_LINE_EN. So that lineage information is a really nice augmentation.
It allows me to see, again, where that friendly name came from and what its potentially cryptic underlying name really is. Sorry about that, I missed it on the first pass.
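Conceptually, that lineage traversal is just a chain of derivations walked layer by layer. Here is a small illustrative Python sketch; the layer names and the GOSALES/PRODUCT_LINE_EN column are loosely based on the Cognos samples and are assumptions, not the tool's real metadata model:

```python
# Hypothetical lineage map: each (layer, item) points at the item it is
# derived from one layer down; None marks the physical source column.
lineage = {
    ("Business View", "Product Line"): ("Database View", "Product Line"),
    ("Database View", "Product Line"): ("GOSALES", "PRODUCT_LINE_EN"),
    ("GOSALES", "PRODUCT_LINE_EN"): None,
}

def trace(layer, item):
    """Walk from a friendly model item down to its physical column."""
    path = [(layer, item)]
    while lineage.get((layer, item)) is not None:
        layer, item = lineage[(layer, item)]
        path.append((layer, item))
    return path

path = trace("Business View", "Product Line")
print(path[-1])  # ('GOSALES', 'PRODUCT_LINE_EN')
```

The last element of the path is the cryptic physical column that the friendly name ultimately resolves to, which is exactly what the lineage view surfaces.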
Then the last dashboard we have here is the Relationships page, and it shows how each data model was constructed.
And, again, I can filter down on a specific package to look at just its components, and I'm able to look at the entire view, the Model View, or the Database View, and filter in on any one of those particular items.
And this is helpful, again, when I'm trying to replicate a model in a new platform,
because it lets me see both the model view and the database view.
It’s particularly helpful when you have more complex query subjects, like, for example, country.
When I pick that, I can see: here's the query subject, and here's the cardinality as it relates to the query subjects it's joined to as right-side relationships, Branch, Conversion Rate, Currency Lookup. And I can see the relationship expression.
I can also see the tables that are connected to it and how those relate to each other, including the relationship criteria.
This is particularly useful when you potentially have some complex SQL in your model. This will surface that.
So if you have recursive SQL, macros, parameters, things like that, this will surface all that information.
So, again, something like product line, or, sorry, the product dimension: I can see that it has a lot of complexity to it and is related to a lot of different tables.
And this really gives me the recipe for how to rebuild that metadata in a new environment.
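As a rough illustration of that "recipe" idea, relationship metadata with cardinalities and a join expression is enough to regenerate equivalent SQL for a new platform. This is a hedged sketch with hypothetical names; real Framework Manager relationship expressions can be much richer than this:

```python
# A hypothetical relationship record, roughly the pieces the dashboard
# surfaces: two query subjects, their cardinalities, and the join
# expression (names here are illustrative, not real FM output).
relationship = {
    "left": "COUNTRY",
    "right": "CURRENCY_LOOKUP",
    "left_cardinality": "1..1",
    "right_cardinality": "0..n",
    "expression": "COUNTRY.CURRENCY_CODE = CURRENCY_LOOKUP.CURRENCY_CODE",
}

def to_join_sql(rel):
    """Render a relationship as a SQL join clause: an optional ('0..')
    right side becomes a LEFT OUTER JOIN, otherwise an INNER JOIN."""
    optional = rel["right_cardinality"].startswith("0")
    join = "LEFT OUTER JOIN" if optional else "INNER JOIN"
    return f"{rel['left']} {join} {rel['right']} ON {rel['expression']}"

print(to_join_sql(relationship))
# COUNTRY LEFT OUTER JOIN CURRENCY_LOOKUP ON COUNTRY.CURRENCY_CODE = CURRENCY_LOOKUP.CURRENCY_CODE
```

Mapping "0..n" cardinality to an outer join is one common convention; the point is simply that once the relationships are surfaced as data, rebuilding them elsewhere becomes mechanical rather than archaeological.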
OK, so hopefully that gave you a good flavor for what the Migration Assistant and the attendant dashboards do for you, coupled with all the great information Todd gave you here.
It shows you how we've created this based on our clients' needs and requirements, and how we've used it in client environments to really optimize migrations.
So, a few technical requirements here. The system does require Cognos 10 and above. We do support OEM versions of Cognos.
You do need a version of SQL Server and SSMS, 2016 or higher. That's the content database that we pull the content store information into.
And if you want to use our pre-built dashboard, then you need a copy of Power BI Desktop, and you can use the free one if you don't have Power BI Pro or Premium.
And if you're sort of seeing this and going, wow, does this make sense for me?
It's bundled with Senturus installation and configuration services, and we can tailor it to your use case and environment. We see that it delivers ROI for organizations that tend to have more than 20 models or more than 250 reports.
So, if you have less than that, then the value may not be as great for you. But, again, it depends, right?
Maybe you’ve got very complex models and very complex reports, in which case, it could be beneficial to you.
Regardless, you can learn more on our website at Senturus.com or just contact us at [email protected] and we’ll talk to you about it free of charge, and figure out if it makes sense.
So, stick around, we have a Q&A at the end of these couple of slides here.
We do have hundreds of free resources on our website; we have been committed to sharing expertise for over a decade.
You can also see recordings of other webinars like this, as well as the presentations, our blogs, and all of our great upcoming events. Speaking of upcoming events, we have two. One, Using Python with Power BI, with Patrick Powers,
on Thursday, October 14th at 11 AM Pacific. Our webinars are almost always at 11 AM Pacific, 2 PM Eastern; you can register at the link there. And we're doing What's New in Cognos 11.2.1 with the Cognos Analytics Offering Management Leader on October 28th.
At Senturus, we concentrate our expertise on business intelligence solely with a depth of knowledge across the entire BI stack.
Our clients know us for providing clarity from the chaos of complex business requirements, disparate data sources, constantly moving targets, and ever-changing regulatory environments.
We’ve made a name for ourselves because of our strength in bridging the gap between IT and business users.
We deliver solutions that give you access to reliable, analysis-ready data across your organization, enabling you to quickly and easily get answers at the point of impact, in the form of the decisions you make and the actions you take.
We offer a full spectrum of BI services. Our consultants are leading experts in the field of analytics, with years of pragmatic, real-world experience advancing the state of the art.
We’re so confident in our team and the Senturus methodologies that we back our projects with a 100% money back guarantee that is unique in the industry.
In addition to services, we provide a complete spectrum of BI training in the top three BI platforms. We support Power BI, Tableau and IBM Cognos.
We are ideal for organizations that are running multiple platforms, or those moving from one to another.
So we have a lot of people that know two, if not all, of these platforms very well, and are multilingual, if you will.
We provide training in many modes. You can see we have group sessions, small-group mentoring, instructor-led online courses and self-paced e-learning.
And we've been doing this for a very long time, two decades now. And we've worked across the spectrum from Fortune 500 to mid-market, probably with many, if not all, of those logos.
And we have solved business problems across many functional areas, including finance, sales and marketing, manufacturing, operations, HR, and IT.
Our team is large enough to meet all of your business analytics needs, yet small enough to provide very personalized attention.
If you're one of those unicorns out there and think you'd like to join Senturus, we are hiring. You can see the positions and the job descriptions at the link provided there.
You can email us, send your resume to [email protected]
And with that, we have plenty of time for some questions here.
I’m just pulling up the question log here.
Is there a question you guys want to tackle first?
How long has Senturus been around? 20-plus years. And: how do you go from Cognos to a Redshift model?
So I would say, then, you'd be looking at your Cognos model and trying to get at the database information there.
And, really, what's perhaps more important there:
I assume by Redshift you mean Amazon Redshift. So that's more of a database migration, and I'd ask you the question: what's the database that you're using with Cognos right now? Cognos isn't a database itself; it's pointing to something.
So we would want to understand what that database is. And, it's an Oracle data warehouse, there you go.
So, if you're trying to move from an Oracle data warehouse to Redshift, then we would potentially use the Migration Assistant to look at your source database and how you modeled it, and help you migrate that to that Amazon Redshift environment.
So that's akin to the customer that Todd was talking about earlier, where they were going from a legacy data source to a different data source, Google BigQuery.
Yeah, exactly. That's scenario one that we talked about. I'm assuming your table and column names are going to be maintained the same. Then you could use that scenario to say: which are the reports that are using Oracle, and what are the tables? And these migrations take time. If you want to do it all at once, you could just do it that way. Or, if you want to chunk it up and say, let's move over 100 tables, let's find the tables and the reports and the sources that are using those, and move them over in, you know, buckets, you could do it that way. But there are some primary questions and things to make sure you have in place before you do that.
Yeah, thanks, Todd.
So we had some questions answered here in the chat, and I'm going to try to answer some of those out loud. Is the Migration Assistant a Cognos product or a Senturus product? It is a Senturus product.
And let's see, somebody asked if we support data modules. We do not support data modules currently; that's up for an enhancement, so look for that in the not-too-distant future.
If that’s a critical requirement for you, again, you know, reach out to us, and our development priorities are often driven by what our customers’ specific needs are.
Let's see: what about users' My Folders? So, the important thing is that the Migration Assistant does surface the My Folders information.
And that's something that's really hard to get at if you don't have a tool to do it, because you'd have to log in as those individual users.
So, we do surface that information.
So you're able to interrogate it and figure out what reports are there, and that's where you find: oh, hey, Bob has the same report that Sally has. Or, you know, Bob has a different parameter in his, or Bob's is a crosstab while Sally's is a list.
And you're able to rationalize those and really optimize that.
Make sure you're migrating stuff, and you don't just go and delete it because, oh my god, maybe that report wasn't used in the team content, but it's heavily used in My Folders in a subtle permutation.
So there's a lot of great stuff that comes from being able to surface that My Folders information.
A question about having a hosted Cognos environment: how can we use the app?
Let's see, I was going to try to answer that, but I may be getting out over my skis too much. Do you guys want to take that one?
It would depend on whether we can get access to your content store or audit database.
So if those are accessible, or they could be, or you could get a backup of those and put it somewhere where we could run the scripts against it, that will work. But if it's going to be completely walled off, like if it's IBM hosted or one of those ones where you never actually have physical access to the server, then it would probably not be something we could do.
Great, thanks, Todd. And, there’s a question here. Can you differentiate old query studio reports from a new version of Cognos report using the Migration Assistant?
We can definitely parse them; as to whether it identifies them as Query Studio or Report Studio, I'd have to check and see.
I think there's probably a way that you can do that, but essentially, Query Studio reports are the same thing on the back end. You know, in Cognos 10 you kind of had Report Studio, Query Studio, Workspace Advanced, and all these different studios that were essentially different front ends on sort of the same thing.
So yeah, I mean we could definitely parse them.
I would just need to see if there's some kind of indicator that we can pull out of there, or if maybe we already are doing that, that just tells you whether this was a Query Studio report or a Report Studio report.
Alright, thanks. Someone asked the question: is there a class for the Migration Assistant?
We don't have a class. As I mentioned earlier, it's bundled with services because it's designed to be used by our consultants.
It's not shrink-wrapped software that we sell as a product.
That said, if we're working with you, certainly you can look over our shoulder and we'll show you what we're doing.
And can you take Framework Manager query and logical models and move them to Redshift?
So, yes. As I mentioned earlier, when you're migrating a data source, the trick isn't so much the abstraction layers; it's that database layer, and those can potentially be, or will likely be, very different between a legacy Oracle data warehouse and a modern cloud-based Redshift data warehouse.
Do we have training in the Bay Area? So, again, we don't have a class for the Migration Assistant, but yes, we do offer training.
We do predominantly private training, so we come to you, or we do offer some online courses and, again, mentoring; we can do it with you at your organization or online. Again, that's a good one to reach out to us about so we can figure out what your specific needs are.
OK, well, no one ever complains about getting a few minutes of their day back, so we'll give you back about 11 minutes. Thank you to all of you for taking time out of your undoubtedly busy days to join us here today.
Thank you to my colleagues, Todd and Song, for helping me out on this webinar. And, again, thank you for joining us. We look forward to hearing from you about any of your migration or other BI needs.
And we look forward to seeing you on our next Knowledge Series event.