Having Cognos operate independently from Power BI or Tableau defeats the goal of modern BI: fast, unified decision making based on secure, accurate data. The lack of integration is costing companies time-to-action and competitiveness.
In this on-demand webinar, you'll learn about the Senturus Analytics Connector, an easy drag-and-drop solution for integrating these powerful and distinct tools. We show you how the Analytics Connector allows Power BI and Tableau to easily tap the secure, curated data in Cognos. It's a powerful modern BI pairing both IT and business users can appreciate.
- Simple drag-and-drop functionality that eliminates time-consuming data remodeling efforts.
- How the Analytics Connector results in improved security, data accuracy and overall Power BI/Tableau performance.
- Features that further reduce query waiting times and boost performance.
- BONUS: Our new tool that instantly migrates Cognos reports to Power BI!
Lead Software Developer
Arturo brings close to a decade of software development experience—plying his craft in the restaurant and office management sectors along with healthcare and blockchain space. After working with a wide array of software products, he’s now the Lead Developer of the Senturus Analytics Connector.
VP Software & Architecture
Bob Looney leads software development and BI architecture efforts at Senturus, focusing on software and cloud architecture with Snowflake, Power BI and Tableau practices. Before coming to Senturus, Bob was designing, building and scaling business intelligence software, reports and dashboards for use by thousands of restaurants.
Hello everyone, and welcome to today’s Senturus webinar on dragging and dropping Cognos data into Power BI or Tableau.
Thanks for joining us today.
It’s always great to have you here.
For our webinar, a quick overview of today's agenda.
First, we'll do introductions here.
We actually have two presenters today in addition to me doing the quick intro and outro.
So we’ll do some quick introductions.
We’re going to cover a quick overview of the Senturus Analytics Connector.
I will do a demo of the connector itself.
Then we have a special sneak peek of a new product, our Report InstaMigrator.
We'll cover some frequently asked questions, a general overview of Senturus and some additional resources that are available on our website.
And as I said earlier, we’ll do some Q&A at the end of the session.
So some quick introductions.
Our first presenter today will be Arturo Ayala. Arturo is our lead software developer here at Senturus.
Arturo brings nearly a decade of software development experience, plying his craft in the restaurant and office management sectors along with the healthcare and blockchain space prior to joining us here.
After working with a wide array of software products, he’s now the lead developer for our Senturus Analytics connector.
And we also have here today Bob Looney, our VP of Software Engineering.
Bob leads software development and BI architecture efforts here at Senturus, focusing on software and cloud architecture with Snowflake, Power BI and Tableau.
Before coming to Senturus, Bob was designing, building and scaling BI software, reports and dashboards in the restaurant industry.
As for me, I’m Steve Reed Pittman, Director of Enterprise Architecture and Engineering.
And as usual, I'll be doing intros and outros here, but Arturo and Bob are our guys for the meat of today's presentation.
Before we get started, we’re going to do just a couple of quick polls.
Let me get my poll window up here.
The first poll is just to get a sense of which BI platform or platforms your organizations are currently using.
Let me go ahead and get that started, right.
It looks like most of you have answered the poll, so I’m going to go ahead and close that out and let me share the results here, so everybody can see that.
So again, almost all of you using Cognos, not surprising, we ended up at about 57% it looks like on Power BI, 43% Tableau.
So not surprisingly a lot of hybrid environments out there looks like.
So let’s go ahead and stop sharing those results and we’ll go on to the second poll.
And our second poll is just what are your future plans with Cognos.
So, Arturo, if you'd go ahead and switch to the next slide, and let me go ahead and kick this off.
So one option is keep Cognos for the long term.
There certainly are plenty of folks doing that.
Some folks keeping Cognos in the short term, but not sure about the long term.
Some folks intend to completely move away from Cognos in the future and some folks are somewhere already along that path in terms of migrating off of Cognos into a newer platform.
So looks like we’ve got about 40% of you planning to just stay on Cognos for the long term, About 1/4 of you, not sure about the long term, but certainly staying on Cognos for now.
And combined, about 30% of you, 35% of you either are already migrating or planning to migrate.
So let’s go ahead and stop sharing that.
And with that, I am going to go ahead and turn it over to Arturo to talk to us about the analytics connector.
So Arturo, take it away.
Hope I sound OK.
Hey everyone, my name is Arturo.
I’m happy to be here to discuss the Analytics Connector and its role in your business intelligence efforts.
I want to start by spending some time looking at the inner workings of Cognos.
This is a somewhat oversimplified diagram, but it's going to help us identify the pain point of getting data out of Cognos.
On the left, we have our data sources.
This is usually a SQL Server or Oracle server, but it all depends on what technologies you've adopted.
Let’s call this our hard data.
And our hard data exists apart from Cognos, but it’s completely necessary in a Cognos setup.
You know, without data we can’t build our models and we can’t build our reports without our models.
So to the right of our hard data, we have our FM models, power cubes or data modules, and these are entities based on our hard data.
They essentially extend our hard data, right?
And these eventually become what Cognos calls a package.
And a package is just an extra layer of logic on top of our hard data.
We can refer to it as a metadata layer.
And that metadata layer can be as simple or complex as we want it to be.
And disclaimer, it’s usually pretty complex.
To the right of our models, we take that metadata layer and we create reports in Cognos analytics.
You know, we use our packages to build our reports and dashboards.
Now, our package or metadata layer only lives in Cognos, and on the surface it’s not an issue.
Now, the moment we want to use Power BI or Tableau, this becomes a problem.
Now, there’s different reasons why an organization will want to use these tools.
Some may need specific functionality they can’t get from Cognos, some may want to move away from Cognos altogether, and others may just want to leverage their existing licensing or expertise with a specific tool.
Whatever the reason may be, the fact that our metadata layer is stuck in Cognos really holds us back now.
I think everyone can agree they’ve invested a lot into their Cognos models.
This is a screenshot of Framework Manager where our models and metadata are created.
You or someone in your organization has spent a lot of time here.
I guarantee it.
The job takes a deep knowledge of the data and lots of validation to make sure the models are up to par.
After all, reports are based on this data and business decisions are based on reports.
I mentioned how the metadata layer can be as simple or complex as we need it to be.
Here are some more screenshots of Framework Manager.
As you can see, there’s a lot going on.
Relationships are being made, new calculations are being added, points of data are being transformed into something meaningful, and this is the data you want to work with when building your reports and dashboards.
And it’s the same data that proves elusive in Power BI and Tableau.
And the reason is because Cognos has a firm hold of it.
That metadata layer that transforms our hard data belongs to Cognos.
If we want it in any other tool, we have to recreate it.
And that comes with its own challenges.
First, it’s a big lift.
The more complex your data model, the bigger the effort to recreate it.
If you do end up recreating it, you now have at least two sources of truth that need maintaining.
And that’s not desirable.
You really just want a single source of truth.
All this takes time and money and effort and you may not be in a position to offer any of those.
And Cognos has built-in security that may make you compliant with some security standard, and all that is out the window when you're looking outside of Cognos.
Thankfully, we here at Senturus have done something about this problem, and we have two different products that can help in these kinds of situations.
First, the Analytics connector.
It's a great solution for those that are exploring new tools and may want Cognos to coexist with Tableau or Power BI.
It acts as a bridge from your existing models in Cognos to Power BI or Tableau, so you can leverage the data you trust and the tool you want.
Setup and configuration is quick and you can start building reports in no time.
On top of that, any security you’ve built in is still enforced and all your audit trails continue to be kept.
Now let's shift gears over here to the Report InstaMigrator.
It’s a brand new tool that also helps you get data out of Cognos but by different means.
This tool converts your Cognos models and reports into Power BI data sets and reports, and it's meant more for the user that sees a future without Cognos.
Now I know some of you may be here specifically for the Report InstaMigrator, but bear with me as I go over the Analytics Connector first.
Bob will then expand more on this new tool in the second-half of the presentation.
So let’s get into the analytics connector.
By now we have a clearer understanding of our problem.
We need our Cognos data in a different tool and there’s different approaches to solving this problem.
Here we have two popular approaches.
Let’s start off with the source data and then going down and to the right to our end user.
This approach is just simply accessing the databases driving our Cognos models via your favorite BI tool.
So that hard data that we talked about, right.
The problem with this is that any logic you built in Cognos is disregarded.
That metadata layer just isn't taken into account here.
Also, security is bypassed.
Any Cognos security, unless it’s built at the database layer, won’t be taken into account here either.
Basically, you would have to recreate some of that metadata or that security to make this a viable option.
Now another approach, which starts here with the Cognos logo and moves to the right with that CSV dump, is to use Cognos as a sort of ETL tool and extract any needed information via a CSV dump, and that usually comes in the form of a report.
An interesting point is that we found that some clients' top five reports are solely for this purpose.
Now in my opinion this is probably the best strategy because you’re not circumventing any of those Cognos models, any of that Cognos security.
But it’s hard to automate.
You can maybe automate the CSV generation, but then you have to upload that CSV file, and that's a manual step, and anytime you want to refresh the data, you have to create the CSV file and then upload it again.
And there’s actually some security violations here.
So take for example, user A creates the CSV file, right?
That's the CSV dump.
And they upload that file to, you know, OneDrive or Dropbox, you know, some central repository that everyone can access.
So user A has access.
Let's say, for example, we're talking about sales, and they have access to sales from all regions.
Now user B comes along and says, hey, I want to create a report based on sales.
User A says, yeah, go ahead, the file's in Dropbox, you know, knock yourself out.
User B only has access to, let’s say, the North and South America region.
If they access that CSV file that User A created, they now have access to everything.
Now, it might be a somewhat trivial example, but you can see my point.
You know, there’s a security breach there and you never want that.
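That CSV scenario can be sketched in a few lines of code. This is a toy illustration, not how Cognos implements security; the rows, user names and region filters below are all made up.

```python
# Toy sketch of the security gap described above. The rows, users and
# region filters are invented; Cognos enforces its real data security
# filters at query time inside the metadata layer.

SALES_ROWS = [
    {"region": "North America", "amount": 100},
    {"region": "South America", "amount": 80},
    {"region": "Europe", "amount": 120},
]

# Per-user row-level security, as the BI platform would apply it.
USER_REGIONS = {
    "user_a": {"North America", "South America", "Europe"},  # all regions
    "user_b": {"North America", "South America"},            # restricted
}

def query_sales(user):
    """Rows this user is allowed to see when security is enforced."""
    return [row for row in SALES_ROWS if row["region"] in USER_REGIONS[user]]

# User A exports a CSV: the file freezes A's unrestricted view of the data.
csv_dump = query_sales("user_a")

# When user B reads the shared file, the per-user filter never runs,
# so B sees rows B is not entitled to.
leaked = [row for row in csv_dump if row["region"] not in USER_REGIONS["user_b"]]

print(len(query_sales("user_b")))  # 2 -> rows B sees with security enforced
print(len(leaked))                 # 1 -> extra rows B gains via the shared CSV
```

The point is that the filter lives in the query path, so any artifact exported outside that path carries the exporter's privileges with it.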
Basically, you know, the problem with these approaches is that we’re circumventing that metadata and security layer one way or another.
Now, you could recreate the metadata in your favorite tool, but why should you?
While it may be your only option in some scenarios, you generally don’t want to go down this route.
For one, it compromises data integrity.
These remodeling efforts are a big lift.
The bigger they are, the more error prone they are because of everything that’s involved.
You also compromise data security.
Any of that security you built into Cognos is no longer counted on.
It’s also time consuming.
You’re essentially reinventing the wheel, doing something that you’ve already done before.
And more dangerously, it can all sometimes lead to data silos.
You know, one user may have one data set and another may have another, and after months of work their data sets may deviate to the point where they reach different conclusions, and we never want that either.
So the question becomes, how do we do it?
Cognos is our source of truth, so let’s keep it that way.
And it’s something that can be achieved with the Analytics connector.
A user can access their data in a way similar to that of Report Studio in Cognos Analytics, but leveraging their favorite BI tool and the expertise they have in that ecosystem.
Now this screenshot contains both Cognos Report Studio on the left and Tableau on the right.
In Report Studio, you see we have our Go sales query package expanded.
Then you can see that this breakdown has different query subjects and query items.
In Tableau, here on the right, you can see how we have the same package selected, and you can see that there's a match between that sales query subject and the sales query table.
Over here on Tableau, we can see how our packages, query subjects, query items, measures, and dimensions become databases, tables, and columns.
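Given that mapping of package to database, query subject to table and query item to column, a SQL-speaking client can address Cognos objects with ordinary quoted identifiers. The sketch below just builds such a statement as a string; the object names are illustrative, and the exact SQL dialect the connector accepts may differ.

```python
# Sketch: building a SQL statement against the connector's mapping of
# package -> database, query subject -> table, query item -> column.
# Object names are illustrative; the connector's actual dialect may differ.

def quote(identifier):
    """Double-quote an identifier, escaping embedded quotes (ANSI style)."""
    return '"' + identifier.replace('"', '""') + '"'

def build_query(query_subject, items, measure=None):
    """Compose a SELECT over a query subject, optionally summing a measure."""
    cols = ", ".join(quote(item) for item in items)
    if measure is None:
        return f"SELECT {cols} FROM {quote(query_subject)}"
    return (f"SELECT {cols}, SUM({quote(measure)}) AS total "
            f"FROM {quote(query_subject)} GROUP BY {cols}")

sql = build_query("Sales (query)", ["Product line"], measure="Revenue")
print(sql)
```

Quoting matters because Cognos names routinely contain spaces and parentheses, which most SQL dialects only accept inside delimited identifiers.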
On the Tableau side, it’s very easy to see your data the way you expect it, no?
So all that effort that went into your Cognos models is persisted across reporting tools.
Basically, all the hard work you did in Cognos Framework Manager isn’t lost when using the connector.
With the connector serving as your bridge to Cognos, you can start adopting the reporting tools you want with the data models you need.
And I'll be performing a demo in a bit, showing you how easy it is to use the connector.
But before I get into the demo, I want to recap the important points of using the connector.
For one, you don’t have to remodel your data.
You get to leverage your existing models, and because of that, you can be confident that the data you’re dealing with is accurate.
You can keep depending on your security measures because they persist in whatever tool you use.
And like I mentioned earlier, installing and configuring the connector is quick.
You can start building reports that you trust right away.
Lastly, you can ensure your user base is accessing one single source of truth.
That’s always a good thing.
Before I get into the demo, I just want to quickly mention that anything you create in Power BI and Tableau using the analytics connector can be shared across your organization using either the Power BI service or Tableau Server.
Using the connector, you can create a data set that originates from Cognos, and using Tableau Server or the Power BI service, you can expose that same data set to the rest of your team.
You know, we talked about not sacrificing integrity and security for Agility in the last slide.
The fact that you can publish these data sets is huge.
It gives your users instant access to your Cognos data and the environment they’re accustomed to.
All right, so now we can get into the demo.
Let me put PowerPoint away. We're going to start off with Power BI.
I’m just going to create a simple report and show you how the connector works with the Power BI service.
I am going to skip some steps just for the sake of brevity, like logging in and selecting the tables that we want to interact with, but these are the tables that we’re going to be working with.
Album, Artist and Artist Top Cities. First, let's get a list of the artists that we're working with.
This is recording artist information.
This is a sample package that we created here at Senturus.
I’m going to create a slicer so we can filter on artists in a bit here.
So that artist’s name right here.
So we have our list of artists and we do have sales information, so let’s bring some of that sales information here.
We'll start off with artist and album name, and let's do track counts and total sales.
Let me expand this a bit so you can see we have a list of all albums and how much sales they did.
The albums are all real and the track counts are real, but the sales information is made up.
So don’t quote this just in case Black Eyed Peas is your favorite artist.
Now let’s add a map here.
Just to get a little bit more visual.
Just add the map up here.
We’ll do that city here and this bubble size will coincide with the amount of listeners in each city.
Now, what I want to get across in this demo isn’t the fact that we can build this report with this nice visual.
What I want to get across is the data that we’re dealing with is the actual data we see in Cognos.
So we can trust it, right.
So I have the same report here in Cognos, I believe it's this one.
And I want to compare some of the fields.
These are aggregated fields, for example, this track count.
So here we have 1,065 tracks, and in Cognos, same thing. For this total sales we have 486 million, and you can see that these numbers align.
What this means is that we can trust the data here in Power BI.
The connector is just pulling the data you see in Cognos and it’s putting it here in Power BI and that’s what we want, right.
Let’s do a little bit more checking.
Let's see something here in this map, say LA has 2 million, almost 3 million listeners.
Let’s see if that matches up over here.
And that is I think the same number.
So what I want to get across here is that we can trust this data.
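The spot-checking in that demo can be formalized into a tiny reconciliation routine. The figures below are the demo's headline numbers; in practice you'd read the totals from each tool's export rather than hard-coding them, and the function itself is just an illustrative sketch.

```python
# Sketch: reconciling aggregate totals between a Cognos report and the
# connector-fed Power BI dataset. The figures mirror the demo's numbers
# and would normally come from each tool's export, not be hard-coded.

def reconcile(cognos_totals, powerbi_totals, tolerance=0.0):
    """Return {measure: (expected, actual)} for totals that disagree."""
    mismatches = {}
    for measure, expected in cognos_totals.items():
        actual = powerbi_totals.get(measure)
        if actual is None or abs(actual - expected) > tolerance:
            mismatches[measure] = (expected, actual)
    return mismatches

cognos = {"track_count": 1065, "total_sales": 486_000_000}
powerbi = {"track_count": 1065, "total_sales": 486_000_000}

print(reconcile(cognos, powerbi))  # {} -> every measure aligns
```

A nonzero tolerance is useful when floating-point rounding differs between the two engines.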
And as you can see, building the report is pretty quick.
Now I’m just going to publish this report here to the Power BI service.
Yes, I want to save my changes and publish it to my workspace.
And as this is churning, Power BI does something interesting.
It publishes your report and it also publishes your data set.
So here in the Power BI service we’ll be able to see both of those things.
So it opens our report first, I believe.
And once this renders, you'll see that it's the same report that we built, but on the web.
And you can share this report, but I want to show you the data set that it generates.
Because with this data set we can actually create a new report.
So you can publish this data set and then someone else can come in and start making a report that fits their needs, right?
So I think this is pretty cool.
Now I’m going to do the same thing but in Tableau, just so you can see how the connector works seamlessly with both of these products.
Let me close this actually and open it again just so it renders a little bit better.
That was a little small.
Now something I want to showcase on the Tableau side is our security.
So actually I forgot to show you guys over here on the Power BI side, I’m going to show you the account that we used to log in.
And just a side note, the credentials you use here in Power BI are the same credentials you use in Cognos Analytics.
So here you can see I logged in with this Cognos embed account and just look at the artist here because the account that I’m going to use to log in over here on the Tableau side is Cognos Embed 2.
Now, I added a data security filter in Framework Manager that reduces the number of artists this account has access to.
So we should only see artists whose name starts with a T. So, yeah, here we have Tame Impala and The Outfield, and I'm going to recreate that same table we made in Power BI.
I'll add the album name, track counts and sales.
Let me make this a dollar figure.
So here we have the same album breakdown, but only for these two artists.
Now let’s make sure that this data aligns with what we see in Cognos.
And I do have another report in Cognos here with only Tame Impala and The Outfield.
So you can see we have 199 tracks and around $112 million of sales.
We can see that these numbers match up.
So again, the connector is pulling in that trusted data that we can rely on.
But you'll have to trust me: once we publish this data source, we can do the same thing that we did in Power BI, and that's to start creating another report with that same data set.
So I’ll owe you guys that presentation.
Let's go back to the PowerPoint presentation here and talk about return on investment.
First and foremost, you save time.
You know you’re not recreating metadata, you’re not spending time converting reports outside of Cognos, and you’re not having to maintain that metadata layer between tools.
You're also reducing data security risk.
Any security measures you've baked into Cognos to get you into compliance with different security standards, or just to make sure the right people have access to the right data, are still in place, and that gives you peace of mind.
You’re also maximizing your Cognos investment.
You know Cognos is your source of truth and you’re still leveraging that truth.
Lastly, you ensure accurate and aligned business metrics.
Remember, no one needed to remodel your data.
You’re still using your tried and true Cognos models and everyone is going to be on the same page because of it.
We’ve reached the end of the analytics connector portion of the presentation and I hope it was informative.
For any of those who are interested in trying out the connector, we do offer a free 30 day trial.
We’ll help you install and configure the connector and answer any questions that may arise during the trial.
So please reach out if that's something you'd like to do.
Also, if you have any specific questions about how the connector works (I know I went through the demo pretty quickly), you can send us an email or submit a question for our Q&A session.
With that said, Bob, if you’re ready to, I’m ready.
All right, I’ll go.
Thank you for that.
And now we’re going to get into a little bit of a sneak peek of a product that our labs group here at Senturus has been working on for a while, the Senturus Report InstaMigrator.
So where this product has come from is that we talked to a lot of folks who say, hey the decision has been made.
You know they’ve already gotten past this decision point of we’re moving from Cognos to Power BI.
Maybe it's executives, you know, top down, this decision already came, or it's a merger situation where you're consolidating on one tech stack.
But regardless of where that kind of decision originated, the decision's been made.
So now we've got to figure out how to make this happen, and what the team that's been tasked to do this migration starts to realize is: this is a huge project.
There are complex data models, as Arturo kind of talked through, thousands of reports, security, license renewals, timing, project management of this whole complex process of moving from one system to another.
And we tend to see a lot of paralysis at that point, of like, we don't even know how to start this project, we don't know how to estimate it, scope it, start putting numbers and budget requests together.
And so that’s a lot of where this came from.
And, you know, people are like, well, isn't there just a tool or a magic button that we can push and get all of our content from Cognos into Power BI, in this case?
And so we have finally kind of cracked some code with some new features available in Power BI to create an accelerator that literally is that magic button moving forward.
Next slide, thank you.
So we are mostly going to just do a product demo today.
We will have a full webinar about this product in the future, but kind of a sneak peek demo.
What we're going to show off today is migrating a Cognos model into a Power BI data set using our automated tool, and publishing that data set to the Power BI service.
Then we're also going to pick a couple reports and migrate those reports over as well, and hook them up to that published data set in the Power BI service.
And then lastly, we've built in an integration with GitHub so that we now have a great project tracking system, and a future DevOps system, for your BI team to leverage as they continue to polish up reports and data models, move things into production, and then make ongoing evolutionary changes and improvements to those reports and data models.
So with that, I’m going to steal the screen share here and we will get going with the demo, all right.
So in Cognos, we have built a couple of sample reports that we’re going to migrate today.
There is this two page record report.
So some of our data model is going to look similar to what Arturo was just showing off.
And you can see this first report has two visuals on the first page and one visual on the second page, nothing too complex, but just a couple different things at play.
And then the other report we’re going to move is literally called a simple record report.
It’s just a straight dump of data from our FM model data source.
So with that, we are going to try and land those two reports and a new Power BI data set in this workspace over in Power BI. Excuse me, too many conflicting terms.
So to do that, we are going to use our new product here, the Senturus Report InstaMigrator.
And to begin, we’re going to connect to a local database where we’ve extracted a bunch of the information from the Cognos content store and done some post processing on that content and we get our list of packages that were in the content store that we imported.
So I’m going to filter this down to just the record package, which is the sample package we’re converting today.
And then similarly, we see our, you know, hundreds if not thousands of reports in our system.
For today's purposes, we're just going to grab these first two here, the simple record report we just saw and the two page record report, and we get this little summary statement here of, you know, we're going to convert one package and two reports, and we get an output location on our local machine.
Here it’s going to be this C conversion utility and we’re going to click the button and it’s actually really, really fast and see it generated this output folder.
Under that we have datasets.
So these will be the new Power BI datasets.
So there’s a record package which was the package from Cognos.
And then our two reports, there’s a folder for the simple record report, the two page record report, and we’ll come back to those in a second.
So let’s open up this record package.
And this is a new Power BI project format, which is the reason that we’re able to do a lot of what we’re showing off today.
So as you see, when it comes up, it throws us warnings that basically it doesn’t know anything about the data sources because it’s never been connected to those data sources.
All we have to do is click the refresh button.
I’ve already cached the credentials to the SQL database that we’re connecting to in the back end here, so it doesn’t even prompt for those.
In this case, it wires up our data set, and you can see here the presentation layer.
This is the equivalent of what you would see in Cognos when building those reports.
We can look behind the scenes a little bit and see that the Cognos namespaces have been transformed into a physical namespace set of tables, a business table, and then lastly our presentation layer table.
And the other namespaces, the business layer and the physical layer are hidden from view so that when your report designers start using this data set in the future, they just see the presentation layer as would be the case in Cognos.
So that’s super easy.
We’re actually ready to publish this already.
So we’re going to publish it.
That’ll prompt us to save changes.
We’re going to select a webinar reports webinar workspace and it’s going to publish it out.
And we should see those show up.
Yep, see, this shows up in our workspace there.
Let’s close out of there and let’s go back to our reports.
So we’ll start with the simple report.
And similarly, this report is a BI project.
When we open it up, it is also not going to know about its data source.
So we’re going to see an error or two.
Something's wrong, and all we have to do is point it at that data set we just published.
So pick Power BI data set.
It already has, at the top of the list, the most recently published data set for me in the webinar workspace.
So we’re going to connect that and this list of data is going to look the same as this list of data over in Cognos.
Similarly, we can publish this report to that same workspace and we are ready to then share with our end users.
Give me a second to refresh here.
There it is.
So there’s our simple record report and so on and so forth.
Lastly, let’s go ahead and pull up that slightly more complex report.
Multiple pages, multiple visuals on the page.
Some of the like pixel perfect layout aspects of this aren’t retained at this point.
It’s kind of a newer effort.
So as we build it out, those pieces and parts will come along.
But for now, you can see how we've just split the page in half and put two visuals on it.
If there were three visuals, we'd split the page in thirds.
But you know, there are some polish and validation steps where you might come in here and realign things.
We've noticed things like the default sort orders might be different between the systems.
So there’s a little bit of cleanup.
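That even-split behavior amounts to simple arithmetic over the page canvas. The canvas size and the left-to-right arrangement in the sketch below are assumptions for illustration; the real migrator's layout rules may differ.

```python
# Sketch of the naive layout the demo describes: divide the page evenly
# among N visuals. Canvas dimensions and side-by-side stacking are
# assumptions, not the migrator's actual algorithm.

def split_page(n_visuals, page_width=1280, page_height=720):
    """Return (x, y, width, height) for each visual, placed side by side."""
    width = page_width // n_visuals
    return [(i * width, 0, width, page_height) for i in range(n_visuals)]

print(split_page(2))  # two visuals -> page split in half
print(split_page(3))  # three visuals -> page split in thirds
```

Pixel-perfect placement would instead carry each visual's original coordinates across, which is the kind of refinement the transcript says is still being built out.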
There's our second page, and we have everything happy. Lastly, it's the same exercise of saving and publishing, which we'll just go ahead and skip for now.
So the last thing I want to touch on here is that during the conversion process we had this warning; in this case we injected a warning about skipping.
There was a calculated column expression it didn't know how to figure out.
And what you have to realize is all these calculated columns in Cognos need to be converted to DAX.
And so we’ve kind of begun that mapping process of how to translate between those two languages.
But as we go and discover new and different data working with our customers, you know, we need to know that there were unsupported expressions in there.
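Conceptually, that mapping is a translation table from Cognos expression functions to DAX equivalents, with anything unmapped flagged rather than silently dropped. The sketch below is a toy version; the function pairs are a tiny illustrative subset, and the real tool's parsing and coverage go much deeper.

```python
import re

# Toy sketch of Cognos-expression-to-DAX function mapping. The pairs below
# are an illustrative subset, not the migrator's actual rule table.
FUNCTION_MAP = {
    "substring": "MID",    # substring(s, start, len) ~ DAX MID(text, start, n)
    "upper": "UPPER",
    "round": "ROUND",
}

def to_dax(expression):
    """Rewrite known function names; flag anything unmapped, mirroring the
    migrator's skip warnings for unsupported expressions."""
    def replace(match):
        name = match.group(0)
        try:
            return FUNCTION_MAP[name.lower()]
        except KeyError:
            raise ValueError(f"unsupported expression: {name}")
    # Match identifiers immediately followed by "(" -- i.e., function calls.
    return re.sub(r"[A-Za-z_]\w*(?=\()", replace, expression)

print(to_dax("upper(substring([Name], 1, 3))"))  # UPPER(MID([Name], 1, 3))
```

Flagging instead of guessing is the safer design: a wrong translation would silently change report numbers, while a logged skip becomes a reviewable ticket.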
So that’s helpful.
But what's most helpful is we have this GitHub integration, and so we're going to just go ahead and publish all of this. And of course I've already used that name.
We'll do number 2, and what it's going to do is make a GitHub repository out of the outputs that our application generated.
And so if you're familiar with GitHub, it's a source control system.
It also has an issue tracking system with it.
And so we’re making an issue per report and per data set and it’s that quick and we can go out and look at it and you can see that it created an issue for the package that we brought over as well as the two reports that we brought over.
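That issue-per-artifact bookkeeping can be sketched like this. The titles, labels and warning text are invented for illustration; the real tool's conventions, and the GitHub API calls it makes, will differ.

```python
# Sketch: one tracking issue per migrated dataset and report, with any
# conversion warnings attached. Title/label conventions are invented here.

def make_issues(packages, reports, warnings_by_report=None):
    """Build a list of issue dicts for everything the migration produced."""
    warnings_by_report = warnings_by_report or {}
    issues = []
    for name in packages:
        issues.append({"title": f"Validate dataset: {name}",
                       "labels": ["dataset"], "body": []})
    for name in reports:
        issue = {"title": f"Validate report: {name}",
                 "labels": ["report"], "body": []}
        for warning in warnings_by_report.get(name, []):
            issue["body"].append(f"SKIPPED: {warning}")  # mirrors the skip warning
            issue["labels"].append("needs-polish")
        issues.append(issue)
    return issues

issues = make_issues(
    ["Record Package"],
    ["Simple Record Report", "Two Page Record Report"],
    {"Two Page Record Report": ["unsupported calculated column expression"]},
)
print(len(issues))  # 3 -> one package plus two reports
```

Carrying the skip warnings into the issue body is what turns conversion gaps into an actionable review queue for the Power BI expert mentioned later.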
And then it bundled the code changes for each of those things into a pull request.
So I can go and look at the pull request and you see the nice set of data.
So these are the files related to the data set and you can see going forward if you’re familiar with GitHub how once this is in here, as I make changes, I’m now going to start seeing change sets.
If you've used GitHub with Power BI .pbix or .pbit files, all you get is, hey, this is raw data and it's changed.
Now you actually get visibility into what changed.
It helps reduce any sort of unintended changes from getting into your BI system.
So it’s a great way to formalize your BI development process with code reviews.
The actions part of GitHub allows you to do DevOps.
So you could have this set up so that as soon as you commit a Power BI change, it gets deployed to a test area in Power BI service.
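As a sketch of that Actions-driven deployment step, a CI job could publish a .pbix to a test workspace through the Power BI REST API’s Imports endpoint (`POST /v1.0/myorg/groups/{groupId}/imports`). The workspace ID and dataset name below are placeholders, and a real pipeline would also handle authentication, typically with a service principal.

```python
def build_import_url(group_id: str, dataset_name: str,
                     overwrite: bool = True) -> str:
    """URL for the Power BI REST API Imports endpoint, used to publish
    a .pbix file into a workspace (group).

    nameConflict=CreateOrOverwrite lets repeated CI deploys replace the
    previous test copy; Abort would fail instead if the name exists.
    """
    conflict = "CreateOrOverwrite" if overwrite else "Abort"
    return (
        "https://api.powerbi.com/v1.0/myorg/"
        f"groups/{group_id}/imports"
        f"?datasetDisplayName={dataset_name}&nameConflict={conflict}"
    )
```

A GitHub Actions workflow would call something like this after each merged pull request, POSTing the .pbix bytes to the returned URL.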
Things like that. Now, if there were any things to polish:
So again, here’s that skip statement.
You know, we note those in the tickets, things we couldn’t handle.
And then you have a Power BI expert come and review this and say, oh, I need to go convert this into DAX.
I know how to do that.
Great, they polish it up, validate it, and off you go.
So I’m going to conclude our demo.
There, Arturo, if you want to grab the screen share back for the deck.
Thank you, Sir.
And let’s just recap a little bit.
So it’s a code-driven solution for data model migration, and you could see how you could batch up reports and do literally hundreds of reports at a time.
It does create the GitHub code repository, the folder structure, the issues.
It’s basically ready to kind of hand over to your department or company and you take it from there as far as owning that GitHub repository for the future.
You can include DevOps and CI/CD pipelines and other fancy stuff in there as you get more and more advanced with this, but it’s a great system for the long term. And then, if you’ll go one more slide.
So our ask at this point is we are looking for beta clients to partner with as we fully build out all of the corners and features of Cognos data models and reports in this tool.
We’ve already been asked a couple times, are you open to, you know, publishing to Cognos, or from Tableau, or other types of systems? And that is something we would look to partner with folks on.
So if you have that need and you like the framework and the GitHub integration and kind of the overall approach here, would love to discuss that with you.
And you can see there the contact information if you’re interested, [email protected].
Also Kay Knowles is on the line as is Carson Bradshaw.
So feel free to reach out to them directly, either here in the chat or if that’s someone you normally talk to as you work with Senturus.
And with that, Steve, I am going to hand it back to you.
Thank you, both Arturo and Bob.
Just a little bit of quick overview here before we head into the Q&A section.
Just want to draw everybody’s attention to additional resources on the Senturus.com website.
We have a ton of information out there.
We’ve been committed to sharing our knowledge for a good couple of decades now, so check us out at Senturus.com/resources.
You’ll find tech tips.
You’ll find product demos.
You’ll find recordings of, well, this webinar; in a week, you’ll find the recording. You’ll also find a recordings index for past webinars, so check us out at Senturus.com/resources.
And there are a few upcoming events I want to draw your attention to: coming up on October 5th, Power BI Administration and Data Governance.
If you’ve attended many of our webinars, you probably know Todd Schumann, who often does our Cognos Admin webinars.
He’ll be delivering the Power BI Admin webinar in a couple of weeks.
Beyond that, we’ve got Microsoft Fabric and you.
For anybody who’s interested in the details of the new Microsoft Fabric offerings, the ever popular Pat Powers will be delivering that on October 12th, and we have a Chat with Pat coming up on October 18th on publishing and using the Power BI service.
So please join us for any or all of those things.
You can register at senturus.com/events. A little bit of overview on Senturus:
We do concentrate on BI modernizations and migrations across the entire BI stack.
More and more we find that folks are either wanting to coexist or to migrate, so we can help you regardless of where you are in that process. We offer many flexible ways to work with us, everything from small and large projects to ad hoc co-development, enablement and training offerings.
We’d love to put our 22 years of experience to work for you, so don’t hesitate to reach out with any and all of your BI and analytics needs.
Jumping into Q&A here, we do have a few questions that came in during the session.
So I’m just going to open it up here and talk through some of these with both of you, Bob and Arturo.
So starting out, one of the questions we got, actually we got a few questions that are kind of overlapping here.
So I’m just going to combine them.
As usual, we have some questions related to Cognos licensing and what the impact of the connector is on licensing.
I don’t know if you want to maybe take that Bob or I can kind of provide the overview if you prefer.
Yeah, in general.
So I’ll take the impact side.
I’ll let you speak to the licensing.
So the impact of using the connector is like using an interactive user in Cognos.
So Power BI and Tableau both tend to send a lot of queries as you’re clicking around dashboards and cross filtering and drilling down and things like that.
So you can expect that sort of load to come through to the Cognos system, but again at the per user level.
And I’ll let you tackle licensing, Steve.
So on the licensing side, a couple of you pointed out that one of the main reasons for leaving Cognos is cost, and there were a couple questions here about what that licensing impact is.
And you’re correct that you do have to maintain Cognos licensing if you’re using the connector because effectively what we’re doing is we’re passing a Cognos report spec through, you know, directly to your Cognos environment from either Power BI or Tableau and that executes in your Cognos environment as a Cognos user.
So you do have to maintain Cognos licensing and that’s also desirable because you want to keep that audit trail per user, you want to keep that security per user in place, right.
So that’s kind of the whole point on that side of the world, why you maintain it even though not everyone ends up saving money there.
And actually kind of a related piece of this which came from another question that came through is, you know, would you use this if you’re moving off of Cognos completely?
Is there actually a use case for the connector?
And the answer is that ultimately no.
Like once you’re off of Cognos, the connector is of no value to you, right, because the connector is designed to pull data from Cognos into other tools.
But what we find as many organizations use the connector as a bridge when they’re beginning a migration, because what it does do is it allows you to very quickly start pulling your already modeled, already curated Cognos data into these other tools.
So it’s a very fast path to using Power BI or Tableau as a front end to your Cognos data while you work on the much bigger, heavier project of moving off of Cognos.
So we do find that customers end up using the connector for a few years and then they drop it when they’ve moved onto a new platform completely.
Having said that, this isn’t a topic for today, but we do have another migration-related product that often gets used as the secondary stage of that, to get you from Cognos to Power BI or Tableau.
I’ll add on to that, that as you start doing that move to Power BI, the connector allows you also to train up those users without those users needing to learn a new data set at the same time.
So you’re looking at a very familiar data set and just puzzling out how to learn the Power BI tooling, versus trying to figure out a new data set and a new tool at the same time.
So from that same accelerator scaling up, enabling workforce type idea, the connector can be very useful there as well.
It’s a great way to reduce the kind of reskilling effort when you’re moving everybody to a new platform.
I mean, otherwise you have to learn how to model and do all of the remodeling, especially because Cognos models already have years and years of investment in them, right. They’re complex models. They have tons of business logic in them, and because recreating that is both time-consuming and potentially error prone, the connector can really speed things along.
There’s another question here, I don’t know the answer to this, but I suspect you tech guys might be able to cover this one.
So Chris asks, could we add a data source to merge with the data that comes out of the connector, let’s say a spreadsheet or another DB table?
So I don’t know.
I mean, I would think that’s a feature in either Power BI or Tableau, to be able to combine multiple data sources.
But I don’t know, Bob, Arturo? Yeah, I think in Tableau the term is data blending, where you’re taking multiple data sources and then you have to create the relationships between those data sources so that they cross filter.
And in Power BI there’s composite models, I believe is the term for it over there, where you’re able to take multiple data sets and combine them and use them in the same report.
We definitely have some highly skilled both Tableau and Power BI folks.
So if that answer is not technically correct, we’ll be sure to update that in the Q&A.
But I’m fairly certain that’s the approach you would take.
All right, so I think that covers the core of the questions that came in.
So I was going to go ahead and wrap up, but I saw one more. I’m going to just ping Arturo on this last one here, about the connector and the connection style.
So is it ODBC? OLE DB, etc.?
Yeah, it came in right at the end there.
Now I see it there.
So we support both connection protocols. We have an ODBC client and we also support a SQL Server Native Client connection, so whichever fits your use case better, both would do.
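For example, connecting over the ODBC route from Python might look like this sketch. The DSN name and credentials are placeholders, and `pyodbc` is one common ODBC client rather than the only option.

```python
def build_connection_string(dsn: str, user: str, password: str) -> str:
    """Assemble an ODBC connection string for a pre-configured DSN.

    With pyodbc (a common Python ODBC client), you would then connect like:
        import pyodbc
        conn = pyodbc.connect(build_connection_string("MyDSN", "user", "pw"))
        rows = conn.cursor().execute("SELECT ...").fetchall()
    The DSN itself would point at the installed connector driver.
    """
    return f"DSN={dsn};UID={user};PWD={password}"
```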
But our primary kind of focus of the tool is connecting them to Power BI and Tableau.
We have had some folks test it as far as doing SSIS-based querying, which is what this is getting into, or RStudio and Python.
Happy to have that conversation with you and just to kind of fully understand where you’re trying to go with it and to see if the connector would be a good fit for that.
Thank you both.
Before we wrap up, I just want to cover a few more housekeeping items.
Just because, you know, Kay has been chatting in the chat window, and I want to make sure everybody saw her note there about our Microsoft tool comparison matrix, which was recently updated.
So if you’re using the Microsoft stack, definitely check out our updated matrix that has a lot of great information about the current state of Microsoft analytics technologies.
So don’t miss out on that.
And just to repeat, if you are interested in a trial of the Analytics Connector, or are interested in talking about being a beta client for the Report InstaMigrator, don’t hesitate to reach out to us.
We’d love to hear from you.
And with that, I’m going to say thank you to Bob and Arturo for presenting today.
Appreciate having you guys here and thank you to everybody for joining us today.
It’s always great to have you join our Senturus webinars and we do hope to see you on a future webinar.
So thanks everybody and have a great day.