KPIs from Multiple Sources, in Minutes

Creating Power BI or Tableau reports from data in multiple source systems can require a Cirque du Soleil of convoluted data manipulation. The ETL work to go from raw data to dimensionally modelled data that provides accessible, insightful analysis can take months. Modeling data directly in Tableau or Power BI also delays analysis and silos the cleansed data within one BI platform. And don’t even get us started on using Excel for combining data!

In this on-demand webinar, learn how you can get deep, timely business analytics from multiple data sources in minutes. Find out how you can eliminate ETL and rigid star schemas and align business metrics across all your visualization tools.

We introduce Incorta and demo its agile, near-real-time operational analytics platform. Using popular CRM, ERP and project management systems as examples, we show how Incorta quickly and easily connects multi-source data for analysis and sharing in a drillable Power BI or Tableau report.

Presenter

Michael Weinhauer
Senior Sales Engineer
Incorta

Michael has been designing, delivering and selling analytics solutions for over 28 years. Before joining Incorta, he was at Senturus and instrumental in the development of the Senturus Analytics Connector, which lets Tableau and Power BI use Cognos as a data source. During his career, he has gained a wealth of hands-on, practical BI and big data experience in solutions architect and sales roles at Oracle, IBM and SAP.

Machine transcript

Welcome to today’s Senturus webinar on KPIs from multiple sources in minutes.

0:13
Thanks for joining us today. A quick bit of housekeeping before we get started. In your GoToWebinar control panel, which is off to the right of your screen right now, you’ll see that there’s a questions panel available.

0:25
You can type questions into that panel during the webinar. We will do a Q&A session at the end of today’s presentation. You can type questions anytime you like into that questions panel.

0:44
One of the first questions we usually get is whether you can obtain a copy of today’s presentation. Of course, you can go to Senturus.com/resources, and you’ll be able to download the deck of today’s presentation.

0:59
And beyond that, the agenda for today, we’ll do some quick introductions, talk about challenges to accessing timely insights, do an overview of Incorta’s unified platform for data analytics. We’ll do a quick demo.

1:14
And then we’ll do a bit of an overview of Senturus and some additional resources, and as I said, we’ll wrap up with a Q&A session.

1:23
For today’s presentation, we are lucky to have Mr. Michael Weinhauer here. As many of you know, Mike is a Senturus alum; he’s now a Senior Sales Engineer at Incorta. Mike has immense depth and breadth of experience across the analytics space. Like many of us here, he’s been doing this for more than a decade or two.

1:45
So it’s great to have Mike here today.

1:47
I’m Steve Reed Pittman, director of Enterprise Architecture and Engineering here at Senturus. And with that, I’m going to turn it over to Mike.

1:59
Excellent. Well, thanks, Steve. Thanks for having me back here. It’s good to be back.

2:05
We like to get a sense for the pulse of our audience, if you will, or the topology of our clients and attendees. So, we’ve got two polls here today. What are your core systems of record, your ERP, or CRM, or whatever?

2:28
Is it Oracle, any flavor of that, EBS, Fusion, etc., etc.? I would even bucket JD Edwards and PeopleSoft in there.

2:37
NetSuite, SAP, Salesforce, or some other system.

2:44
Obviously, it’s a single choice.

2:47
I’ll give you guys a few more seconds to get your answers in. We’ve got about two thirds of you with your votes in.

2:55
Coming off the election a few weeks ago, you should be good at voting.

3:04
All right, 72%. I’m going to close it out and share it back.

3:08
So you should be seeing the poll results there. A little more than half are on Oracle or some permutation of Oracle. NetSuite customers, interesting. 15% SAP, and the rest other.

3:21
By the way, if you want to comment on what the other might be, our polling is sort of limited in terms of the number of options we can provide.

3:33
You can go ahead and put that in the questions window or the chat.

3:37
We have one other poll here. So I’m going to click over to the next slide.

3:47
What are the biggest challenges in your existing analytics environment? And this is a multiple choice.

3:55
So is it time to insights? Meaning, is it backlogs in IT and long ETL pipelines, you know? These are kind of leading questions. I have never met a customer who was like, no, that’s not a problem.

4:08
So I expect a lot of people, pretty much everyone, to check that one.

4:13
Is it difficulty responding to changes?

4:15
So, the middle column: what’s the next business question when the business changes, or you have mergers and acquisitions, maybe.

4:23
Those are always things that create a whole bunch of chaos and how quickly and how comprehensively are you able to respond to those changes as an organization and really leverage data?

4:35
Is it the inability to drill from high-level KPIs to granular details?

4:39
In other words, in a lot of cases you end up with summary-level detail, but can you get down to the details? What does it take for you to get down and do that?

4:46
Well, yeah, why is that red light blinking red?

4:48
Or is it a lack of self-service? You know, self-service is the holy grail of analytics.

4:54
And organizations have achieved a greater or lesser extent of success there, owing to the fact that it’s a combination of technology, people, and processes, and it’s not easy to change all those and move all those levers. Or, again, is it something else?

5:15
I’ll close this and share it back. Again, about three quarters of you all are sharing your insights, thank you for doing that. And only 43% said it’s time to insight, so that’s made a liar out of me.

5:28
About a third have difficulty responding to changes, and again about the same percentage, closing in on half, cite the inability to get from high-level to granular details.

5:37
A big chunk cite their lack of self-service, and then 43% other.

5:40
Another one where I’m curious about what the other is; it could be a lot of different things.

5:46
But anyway, that’s always fascinating to see what people are encountering, and again, sort of what the lay of the land is, if you will.

5:56
So, with that, I want to dive into our topic today, which is getting key insights and KPIs from multiple sources in minutes.

6:09
You know, minutes might be a bit of a stretch, but you can get to it pretty quickly with Incorta. What Incorta specializes in is real-time analytics on raw business data.

6:20
So, I’ll start off by caveating real time, because I know somebody’s always going to say, oh, real time, you know, like streaming data, or vibration and temperature sensors off of equipment and stuff like that.

6:33
We’re not talking about that kind of real time; we mean it more in the operational sense. However, we do have customers that are refreshing that data as frequently as every 5 to 10 minutes. So, that’s what we mean by real time there.

6:48
And the key is raw business data here. And I’ll get into that in just a second here as we get you access to all the data.

6:56
So, the challenges that we see people come to us for are, again, kind of tying back to those polls: you need to make data-driven decisions.

7:08
And you need to make them while the question you’re asking and the decision you have to make are still relevant.

7:13
The landscape keeps changing in business, and having those long, complex data pipelines makes it really hard to answer those questions quickly.

7:25
And then there’s all that dark data, right? You end up sacrificing a lot at the altar of performance.

7:31
And you’re looking at summary-level detail, and it’s difficult to drill down very far, much further than a level or two, when you really need to go six or seven.

7:41
And, you know, the whole burden of analytics falls on IT, and people aren’t looking to expand their IT organizations, especially now.

7:52
With the economy being on shakier footing, they’re trying to do more with less.

7:57
And, you want to create, you know, self-service.

8:00
So, not only to empower business users, and allow them to answer their own questions or answer the next question, but to reduce that backlog, and, again, do more with less with lean IT staff.

8:12
So, if that sounds familiar to you: the quote-unquote modern data architecture that so many of us have built our careers on and are used to is really kind of anathema to those three things that I just mentioned on the prior slide.

8:31
Right. And so this probably looks very familiar to pretty much everybody on the line here.

8:35
You’ve got your Systems of record.

8:37
You use some type of an ETL tool along the way here to land that data, perhaps into a data lake, maybe not, or an operational data store, or something like that. And then more ETL happens and it lands in perhaps a data warehouse, or some facsimile of that.

8:56
Wherein about 75% of that data is being sacrificed, right? It’s being left behind, being aggregated, and whatnot.

9:05
Then it’s further aggregated, and again sort of obfuscated, I’ll say, as it’s landed in things like data marts.

9:13
And then I would argue even further as we start pushing it out to the end users. So every one of these arrows is code.

9:23
It’s time, it’s money, and it costs you agility.

9:28
And then I’ll go further.

9:30
What we’re going to touch upon today is not only how Incorta addresses this, but how we give you flexibility without leading to even more metadata that you end up creating in Tableau, or Power Query, or Power BI, or Qlik. And you can do it all kind of in one place, and you’re all singing off the same sheet of music.

9:54
So this is suboptimal for the aforementioned challenges of being agile and answering questions while they’re relevant, things like that. So, again, the typical flow is: you have a question, you call IT, you get on their rather lengthy list, and they pull the data in.

10:14
Then, every new question, you’ve got to kind of go back to this whole process.

10:18
So, Incorta came about because a bunch of guys from Oracle, who were in their applications area, were like: well, what if, when I had a question, I could go see the data and load it myself and have insights within minutes?

10:31
And so they created Incorta. The real difference here is that we pull data directly in from those sources without any ETL; we make what we call a digital twin of that data. So it’s a really easy, light touch on that source system. We pull it in very rapidly, so we can pull in billions of records across multiple systems, with parallel loading, things like that.

10:59
And then you get direct line of sight back into all of that data.

11:02
You’re not having to trim it down because you can’t pull in that much data, or it’s going to be too big on disk; we just pull it right in. Then we expose it either directly or through business-friendly business views, which are exactly what you might think they are: standard dimensions and measures that facilitate self-service in whatever tool it is, whether it’s Incorta’s own internal analyzer tool or external tools like Tableau and Power BI, which are pretty much ubiquitous.

11:37
I would guarantee in almost any organization above a certain size, you end up having, you definitely have Excel, like way too much of it.

11:46
You probably have Tableau, and you probably also have Power BI.

11:50
Additionally, oftentimes people use Incorta as a data hub, because we free the data from, say, Oracle Cloud, or from the complexities and myriad sort of Byzantine structures and all that good stuff in SAP.

12:10
We pull that in, and that enables us to push it up to something like Azure Synapse Analytics, or push it out to a BlackLine or an Anaplan, or something to use for planning purposes.

12:22
With drill-back to the raw, record-level, sub-ledger detailed data, to really facilitate speeding up those types of processes, right: closing the books, budgeting, planning and forecasting, and whatnot.

12:36
So it’s really a fundamentally different approach to collapsing and eliminating those pipelines and really creating agility in a world where, you know, you want to do more with less and need to answer questions quickly.

12:51
So, the benefits are, we give you unrivaled data access, you run those analytics directly on the application data, That’s 100% identical to the source.

13:00
You have complete control over the governance and access and the lineage That’s verifiable all the way down to the individual record level details.

13:09
Right. So instead of your lineage being, you know, when you’re looking at lineage in a lot of other tools, you have to go: oh yeah, this is in Tableau, that came from this extract.

13:16
Which came from, you know, the SQL Server under Bob’s desk, or some other system, and it’s combined with this. And by the time you trace it, right, it’s like going back to the ancient Egyptians.

13:29
Here, it’s one line. It’s like: oh, this field comes from this field in my ERP system, or this field in Salesforce, and you can see that easy line of sight.

13:39
And we provide this really fast time to insights, where you can deliver the latest data in minutes, quickly join other sources, and provide sub-second queries. And that’s really where Incorta is different.

13:50
So I’ll ask you to suspend your disbelief for a moment and understand how this really shakes up that landscape and that traditional way of thinking. So how do we accomplish that in Incorta?

14:06
Well, we have a technology called Direct Data Mapping.

14:10
And really, the platform has several technologies that really facilitate all of this. First of all, that parallel data loading that I talked about, both full and incremental, by the way. So we pull and make that digital twin.

14:23
Then we do an incremental load every 5, 10, 15 minutes, whatever your business requires. And we’re able to do that really quickly. We don’t have to do a bunch of ETL on it, or any ETL in a lot of cases. So we’re able to do that very quickly without bogging those systems down, and give you those insights.

14:40
And the direct data mapping is really this enriched metadata map and smart query routing.

14:46
So the most expensive part, and I’ll show you this in the demonstration, the most expensive and complicated and difficult part for any database is the join.

14:54
That, and the query plan.

14:56
Because of the optimizer, even the most modern databases, with the latest technology and the most horsepower, will tell you today: don’t go beyond really 3 or 4 joins if you’re working with any volume of data.

15:12
And so what Incorta does with direct data mapping is: every point of data that we ingest is aware of how it relates to every other point of data.

15:20
And we pre-plan the join paths between those data points, so we don’t have to plan the query at runtime.

15:29
And that’s really what’s fundamentally different.

15:31
So we are able to query billions of rows and do hundreds of joins across thousands of tables across multiple sources.
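To make that idea concrete, here is a tiny, purely conceptual sketch in Python. It is not how Incorta is implemented, and every table, column, and value in it is invented; it only illustrates the difference between building a join map once at load time and planning joins at query time.

```python
# Toy illustration of pre-computing a join path between two tables at load
# time so that queries become simple lookups instead of runtime join planning.
# All table names, columns, and data here are invented for the example.

accounts = [
    {"account_id": 1, "account_name": "Staples"},
    {"account_id": 2, "account_name": "Costco"},
]
transactions = [
    {"txn_id": 10, "account_id": 1, "amount": 125.0},
    {"txn_id": 11, "account_id": 1, "amount": 75.0},
    {"txn_id": 12, "account_id": 2, "amount": 300.0},
]

# "Load" step: build the join map once, when the data is ingested.
txns_by_account = {}
for row in transactions:
    txns_by_account.setdefault(row["account_id"], []).append(row)

# "Query" step: revenue per account is now a lookup plus a sum, with no
# join planning happening at query time.
for acct in accounts:
    revenue = sum(t["amount"] for t in txns_by_account.get(acct["account_id"], []))
    print(acct["account_name"], revenue)
```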

15:42
So think about that for a second. That’s really the difference, and it really works. Well, it’s pretty amazing. And we load this.

15:49
We load it into memory, and you’re able to hit it with Incorta or with any of these other tools, which I’m going to show you in just a little bit here. So I’m going to show you how we’re able to do that, and kind of walk you through how the platform does it.

16:05
Then, this is just showing you the business schema. This is a snapshot of what Incorta looks like. I’m going to walk you through this in a little bit, but we give you this ability to create those simplified views, because obviously, you SAP folks out there, or even Oracle,

16:18
you know, you wouldn’t want to present anybody with an ERP-type schema, or, in a lot of cases, those table names.

16:26
And you have other logic that you want to have in here, like calculations, aging buckets, currency conversions, things of that nature.

16:33
So, you can put those into what we call a business view that is much more friendly for your business analysts, and that allows you to maintain data governance controls and control who has access to that data while enabling self-service, all within the same platform. And then, importantly, we provide a set of data apps.

16:54
And there’s a reason we started by asking about Oracle, SAP, Salesforce, NetSuite, Workday: because we have pre-built content across a wide array of business processes, order to cash, procure to pay, AP, GL, all that stuff, built for a lot of the core systems like Oracle EBS, Fusion, JD Edwards, SAP, Salesforce, NetSuite, etc., etc.

17:20
And this is a long and ever-growing list of applications. So the gist of that is that we have the connections, we have the knowledge of the schemas, and we know what data to pull and how to pull it from those systems.

17:34
Then those friendly business views on top of that, and content, dashboards and whatnot, that are pre-built, get you a good chunk of the way there to spinning this up and seeing value out of the system really, really quickly.

17:51
Like me, when I first heard about it, I was like, yeah, right?

17:56
So, what I’m going to do now is show you the application and show you how we actually accomplish some of this stuff. The scenario I want to share with you today is what we call a Customer 360 analysis.

18:11
And I love this scenario because almost any organization has customers, clients, whatever you want to call them, and they always want to have, ideally, a better view of them, a more comprehensive view of them.

18:29
In some businesses, it’s more critical than others.

18:32
But what’s also universal is that that information invariably lives in a whole bunch of different systems.

18:40
You’ve got your ERP system that has what they buy, for example. You have your CRM systems: what might they buy? Sorry, I can’t form sentences here, too much caffeine.

18:52
And then ticketing systems, for example, that are, like, well, what are the problems they’re having?

18:55
What kind of support are they requiring? Or are they just not using the product? We don’t see opportunities, we don’t see support tickets; maybe they’re a customer that’s at risk.

19:07
And then, in order to get that view, sort of say, Oh, am I going to lose this customer?

19:12
Or cross-sell, upsell: what opportunities do I have, and how do they relate to how happy the customer is?

19:22
It takes forever to get that data together around your customer retention and churn, because you’re planning in spreadsheets or running it through ETL; very complicated, very difficult, if you can do it at all.

19:32
And that data, a lot of times, in the data warehouse especially, is not at the level of detail you need, or the currency you need, to really get a full picture of your customer, right? And that’s especially true for smaller-transaction customers. Imagine Starbucks: you’re going to sell a cup of coffee.

19:52
They have like a few minutes to sell you that scone, which makes a big difference in their profit margins, right?

19:58
So, you’ve got to have the right product there, at the right time, with the right offer, all that sort of stuff. So Customer 360 analysis is something that is pretty relatable to everybody.

20:07
So, the demo I want to show you today is one where we actually combine, in Incorta, ERP data, service ticketing data, and CRM data: Oracle ERP, Jira, and Salesforce in this example. But it could be anything.

20:20
And what we do is we’re pulling the data.

20:22
We integrate it.

20:24
We provide that reporting off of that.

20:27
Then also provide the ability then to drill back into those systems with context.

20:33
And you can report off of that either within Incorta or with a lot of the tools, like Power BI and Tableau, that your organizations

20:40
know and love, using, again, that same sheet of music, right?

20:44
That same business view is going to turn up the same numbers across departments and across your organization, because you’re all using the same query engine and the same business logic that you put into it, and that leads to better decisions on more current information.

21:03
All right. So, I’m going to jump over here to my demonstration. Just going to log back into my system here.

21:22
Really doesn’t take very long.

21:25
All right. So, this is the Incorta application. Everything is browser based.

21:29
Everything you do, from system administration to data acquisition, schema management, those friendly business views, and content creation, is done through the browser. That means you can embed it in Salesforce, or vice versa, and you can drill back and forth.

21:45
There’s a bunch of stuff you can do here.

21:49
For purposes of the Customer 360 demonstration, what I’m first going to show you, and I apologize, it’s a bit of a Christmas tree, I didn’t pick the colors on this thing,

21:58
is a Customer 360 retention dashboard.

22:04
And what it’s doing is showing information across three different systems: the ERP system, the CRM system, and the ticketing system.

22:15
Of course, it’s taking its sweet time here because I’m on a webinar; it’s always slower on these things.

22:21
Generally, these dashboards do pop up in seconds.

22:26
But what I want to show you here is, what I’ve got here is revenue.

22:30
From my ERP system I have revenue, I have the quantity sold from my ERP system as well, forecast revenue from Salesforce, and issues from my ticketing system.

22:39
So, taking note of the numbers here, I’ve also got a bunch of other visualizations that show my top accounts by revenue, so that’s ERP-based; revenue by state, you know, geo-based, same information.

22:52
This is all Salesforce data showing my win rates and whatnot, and these are all my high-priority tickets for my top five accounts, or top five in terms of total tickets.

23:03
And then, importantly, what I’ve got here is this individual customer transaction-level information.

23:09
The reason I show this is because this would be really tricky for a lot of systems to pull off.

23:18
To be able to show this information together, and then be able to click on, you know, Staples and have it instantaneously drill down on all of those systems and show me that, hey, I’ve got 55 million in revenue here.

23:33
55.04. I have roughly a thousand items, or, you know, 800-some-odd items, that I’ve sold.

23:39
I don’t have any forecast revenue, because the deals are all lost. I’ve got 76 issues; here’s how they sort of break down for my high-priority issues.

23:47
But then I have, you know, 4,652 actual transaction lines, and it’s filtered that down automatically, with data from all of those systems.

23:58
Now, you might look at this dashboard initially and say, OK, big deal, right? I can do this in Power BI or, Tableau, or whatever.

24:07
But the reality is that, doing this in Incorta, I’m just going to drill into one of these insights here, and this is our Insight Builder, and I will show you what the query looks like.

24:19
And this is actually going from our accounts table to the transactions, down to this Transaction Lines All table, which has billions of records, right? It’s very big.

24:30
And so what I’m going to do is show you how we’re able to do this without a bunch of ETL and get to this dashboard pretty quickly, both in Incorta and outside of Incorta using some common tools.

24:43
So kind of stepping back from this, the first step is acquisition of the data, and we do that, like a lot of tools through connectors, and we have hundreds of connectors.

24:54
We have native connectors that are built into Incorta and supported.

24:58
We also have a partnership with CData, who has hundreds of connectors, so we can connect to virtually anything.

25:05
If you’re getting data out of your system now, whether it’s from flat files, native database connectivity, on-prem or in-the-cloud applications: like I said, Oracle NetSuite has various ways of connecting to it, and we support all of those; Salesforce, you know, with its latest connectivity that has a bunch of new functionality.

25:25
Data lakes, file systems, things like Google Sheets, and then REST APIs and whatever, and then even all the way down to basic ODBC and JDBC.

25:36
And you simply use whatever that particular means of connectivity is.

25:45
Using a JDBC driver, you’re pointing to the host for that system, giving it a username and password, or passing it OAuth credentials, whatever it is.

25:55
That provides you with the connectivity to that system.
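As a minimal sketch of what that kind of connection definition typically captures, here is a pair of example configurations. The hostnames, ports, usernames, and field names are placeholders I made up, not real endpoints or Incorta configuration syntax.

```python
# Sketch of the information a source connection usually needs, whatever tool
# it is defined in. Every value below is a placeholder for illustration.

salesforce_connection = {
    "type": "salesforce",
    "auth": "oauth2",                 # or username/password plus security token
    "client_id": "<consumer key>",
    "client_secret": "<consumer secret>",
}

oracle_erp_connection = {
    "type": "jdbc",
    "url": "jdbc:oracle:thin:@erp-host.example.com:1521/ORCLPDB1",
    "user": "readonly_reporting",
    "password": "<kept in a secrets manager, not in code>",
}

# Once connections like these exist, a schema definition decides which tables
# or objects to extract from each source and on what refresh cadence.
print(salesforce_connection["type"], oracle_erp_connection["type"])
```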

25:59
You could also have local data files.

26:00
So in a lot of cases, you might have flat file data that’s coming over, pulled over from FTP or SFTP, or uploaded files, whatever it is. We can pull that data in as well.

26:10
And then we have data destinations, like that BlackLine or the planning systems or whatever, or Azure Synapse, for when we’re serving as a data hub, things like that.

26:20
So, once you’ve created those connections, then we go and create a schema.

26:25
And this is where we actually go query that system and pull that data all into Incorta.

26:31
So, this is an example of a schema that we pull in using our data apps.

26:37
And this is just like a standard database schema, right?

26:41
But the difference is, we pulled the data in, and we pull it in just like this, right? This is a big hairball, right?

26:50
You would never expose something like this, generally, to any sort of front end tool.

26:57
Because it would just tip over and die, right?

26:59
Anybody with any analytics experience will tell you that: even joining a few tables here in this ERP system without doing a bunch of ETL, denormalizing this stuff, and reducing those joins,

27:10
well, if you issued even a lightweight query, you’d have people screaming at you and your tools would tip over.

27:14
Much less combining that with Salesforce, with Jira Service Desk, or, you know, data science models or anything like that, and attempting to query across those.

27:26
So the difference here is that Incorta actually pulls these in literally; think about it like a select star from table A, and select star, or, you know, columns A, B, and C, from table B. And we’re able to pull in, this is two billion rows.

27:39
And you can see, if I sort this really quickly, my Transaction Lines All has 471 million records of data.

27:46
So the dashboard that I was showing you before, where I’m pulling in the Transaction Lines All, is actually pulling in that information for those top five and filtering it out of 471 million records in real time.

27:58
And then it’s doing the other queries against Salesforce and Jira, and against a data science thing, a customer churn model, that I’m not showing in this particular demo.

28:08
But we’re able to pull in those 471 million records very quickly, about six gigabytes an hour.

28:13
So we’re able to load this thing in about 39 minutes: all these tables, all those two billion records.

28:20
And it’s only 46 gigabytes on disk.

28:23
Because we’re not exploding that out and denormalizing it, we’re able to do this with relatively commodity hardware in Incorta.

28:31
We’re cloud first.

28:32
We do have an on-prem offering, but this is a pretty basic Linux cluster.

28:37
I want to say it’s about 8 to 12 cores and, like, 32 gigabytes of RAM, so nothing crazy.

28:44
This isn’t a Teradata or SAP HANA kind of giant box. It’s a pretty straightforward commodity Linux box that can pull this off, because we’re pulling it straight over.

28:55
We’re not doing a bunch of ETL and exploding it; we’re landing it in highly compressed Parquet files as it sits in that system, and then we can do, again, incremental loads on that data.

29:07
So the Transaction Lines All table, we would want to update frequently, because, think if you’re Starbucks, right? Every 5, 10 minutes you’ve got a whole bunch more data you want to pull in; this one changes a lot. The GL code combinations, probably not so much, right? So you would maybe load that daily, or not touch it very often.

29:25
So you have complete flexibility around this.

29:27
But the point is, we pull all this data in. Then, what I’ve done is the same thing for Salesforce.

29:36
So I pulled this in. Here’s all the typical Salesforce objects that you’d see. Right, your opportunities, your leads, your accounts.

29:42
Again, going into the diagram, we’re pulling that straight in, as it is, including things like the history, so you can do all that sort of point-in-time reporting and whatnot.

29:51
And again, we’re joining that up to the accounts receivable, which is our ERP system, and we’re joining that out to our Jira Service Desk, so we know our customer support issues.

30:03
Then the last one is the Jira Service Desk: connecting to that system and pulling in all the information that we want from it, right?

30:15
We have our roles, our issues, and everything around that. And what we’re doing, from a business perspective, is linking all of these to our customer accounts table, which is serving as our customer master.

30:38
Now, you might use an MDM solution or, in some cases, customers are using Incorta.

30:43
to actually run fuzzy logic against that to do some of that data cleansing and data governance. But however you’re doing it, generally you’re going to have sort of a gold standard for your customer account. So in this case, for the customer accounts, the ERP system serves as the customer master.

31:00
And Salesforce has a custom field that has the customer ID, and all we’re doing is joining that account customer ID to my customer accounts master table on the customer account ID in the ERP system.

31:17
And we’re able to do that, again, because we are pulling the data in at the lowest level of detail.

31:24
So it’s a lot simpler to do that.
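A minimal sketch of that kind of cross-system linkage, using pandas purely for illustration: the column names (Customer_ID__c as the Salesforce custom field, CUST_ACCOUNT_ID as the ERP key) are assumptions, not the actual field names in the demo.

```python
import pandas as pd  # pip install pandas

# Hypothetical record-level extracts; column names are invented for illustration.
erp_customer_accounts = pd.DataFrame({
    "CUST_ACCOUNT_ID": [1001, 1002],
    "ACCOUNT_NAME": ["Staples", "Costco"],
})
salesforce_accounts = pd.DataFrame({
    "Id": ["001A", "001B"],
    "Name": ["Staples Inc.", "Costco Wholesale"],
    "Customer_ID__c": [1001, 1002],  # custom field carrying the ERP customer ID
})

# Because both sides are kept at the record level, linking CRM to ERP is a
# plain key join rather than a multi-stage ETL reconciliation.
customer_360 = salesforce_accounts.merge(
    erp_customer_accounts,
    left_on="Customer_ID__c",
    right_on="CUST_ACCOUNT_ID",
    how="left",
)
print(customer_360)
```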

31:27
Then, on the Jira Service Desk, I’m doing the same thing. I have an account key, and I have some account key logic that I’m doing in here. I can actually go in here, too.

31:36
I’m going to go into that table and show you the joins here.

31:43
Alright, so here’s that join.

31:46
There’s actually some logic in here.

31:51
So you’re not limited to just doing those simple joins; you can actually do more complex logic on them.

32:00
So I’ll go in here and show you this real quick.

32:02
So, where I’m joining customer ID, a string, to customer account ID, I’m doing a string function.

32:12
And in this one, I’m actually doing a formula, where I’m substringing that to bucket those into distributor versus retail, so that I’m able to then join that out to that other system, right?

32:23
So, that’s just a simple calculation in Incorta that I can create, either at this schema level, and then it gets built into this table, or, you know, I can do it at any level here.
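For the flavor of it, here is a toy version of a formula-derived join key in plain Python. The account-key format and the distributor-versus-retail bucketing are invented for this sketch and are not the actual expressions used in the demo.

```python
# Toy version of a formula-based join key: derive the key with a string
# function instead of joining on a raw column, and bucket accounts by channel.
# The "DIST-"/"RTL-" prefix convention is an assumption for this example.

def derived_join_key(account_key: str) -> str:
    """Strip a channel prefix such as 'DIST-' or 'RTL-' so the remainder
    matches the customer account ID used by the other system."""
    _prefix, _sep, account_id = account_key.partition("-")
    return account_id

def channel_bucket(account_key: str) -> str:
    """Bucket accounts into distributor versus retail based on the prefix."""
    return "Distributor" if account_key.startswith("DIST-") else "Retail"

print(derived_join_key("DIST-1001"), channel_bucket("DIST-1001"))  # 1001 Distributor
print(derived_join_key("RTL-1002"), channel_bucket("RTL-1002"))    # 1002 Retail
```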

32:37
So, again, now I’ve got the three systems connected, I’ve pulled in all the data from those systems, and I’m maybe updating that every few minutes.

32:46
And I’ve created relationships across those, so now I’m able to do reporting and query across all of those. But, you know, giving people that sort of view of the information would be kind of a nightmare, right? It would send your self-service users screaming.

33:02
So what we have again, are these business schemas, which are these friendly business views that allow you to present to your end users a much friendlier view of that data.

33:13
Now, what I’ve done here is create a Customer 360 view that combines a bunch of the accounts receivable data.

33:22
You give it a friendly name; it pulls in a bunch of the Salesforce opportunity information, again that Jira Service Desk information, and a bunch more of the Salesforce opportunity information.

33:36
And as well, there is logic in here.

33:39
Calculations, same calculation editor, same sort of functionality where I’m creating opportunity status buckets based on the fields in Salesforce, right?

33:48
So, now, this is a subset of that data that incorporates all three of those.

33:55
And now I’m able to go ahead and say, oh, you know what, I want to look at this by month and by country, and I can just go ahead and pull in information here and say, you know, this is my revenue.

34:12
I can drop it into a table, and you can see how I’m pulling data from that two-billion-record system, and I just pulled this in seconds.

34:20
And so, it’s very, very quick, very, very easy to do that, because of that direct data mapping, and the way we store it, and the way we load the data into memory.

34:29
And from there, what I do is I can create those insights.

34:33
And I combine those into dashboards that have, you know, bar charts and maps and things of that nature.

34:44
So that’s kind of how we get from point A to point B. So you can see how, without having to do all of that ETL

34:52
and while pulling in all of the data, not only can I do that very rapidly,

34:59
I pulled in two billion records in about 39 minutes.

35:02
Obviously, I didn’t do that here.

35:04
There’s another webinar that we did with Senturus, I want to say about two months ago, where I do an overview and actually load about 25 million records.

35:12
It loads in about a minute, so you don’t have to take my word for it.

35:16
But we load that data and create those joins, not too complicated, right?

35:22
We create a business schema where you pull in the fields you want, maybe throw in a few calculations.

35:27
And boom, we’re off to the races creating content that is doing queries across these three systems and giving me insights, from high-level KPIs all the way down to that record-level transactional detail, in sub-second time.

35:45
So, the other part of this is, you know, the desire by organizations who have other BI tools.

35:56
And they want a front end that, instead of Incorta, is Power BI or Tableau.

36:04
So, here’s a dashboard in Power BI that shows pretty much the exact same information. I’m going to toggle back and forth between these a little bit.

36:13
Hopefully I won’t give you all a headache, but what I want to quickly show you is, I’m going to jump back to this guy. I just want to bring to your attention that business schema that I showed you, right? So, I have Customer 360.

36:28
There’s that business friendly view that has, you know, 76 columns in it.

36:33
And if I want to, you know, attach to that again, I’ve got these 76 columns here that have descriptions, and I can go view all the stats and, you know, all that kind of good stuff, right?

36:43
So, kind of take a mental picture of that.

36:45
And now I’m going to toggle over here, and if I show you the fields, here’s my Customer 360, Customer 360 with my 76 fields in it.

36:55
So, I’ve exposed this business view; the metadata that I have in Power BI is one view, right? So think about that for a second. Instead of having to pull in all these tables and do all the joins again in Power BI, and put a bunch of logic in Power BI,

37:12
I put all that stuff into a business view.

37:14
And what that yields for me is a dashboard where the numbers are identical to the numbers in the Incorta dashboard. So I’ll bring your attention to the revenue.

37:26
Across the board here: 222 million, 67,000, forecast amount 1.05, 541 issues, then 46,960 individual transactions in this dataset, and my top accounts by revenue, Staples, Costco, etc.

37:43
Geographical information here, you know, revenue by state.

37:48
Washington at $31 million.

37:50
My opportunities.

37:52
You see the lost, the won, the win percentage, here are the ones that I have open. Here are my tickets by customer, so I’m actually, you know, pulling information across those three systems.

38:01
So again, I’ll toggle back over to my original dashboard here. Sorry, I don’t know

38:10
a much better way to do that.

38:13
You will notice that it’s loading very slowly.

38:19
You’ll notice that the numbers are identical to

38:24
the numbers that are in that Power BI dashboard.

38:26
Right, and so, your 222 million, 67,000, 1.05, and 541: the top five customers are the same, the opportunities are the same, and I have, at the bottom, 46,960 records, right?

38:42
So, the only difference really between these two is that I have the 46,960 in this dashboard, and if I start to go and click on one, it filters down.

38:54
What I’ve done here is actually a drill-through.

38:57
So I click on this or I click on Costco, right? And you can see where it does the same thing.

39:03
It’s going to go and update all the metrics, and those are going to tie out to my Incorta dashboard.

39:11
So, if I drill into Office Depot here, you see 28.84 or 28.8 for Florida, all that good stuff.

39:21
Then what I do here instead is drill through, and that’s because Power BI is not going to let me bring back more than one million records at a time.

39:30
So if I want to look at that detailed data, I’m going to drill through, and I get a filtered list of all of the transactions, right?

39:40
And that number ties out, by the way, to the revenue number, and I can look at all the individual tickets, and I can even then create links back.

39:48
Or I can even drill back into Incorta, passing through maybe the Office Depot ID, whatever. I didn’t set that up in here.

39:56
But the point is, I’m able to really interact with that data, and I’m doing the same thing here in Power BI, and it tracks and ties and does the same thing, all with a single business view off of those three systems.

40:13
And if I want to add another column to that, great, go for it: push it out, and I’m able to use it.

40:18
The last thing I’m going to show you, and I think I’ve covered everything off there, is that you can do the same thing in Tableau.

40:27
So, again, going back to any one of my individual visualizations.

40:33
Here’s my business view, right?

40:35
It’s my Customer 360; it’s not really highlighted for me here.

40:40
But you can see, these are my 76 fields that are the same ones.

40:44
And they’re used to populate these various metrics: the 222 million, 67,105, 541 issues, 31 million in Washington, you get the idea, right? So I’m not going to run you through all of that here, but I can create those same visualizations.

41:02
So the organizations, you know, the departments, lines of business, whatever, partners, vendors, they can use the tool of choice that they want to use.

41:11
Because we’re just a Postgres database to that system.

41:17
So, if I wanted to actually go and create a new one, my credentials are saved in here. We’ll see how quickly this goes.

41:24
I’m going a little off-road here because we have a little bit of time, but to show that, to do all this and create new content, I just open up a new Power BI instance here, if the site is cooperating with me.

41:45
And I’m going to go get data.

41:47
And I have the credentials and everything already stored in here, and that’s how Power BI works.

41:52
I’m not going to do that because I don’t have this handy.

42:00
All I need to do, if I want to create content, is say New, and I just have to flip a switch in Incorta, right? By default you can’t, but you just flip a switch that says allow connection to other visualization tools.

42:14
Then what Incorta does is say: oh, hey, you want to use Power BI?

42:17
Here’s your connection string information.

42:21
And, yeah, this is a demo instance, so it doesn’t have the actual connect string up there.

42:26
But it would say, you know, like, here’s my demo.incorta.cloud, whatever, port 5436, and you put that in the Postgres data

42:37
fields, right, in the UI.

42:39
You say you want to do DirectQuery, and it’ll say, hey, here are all the business schemas that are available to you.

42:44
It’ll show these to the extent that you have permissions on them, right?
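To make the shape of that connection concrete, here is a small sketch of hitting a PostgreSQL-compatible SQL endpoint from code, the same way Power BI or Tableau would through their PostgreSQL connector. The host, port, database, credentials, and view name are placeholders based on the walkthrough, not a real Incorta instance or its exact naming.

```python
import psycopg2  # pip install psycopg2-binary

# Placeholder connection details; substitute whatever the platform's
# connection-string screen actually shows for your environment.
conn = psycopg2.connect(
    host="demo.incorta.cloud",   # placeholder endpoint from the walkthrough
    port=5436,                   # placeholder port
    dbname="tenant",             # placeholder database name
    user="analyst@example.com",
    password="<password or token>",
)

with conn, conn.cursor() as cur:
    # Query the exposed business view as if it were an ordinary table.
    cur.execute('SELECT * FROM "Customer_360"."Customer360" LIMIT 10')
    for row in cur.fetchall():
        print(row)
```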

42:48
So if you’ve said, well, you know, only my marketing people can see Customer 360, then that syncs up with your Active Directory or LDAP, or through SAML, or whatever that is.

43:00
Then they’re able to easily see that in Power BI or Tableau or whatever that tool is.

43:11
So I’m not actually going to put that in here; I don’t have the connection string handy.

43:14
But you get the idea: you just plug that in, and then you’re off to the races.

43:18
So, that was all I had to show you from a demonstration perspective.

43:23
So, to summarize, we get that you’ve maybe seen demonstrations of other BI tools.

43:33
But there are some really key differences that I tried to highlight and, you know, hammer home. How do you do these various tasks? Drill-down:

43:44
We give you that transaction level detail, right?

43:46
You saw me pull in all the data, roll it up, and drill all the way down to the record-level detail without switching screens or dumping it out to Excel or a different tool, all in one place.

43:59
Everyone else is pretty much limited to subsets. And, you know, I was just talking about rendering reports sub-second; they’re a little slower, usually, over these web meetings.

44:08
It tends to be very, very quick, and that’s across billions of records.

44:13
Thousands of tables, several systems, right? Elsewhere it takes minutes, hours, days, or simply isn’t even possible.

44:22
Same with somebody’s complex, holistic reports. Or how about adding new business data? I didn’t do that here.

44:27
But, you know, to add a new source, you can imagine: I create a connector, pull the data in, two billion records in 40 minutes; smaller systems would be super quick.

44:37
Otherwise, it could take weeks to rebuild the connections, remodel

44:40
it, and figure out, OK, how does it ripple downstream to everything else?

44:45
Then, how do I know my results are accurate? Here, because you don’t have to reshape the data, I know it’s accurate. I can go ahead and see, if a number is off, what’s comprising that number.

44:55
And fix it, if it’s actually wrong in the source system, right?

44:59
Somebody put it in the wrong place, and you fix it there. Versus having to go back and comb through sort of inscrutable and complex ETL at whatever stage in that process, which, by definition, usually means errors, excluded data, summarized data; you don’t know where that is.

45:19
So, again, look at the sort of timelines to actually get things done here.

45:23
With traditional BI and an EDW, and I think this is sort of generous too, you’re looking at 6 to 18 months. With Incorta,

45:32
You’re talking days.

45:34
Sure, you’ll spend some more time doing QA and all that sort of stuff, but it’s really quick, because you’re not spending all your time doing all this other stuff.

45:42
So, the proof is in the pudding, right?

45:45
So we’ve got some great logos here.

45:47
Starbucks went from 12 months down to 60 days to start analyzing a lot of their merchandising stuff and providing, you know, offers on perishable goods that might expire.

46:01
And they, you know, implemented this project. Equinix went from not possible to 44 days. Shutterfly, three months down to 30 days. Broadcom, not possible to 11 days.

46:10
Broadcom actually acquired Symantec, the antivirus company, and they integrated Symantec’s systems with Broadcom’s through Incorta in those 11 days.

46:22
So, they were able to actually get a view across ERP systems, across a newly acquired company.

46:27
Think about that: that’s something that takes companies years, if they even bother doing it.

46:33
So, it’s really cool in terms of how you’re able to do that and get visibility.

46:37
So, if this sounded interesting to you, here are a couple of other resources.

46:46
And, by the way, you can pull this deck down, as Steve mentioned. You can go and try it yourself; there’s a link here.

46:55
You can spin up a cloud instance, pull in your own data or our data apps, and you can go join the Incorta community if you have questions, want to contribute, or just want to learn about Incorta.

47:07
And on the learning side, you can learn about using Incorta through our free courses there, literally free e-learning courses, and there are free instructor-led courses as well, for business users and others.

47:19
The business user one takes a few hours and is self-paced.

47:22
You can get certified in Incorta; you take this five-day course, and it’s a few hours a day.

47:28
That’s also self-paced, or there’s an instructor-led one.

47:31
And there’s a full admin course where you install Incorta and learn the whole shooting match. That’s instructor-led, all free.

47:37
And if you want to learn more about Incorta data apps, maybe sign up for our Data App Week coming up, which, again, covers our blueprints for major systems like Oracle, NetSuite, etcetera, etcetera.

47:49
Some of the customers that I highlighted will actually be speaking at that, showing the real business benefits that they get; they’re companies like Facebook, Comcast, and others.

48:03
So, definitely go there and check that out.

48:08
So, hopefully, that was handy. And, with that, I’m going to hand it back over to Steve for a little talk about how Senturus can help you and other things.

48:20
All right, thank you, Mike. Also, just a quick reminder for everybody that we are going to be doing Q&A here shortly. So if you do have questions, go ahead and type them into the question panel while I’m doing my thing here. We’ve got a few questions already, but do add any additional questions you may have. And, as Mike said, remember, you can download the deck from our website, and you’ll also find, as a little bonus, some additional technical details in that deck that we weren’t able to cover in today’s webinar.

48:49
It digs a little deeper into Incorta in that deck.

48:54
A few quick things about Senturus. Number one, ways we can help you. Of course, we can help you when it comes to Incorta: we can help you with a phased implementation strategy, and we can help you migrate data sources over.

49:07
We can help you with integrating Incorta into your existing analytics environment, regardless of the tools you use today.

49:13
Which, of course, for us often means connecting to Power BI, Tableau or Cognos, and we can help you out with dashboard and report development.

49:27
And, as I mentioned earlier, you can find additional resources on our website, not only the deck from today, but an enormous amount of knowledge sharing. We’ve been sharing our expertise for a couple of decades now, and you can find a whole bunch of it out there on our website.

49:42
So definitely check out the Senturus website for demos, tech tips, and a bunch of other valuable information. A couple of upcoming events: one is the Incorta Data App Week, which Mike just mentioned. That’s coming up December 6th through 9th. You can register at the Incorta website.

50:02
And as Mike said, during Data App Week they’ll be focusing on some customer spotlights of how some of the biggest organizations out there are using Incorta to improve their time to analytics. We’ve also got another terrific webinar coming up in December, on using query folding to improve Power BI performance. That will be Thursday, December 15th, hosted by our own Pat Powers, and I’ll probably be here in the background on that one as well. I hope to see you there.

50:35
Senturus in general, just a quick bit about us: we concentrate on BI modernizations and migrations across the entire stack. We’ve got a long, strong history of success, and many of you have worked with us for many years.

50:51
We’ve been in the industry for over 20 years, with over 1,300 clients and 3,000-plus projects. We’ve been here for a long time, we’ve got a dedicated team, and we’re large enough to meet all your business analytics needs while being small enough to provide personal attention.

51:12
So do reach out to us with any needs you have in your analytics environment.

51:18
We’re also hiring. So if you’re interested in joining the Senturus team, we are looking for talented and experienced professionals. We’re currently looking to hire a consultant and a management consultant. So, if you’re interested in joining the team here at Senturus, reach out to us either by email or on our website to get more details or to send us your resume.

51:42
With that, we’re going to go ahead and jump into the Q&A session, so I’m going to turn it over to you, Mike. I might type in here and there. It looks like we’ve got a few questions.

51:53
Yeah, I’ve actually been reading through those. Thanks, Jen, thanks for being on. It’s good to see your name.

52:04
So your question was: why would you give access to raw ERP data to users? Isn’t it more trouble than it’s worth? Shouldn’t you give access to at least refined or cleaned data?

52:15
So, you know, you’re right in the sense that I think it depends on the users and what their use case is.

52:25
So remember, what we’re talking about here in a lot of these things are operational use cases.

52:32
So imagine you’re in the finance department, and you’re trying to close the books, or you’re doing budgeting and forecasting and that.

52:40
You’re basing those numbers off of the data in your ERP system, and you’re like, well, OK, why does this number not tie out?

52:48
Or, you know, why is this number like this?

52:51
Then you end up going and pulling up all that data and looking at the record-level data, and it’s like, oh, OK, well, we need to know what that stuff is. So, yeah.

53:02
There are use cases where you’re going to want to just show the aggregated or refined data, and you can certainly do a lot of that in Incorta, right?

53:11
And in a lot of cases it’s maybe just filtering out nulls or doing some fuzzy logic and creating buckets and things like that.

53:18
So it’s entirely up to you, what you expose, how you expose it.

53:23
If you want to do logic on it, you can certainly do refining and things like that within Incorta.

53:28
But there are a lot of cases, you know, if you’re doing, say, a bill of materials or an MRP run or something like that, and you’ve got some anomalies.

53:37
You want to go look at what fed that algorithm and figure out what was wrong in that assumption, or what went haywire. So there is a need to get down into that transactional-level data. We’re not suggesting that you throw all the raw data out to every end user; it’s definitely going to depend on the use case.

53:57
But, you know, when you pull it in, there are operational aspects where you’re going to want to really look at all that data.

54:03
And then, in a lot of cases, even if it’s just for analytics purposes, you end up having to troubleshoot a lot of that data, making sure: is it refined? Is it cleaned? Is it accurate?

54:14
Does it, does it tie out?

54:15
And chasing that down through a bunch of ETL is really tough, if you can do it at all, right?

54:22
I did operational reporting with sales ops and marketing ops for about a year and a half with Senturus, at a client, and digging through all the Python code and SSIS code, back through Hadoop, back to Salesforce, was a nearly impossible task.

54:42
So, you know, again, that lineage is sort of a nightmare.

54:47
And then, let’s see. So hopefully that answers your question.

54:52
Hey, Mike, I have kind of a tangentially related question here. Well, it’s not exactly tangential, but close enough. Pardon the nerdiness there, folks.

55:16
So my question is in terms of limiting detail access. I can envision a situation where you might have a set of users who you want to only see aggregate data, but then there’s some subset of users who you want to be able to drill through to the detail. How, in Incorta, would you handle that? By doing, like, separate business views for those types of users?

55:37
Or can you control drill-down permissions within a single business view? Just broadly, how would that be handled?

55:48
Yeah, that’s a great point. You could architect it a bunch of different ways, but we have security at every level. You can do row-level security in Incorta using, you know, session variables, i.e., who’s logged in right now.

56:03
And it maps to a security table and determines what they can see, or what level of detail they can see, whether they can see the base schemas,

56:11
that raw data that I showed you, whether they can pull that data in and see it, whether they have access to the business views, or which business views they have access to.

56:20
So you might have a greater level of summary in some of those. And then there’s column-level security, so, again, the same sort of thing:

56:31
whether they can see personal PII-type stuff, HIPAA information, things like that.

56:38
And you can also use those roles and that security to do obfuscation of data.

56:44
So if you’re showing, like, Social Security numbers, you star out the first five, right, and just show the last four, things like that.
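A minimal sketch of that kind of role-based masking, in plain Python. The role names and the masking rule are examples for illustration, not a description of any specific product’s security configuration.

```python
# Toy illustration of role-based masking: the same field renders differently
# depending on the viewer's role. Role names here are invented examples.

def mask_ssn(ssn: str, role: str) -> str:
    """Show the full value only to privileged roles; otherwise star out
    everything except the last four digits."""
    if role in {"hr_admin", "auditor"}:
        return ssn
    digits = ssn.replace("-", "")
    return "***-**-" + digits[-4:]

print(mask_ssn("123-45-6789", "analyst"))   # ***-**-6789
print(mask_ssn("123-45-6789", "hr_admin"))  # 123-45-6789
```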

56:54
So it’s all role-based security, and we integrate, again, with things like Okta and all that stuff. What’s nice, again, is that because of that digital twin, a lot of the security that you might have mapped against that operational system, that transactional system,

57:13
if you’ve done that, you can pull it over and leverage it right in Incorta, without having to create a new set of structures that map to another data source, because the shape is the same.

57:28
Does that help?

57:29
That does, thank you. That was awesome. Thanks, Mike. I’ve got a couple more questions of my own here, but I can hold off, because we’ve got some questions for you over there, horizontal or vertical.

57:47
So, another question here is: what types of connections can Power BI use to connect directly to

57:54
Incorta data? The current setup uses Synapse, which is then accessed by Power BI.

58:00
Yeah, so that’s an interesting use case, because we do have Incorta acting as a data hub, where we, again, free up the data from maybe Oracle Cloud, which makes it really hard to get at the data, and you’re stuck with Oracle’s tools. And maybe you’re a Microsoft shop, and you end up using Incorta but then using things like Synapse.

58:21
So, we actually use our business views and what we call materialized views to shape that data into more of a star schema and push that out

58:32
to Synapse, which then gets hit by Power BI.

58:35
So, that’s kind of a more Azure Stack type of approach.

58:39
The connection I used here is more direct: it’s a Postgres connection, and we call it our SQL interface.

58:45
And it really just makes Incorta business views and schemas,

58:49
again permission-controlled, look like a Postgres database to those tools.

58:56
And so that’s really the type of connection we support there, for that and really any other tool. So it doesn’t matter what you’re using.

59:04
Then there’s another question: do business views work similarly to Cognos Framework Manager models, where joins are defined between query subjects?

59:12
To use Cognos terminology, query subjects, for those of you who may not be Cognos conversant, are basically queries, right?

59:20
So usually it’s a table, or it might be a view or a combination of tables, that they call a query subject.

59:26
So, where the joins between those query subjects are implicit and, therefore, do not need to be specified by the user while building reports: the answer is 100%, that’s exactly what it’s done for.

59:36
So the joins are handled at the schema level.

59:41
And the business view allows you to combine information and view it in a very friendly way, where the joins are handled back at the schema level, not only within a system but across systems.

59:55
And that’s the power: because we have the direct data mapping, we can do those joins, dozens, hundreds of joins, across hundreds or thousands of tables, across multiple systems, with response

1:00:08
times in seconds, milliseconds.

1:00:22
We’re at the top of the hour, so I don’t want to keep everybody here; I think this is a great stopping point. Thank you, as always, for joining us here, and it’s good to hear your voice again from Incorta. Yeah, great to hear your voice out there too.

1:00:40
Yeah, For everyone who’s attending, thank you, for joining us today.

1:00:45
You can always reach out to us, of course, on the web, by email, or by phone if you’d like to talk to people. Also, you may have noticed in the chat window, Scott Felton posted a link to his calendar. If you’d like to get some time on Scott’s calendar to talk more about Incorta, feel free to do that. He’s always happy to hear from you.

1:01:06
And with that, thank you, everybody, for attending today’s webinar. We hope to see you again at a future Senturus webinar.

Connect with Senturus

Sign up to be notified about our upcoming events
