Tableau Prep: A Visualization’s BFF
Overview of Self-Service Data Prep Tool for Tableau
Tableau Prep lets you clean, structure and augment data before bringing it into Tableau. It makes transformation steps simple, repeatable and self-service.
If your data isn’t structured correctly beforehand, building the visualizations you need can be difficult, and sometimes impossible. Transforming, or prepping, data before use in Tableau makes a world of difference. However, the process is often a complex and arduous one. Typically, multiple people get involved (e.g. IT and/or DBAs), creating unnecessary bottlenecks that inhibit the flow of analytics. Fortunately, Tableau Prep is here to save the day!
But it’s a very different product than Tableau Desktop. So how do you get started and get comfortable with this must-have tool? In this webinar recording, we provide an overview and quick start to Tableau Prep.
You will learn
- Why you need Tableau Prep
- What are the licensing requirements
- What are the current capabilities
- What new features are slated to be coming soon
Tableau Prep helps you create clean, Tableau friendly data without having to rely on IT. Watch this video to learn about this important tool.
Trainer and Consultant
Patrick has 20 years of experience in data science, business intelligence and data analytics. He is a Tableau Certified Associate and a Cognos expert whose product experience goes back to version 6. He is also versed in Actuate, Hyperion and Business Objects. His certifications include multiple programming languages, including Java and C++, and database certification (MS SQL).
Greetings everyone, and welcome to this latest installment of the Senturus Knowledge Series. Today we’re pleased to be presenting to you on the topic of Tableau Prep: a visualization’s best friend forever. I will give you an overview of the self-service data prep tool for Tableau. Before we get started, a few housekeeping items.
You’ll notice the GoToWebinar control panel; you can minimize or restore it using the bright orange button there. And while we have the microphones muted out of courtesy to our speakers, we encourage you to enter any questions through the question section in the control panel.
While we do try to answer all of the questions live while the webinar is in progress, if we’re somehow unable to do that, we’ll cover it in either the Q&A section or via a written response document that we post on senturus.com. And of course the next question we always get is: can I get a copy of the presentation? The answer is absolutely; it will be posted along with the recording and the questions log
on the Senturus website at senturus.com: select the Resources tab, then Resources Library, or you can click the link that I just put in the chat window of the GoToWebinar control panel. Be sure to bookmark that site, as it has tons of valuable content on a wide variety of business analytics topics.
Our agenda today: we’ll go through a brief introduction of our presenter, then we’ll cover a Tableau Prep overview. We’ll do a quick Senturus overview for those of you who may not be familiar with us, and give you some great additional free resources. Please make sure you stick around to the end for the aforementioned Q&A. Specifically, with regard to the Tableau Prep overview, we’re going to cover the components of Tableau Prep.
We will discuss why you need Tableau Prep and why you might use it, and we’ll demo its capabilities. Lastly, we’ll talk about some of the cool features that are coming down the road, as well as licensing requirements for each component.
I’m pleased to be joined today by Mr. Patrick Powers, consultant and trainer at Senturus. He has over 20 years of experience in data science, business intelligence and data analytics. He is also a Tableau Certified Associate and a Cognos expert whose product experience goes back to version 6. He is also versed in Actuate, Hyperion and Business Objects. His certifications include multiple programming languages, including Java and C++, and database certification (MS SQL).
My name is Mike Weinhauer and I wear many hats around here. One of them is that I have the pleasure of hosting these webinars, and I’m happy to be here with you today. We always like to get a finger on the pulse of our audience, so we have some polls; I’ve got two of them for you today. The first one is: how do you currently prepare your data? If you’d like to interact with us, let us know. Do you use Tableau Prep currently?
Tableau Desktop, using the data source tab? A third-party ETL tool like Alteryx or Talend, for example? Or is it done at the back end, using Excel or database views, or something different? So go ahead and get those votes in; we’ll give you a few seconds to do that.
Got about half of you in here. So go ahead and make your selections.
All right, I’m going to close this one out and share the results back with you. It looks like half of you do it at the back end, and about a quarter use Tableau Desktop. Only 5% are using Tableau Prep; I’m a little surprised at that number. And about 20% are using a third-party ETL tool. That’s an interesting result; I don’t know that I would have predicted it. Then our second poll: what are your data challenges?
You can select all that apply here; this isn’t a single-choice one. Is the data in the wrong format? Does your data come from multiple sources that you have to combine? Do you find your data is at different levels of grain? Is your data not analysis friendly? Or does it take too long for you to make visualizations? Go ahead and make your votes there.
Choose any and all that apply to your situation.
I got about half of you with your responses in.
The audience is a little sleepy today; usually people are speedier with these.
All right, we’re up at about two-thirds; get those last-second votes in.
All right, I’ll close that out and share it. The biggest issue is data coming from multiple sources, not too surprising given the myriad data sources we find in our clients’ environments: close to 60%. Next is data that’s not analysis friendly, then not quite half sitting on data in the wrong format and at different levels of grain, and then about 30% say it takes too long to make vizzes. Thank you for sharing your insights; we always appreciate that.
I’m going to hide that and we’ll move on to the next slide in our deck, where we talk about the components of Tableau Prep. Today we’re focusing on Tableau Prep Builder, which is a self-service, client-based ETL (extract, transform and load) tool used to prepare data within the Tableau environment. With Tableau Prep you create flows of repeatable and documented steps; those can then be scheduled to run on intervals on Tableau Server or Tableau Online using Tableau Prep Conductor. So think of Tableau Prep Builder as the desktop tool where you create flows and do all the things Patrick’s going to talk about and show you, and Tableau Prep Conductor as the server-based option where you can publish those flows and manage and schedule them via Tableau Server. And with that,
I’m going to hand the floor over to Patrick, who’s going to take you through the rest of the presentation. Patrick, the floor’s yours. Thank you, Mike. I did notice there was a question that came in; some of you were unable to vote, it looks like, because the browser interface didn’t allow for it. That’s okay; we got some good information regardless, and I think anybody else voting would have pretty much fallen into one of the buckets we already saw.
We need to prep data, and we need to talk about why. The easiest way to think about it is exactly what you see here: meals just don’t magically happen. Wouldn’t it be nice if they did? But even if you’re at home or eating out, somebody’s back there making that meal. They’re prepping the ingredients; somebody is slicing and dicing, adding seasoning, assembling the ingredients. It doesn’t just happen.
Here is our beautiful, fully cooked meal. Now, I know it’s nearly lunchtime for a lot of you, so don’t leave on me and get hungry now that I’ve shown you food. But this is like a Tableau dashboard: it doesn’t just happen. That beautiful, wonderful dashboard with actions and blending and all those other great things didn’t just happen magically; 90% of the work had to do with somebody preparing the underlying data.
And these days data is everywhere. Many of you know you can create a quick Tableau dashboard with just about anything: any Excel file, any CSV, any database connection. We can have a meal, but is it going to be a good meal? Is it going to be an accurate, informational, up-to-date set of data?
These are the important things.
We don’t know if we have enough. We don’t know if we have good data versus bad data, because these days there’s so much data coming at us; there’s a lot of noise we have to filter through. This starts getting into the difference between good data and bad data, and good data is an absolute requirement for making accurate projections, which in turn allows us to make better decisions.
And these days we see more and more how important that good data really is but that of course begs the question. What is good data? And what is bad data? Sometimes we don’t know sometimes we don’t see it until we expose it in one of our tools like Tableau desktop.
There are some common issues; again, some of you cited one of these already. We have things like data in the wrong format. We’ve got data going across different months at different granular levels, and we have to aggregate these things together.
We need to do unions on data. The bigger one is data coming from multiple sources: HR data’s in one system, expense data’s in another, and then sitting on somebody’s desktop is your budget and forecast data. All of that has to be added to your data, and all of it has to come out magically on a dashboard.
But then we get into granularity. That budget data? Well, that’s done at the quarterly level. Your inventory? That’s at the monthly level. And sales, well, sales are at the daily level. How do we put all these together? Again, it doesn’t happen automatically for us. Then we’ve got some things that are a little less obvious, things that sometimes aren’t as important to the technical folks but definitely are important to our business users. Data needs to be analysis friendly. If it’s coming from a transactional system, we’ve all seen those fields like 32X_$F4, which of course, as we all know, is customer number. So how do we do that? How do we make it friendly for our users?
This is the kind of thing we’ll do in data prep. Then there are the outliers and mistakes. When we have data going back years and years, and we’ve had ten different people building ten different ETL jobs in ten different data sources, it’s very easy to have things like “CA” versus “California” versus “C.A.” How do you aggregate that? How do you put all that together?
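Conceptually, this kind of value standardization is what a clean step does. For readers who prefer to script, a rough pandas equivalent might look like the sketch below (toy data and hypothetical column names, not Tableau Prep itself):

```python
import pandas as pd

# Toy data: the same state spelled three different ways
df = pd.DataFrame({
    "state": ["CA", "California", "C.A.", "NY"],
    "sales": [100, 200, 50, 300],
})

# Map every variant to one canonical spelling before aggregating,
# so "CA", "California" and "C.A." all roll up together
canonical = {"ca": "California", "california": "California",
             "c.a.": "California", "ny": "New York"}
df["state"] = df["state"].str.lower().map(canonical)

totals = df.groupby("state")["sales"].sum()
print(totals)
```

The point is the same in either tool: pick one canonical value per real-world entity before you aggregate, or the totals silently fragment.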
Regardless of the tool, this is all data prep. These are all the types of things that have to happen before a business user, a report author, a dashboard maker can really put out usable, accurate information. Okay, if there are so many choices, what are they?
Well, a lot of you just said you use Tableau Desktop, and that’s great. But Tableau Desktop is not quite a full preparation tool. It allows us to change data types, and we can do some pivoting and blending and unions.
But there’s a cost to that. If someone’s using Tableau Desktop and they’re not publishing their data source, not uploading it to the server, then those changes are happening for just one user; they’re not happening across the enterprise, especially if we’re doing calculated fields.
We’ve got calculations being done in the desktop tool where they should really be happening upstream, and that leads to consistency issues. Every single one of you can probably relate to the fact that if you were to ask 10 different people what the formula for variance is, you’re going to get 10 different answers.
One group needs it this way; another group needs it that way. How do we make sure things are consistent? That’s the kind of stuff that needs to be done upstream or on the back end. And there are great third-party ETL tools out there with more features than Tableau Prep: Alteryx, Informatica.
But are you going to give those kinds of tools to your end-user Tableau Desktop analyst? These are complex technical tools. They’re expensive to license, and a lot of times they require enterprise installation and being controlled and managed by an IT staff.
That’s a lot of overhead.
Okay, well, what if we do it in the database? Of course, you can get your DBAs, because your DBAs don’t have anything else to do, right? They can just go ahead and start making views and extracts and everything else; we’ll just add it to the queue. And then what happens when there’s a new requirement, and that new requirement has to be in the dashboard for a presentation in, oh, I don’t know, three hours? It can be hard to rely on those kinds of resources outside your immediate team: you’ve got business processes, you’ve got things that have to happen to get them in there.
And that brings us to Tableau Prep.
Tableau Prep, given some of these other limitations, can be a good choice for many different things. First off, let’s start with the fact that it’s free if you’ve got a Tableau Creator license. That right there makes at least one of you happy.
It is free. It’s also a graphical tool. Most of our analysts aren’t going to sit down and write SQL; they’re not going to know how to write the code to take care of their own extracts. Tableau Prep gives us a graphical way to do it, and it’s integrated with a tool they’re already familiar with: the desktop tool. You’re going to see, as we get into the demos, that at any point as I’m building a flow I can say, “I wonder how this is going to look,” and bring it right into a tool that I already know and already have.
Because of all these things, because I can publish to my Tableau Server, this allows us to have documented and repeatable information. We can use these things over and over again, and that’s a big deal, right? We want to be able to use this more than once.
We’ve got the ability to preview our data, whether we preview it inside Tableau Desktop or as a straight CSV file, and we can see what gets included or excluded with things like joins, unions, et cetera. Over on the right you see a sample of some of these types of steps, whether it’s cleansing by removing or fixing data types, renaming, or correcting date formats. Those are all the types of things we can do. And now that you’re tired of listening to me talk and you actually want to see something,
What do you say we do a nice demo and I introduce you to Tableau Prep? For this demo, I’m going to use some sample data provided by Tableau. We’re going to look at some best-selling book data, see things like unions and how to bring data together, and just get a nice feel for the product. So here I’ve started my local copy, just for those of you who may be curious.
I’m on the most recent version, very recent as a matter of fact.
And this version does address some of the bugs that were out there with using unions with CSVs, if any of you have had problems with Tableau Prep; I’m preempting that question before it ends up in the questions. So I’m going to go ahead and connect to data, just like we see in Tableau Desktop. There are questions out there: can Tableau Prep connect to Tableau Server? You see here that yes, it can publish to Server, but it also connects to just about any other server type as well; we absolutely can connect to a Tableau Server for this purpose. I’m going to go ahead and just use Excel, because I’m boring and I’m old and it’s easy. I’m going to go to the book data and use this file from 2/28/18.
Tableau Prep has read my source and brought in all of the sheets. Now, whether it’s a database or Excel, it refers to them as tables; it just calls everything tables.
For every flow we create, the very first step is an input step. For that input step, I’m going to bring over Mass Market, and here in my interface it gives me the fields, the original field name if it’s been changed, and sample values. So I get to see that right out of the gate, the minute I bring it over.
What about these other six sheets? Well, in this particular case, they all have the same information: the same field names and sample values (rank, info, et cetera). They all have the same thing, so we’re going to do a union, specifically a wildcard union. This is really nice because it allows me, if the data changes, to easily include everything that’s in that Excel file.
I just tell it to use a wildcard pattern and I say hey include all sheets. Same thing a wildcard pattern.
And I hit apply. It has added two additional fields: the table name, or in this case the sheet name, and the path from where it came. These are the different sheet names. Now that we’ve done our union, Mass Market doesn’t work as a name anymore.
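For comparison, the same wildcard-union idea can be sketched in pandas (an illustration with toy data, not Tableau Prep itself): `pd.read_excel(path, sheet_name=None)` returns every sheet as a dict of DataFrames, which can then be unioned with the sheet name kept as a column, much like the Table Names field Prep adds.

```python
import pandas as pd

# pd.read_excel("bestsellers.xlsx", sheet_name=None) would return a dict
# like this, one DataFrame per sheet; we fake it here with toy data.
sheets = {
    "Mass Market": pd.DataFrame({"Rank": [1, 2], "Info": ["a", "b"]}),
    "Trade Fiction": pd.DataFrame({"Rank": [1], "Info": ["c"]}),
}

# Union all sheets, keeping the originating sheet name as a column,
# analogous to the Table Names field after a wildcard union
union = (pd.concat(sheets, names=["Table Names", None])
           .reset_index(level=0)
           .reset_index(drop=True))
print(union)
```

Because the union is driven by whatever sheets exist, a new sheet added to the workbook is picked up automatically on the next run, which is the whole appeal of the wildcard pattern.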
So I’m going to right-click that, rename the step, and call it Feb 28. A little more descriptive there.
Okay, I could stop right here. I could make this an output, and I would at least have a unioned set of data to work from. So that’s great.
But we need to clean it.
Again, these are the types of steps. I can add. I’m going to add a clean step.
Here I see more information on my data. Some of it is shown as a distribution of values, I’ve got string values, and then I’ve got the data itself down below.
See that down below there.
And we’re going to go ahead and we’re going to clean this up.
One thing to mention: you can use this info pane not just for looking at samples; it’s actually really good for doing analysis too. If you wonder, hey, what’s my distribution of Mass Market? Well, Mass Market is making up 10% of the rows, Trade Paperback Fiction is making up 15% of my rows. So I can use this
for information and for analysis purposes as well. All right, let’s take a look down here. What’s going on? We’ve got an Info field; I’m going to make this card a little wider so everybody can see it. In the Info field, I see that I’ve actually got four pieces of information: the title, the author, the price and the ISBN.
In order to actually use that in my other tools. I’m going to want to split that apart.
When I click on it, notice that the bar above has changed. This is dynamic; it will change based on the type of card I’ve got selected.
There are different things I can do. With this one, we’re going to do a split.
If you look at this you may think oh, no, I’ve got to do a manual split because I’ve got pipe symbols. Nope, in this case.
It sees the dollar sign as a delimiter as well, so I can do an automatic split. Look at that: I’ve got my four fields, each one clean, each one split out. It gives each a generated name that’s oh-so-exciting, so we do need to take a moment and rename these. I’m going to call them Title, Author, Price and ISBN. Now I’ve got four clean fields, which means that in my clean step I can get rid of the Info field; I no longer need it.
Now, that does not change my source data; I’m not doing anything destructive. That’s a big thing to remember. People get concerned and worried: hey, I don’t want to trust this to somebody who’s non-technical, they’re going to start messing things up.
Not a problem. And it’s absolutely not a dumb question that just came through: yes, you can use a regular expression. You can use it in a calculated field, so I could create a calculated field here and use regex if I wanted to instead of doing the automatic split. I could have created a calculated field against Info just like I would have done in Tableau Desktop or any other type of programming language. And yes, I am watching the questions at the same time; if I miss one, it’s not intentional.
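As a sketch of what the automatic split and the regex alternative each do, here is a pandas version (toy data; the real column contents come from Tableau’s sample file, so the title and ISBN below are made up):

```python
import pandas as pd

# Toy version of the combined Info field: title|author|$price|ISBN
df = pd.DataFrame({"Info": ["Some Title|Jane Doe|$11.60|9780000000000"]})

# Automatic-split equivalent: one delimiter, four new columns
parts = df["Info"].str.split("|", expand=True)
df[["Title", "Author", "Price", "ISBN"]] = parts

# Regex alternative, like a calculated field: extract just the price
df["Price"] = df["Info"].str.extract(r"\$([\d.]+)", expand=False).astype(float)

# Dropping Info only affects this copy; the source file is untouched
df = df.drop(columns="Info")
print(df)
```

The regex route earns its keep when the delimiters are irregular; when the separators are consistent, the plain split is simpler and easier to maintain.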
One thing we do see: because our original field was a string, it created four string splits.
We want to change the data type; Price is obviously not a string. We can change that to a number (decimal).
Notice that when I change it to a number, it changes my view to a distribution, so I can see that ten dollars is 14%, ten to twenty is 53%, et cetera. If I want to see the individual values,
Not a problem. I can change that right there and now I’m back to individual values and I see their distribution as well.
All right, we’ve got one file. We’ve got it clean.
But as most of you stated I’ve got data from different sources.
Let’s add some more data.
So I’m going to add a second Excel file, and I’m going to add the bestsellers from the week before.
The table names in this case are exactly the same. That won’t always be the case, obviously, but here they are, so I need to make sure I am on the right one; it’s very easy to pick the wrong one in this case.
Once again, I’m going to start with an input step.
Mass Market. Notice that it made it a different color; it cycles through different colors. And Catherine, to answer your question: I could split out Last Week and Weeks on List just like I did the last one, because I do have that common delimiter there. Whether I did it automatically or manually, it would be very easy to clean and split that in the same step, creating two fields, Last Week and Number of Weeks on List. So that too can be handled right there in my clean step.
You’re right that that only works when there are partials and things like that; I may have to do more of a calculated field or something more advanced, like was asked a moment ago. But I could still split this out; I may just need to do it in a more calculated, more technical way.
Let’s do the same thing we did with the last one: a wildcard union. I’m going to apply that, and I’m going to rename this one to Feb 21.
We have two sets of data. We want to bring them together.
And I’m going to do a union: I’m going to take Feb 21 and drop it right on top of my Feb 28. Notice I have two options, union or join. In this case, because they’re identical, I can do a union, and that automatically creates my union step.
I can see down in the info pane what was brought in, any fields that didn’t match, and everything that I’ve got going on.
At this point if I wanted to I could preview this in Tableau desktop. I can add an output step right here.
So at any point in my flow, I can create an extract, add an output step, or preview it in Tableau Desktop, so I can see that metadata at any stage.
Well, hey, I don’t want this clean step going to waste. So I’m going to remove this step from Feb 28th.
And instead I’m going to add the union to my clean step. So now I have three choices: add, union, or join. I’m going to add it.
Now I’m taking my two inputs, having them union, and running them both through my clean step.
At this point again if I want to see how this looks I could preview it in Tableau desktop.
It’s going to open up here.
And here I’ve got my sample I can see what’s going on. I can see if life is good.
I can see what I might want to clean up, add or take away. I’ve got it right here, and I could go ahead and extract this, save it from here.
But what we really want to do is create an output so that we can use this. Here is my clean step, and in my clean step I’ve split out Info into four fields; it’s now doing that to both Feb 28 and Feb 21. So this is the clean step right here, if I want to see what I needed to do.
It shows me what was applied to this step. I don’t need to split the Info field here because it’s happening at this clean step. We see the actual calculated field it’s creating; I see my renames, I see my changes, and notice that I could edit any of those steps. That brings up my standard calculation editor, so I can see exactly what was happening. And if I say, you know what, I don’t like this step, I can delete it or undo it.
So I’ve got some options here.
And you’ll notice that as I step through, this changes: up here it still says Info split, but when I get all the way down here, I see that it’s applied all of my changes. This is a flow as well.
So now that I’m good with both of my sources.
I’m going to add my output step.
My output step gives me some different options. I can save it to a file.
Or I can publish it as a data source.
Now, there’s a question on the clean step; let me go back one. This doesn’t automatically know what to do. It doesn’t know the difference between clean or unclean; that’s up to us to decide. These changes were our choice. It didn’t do anything automatically, just to make sure that’s clear.
So here on my output step I can save to a file or I could publish it to my Tableau Data server. All right.
We’re going to save to a file.
Something that’s come up from a lot of folks who’ve already taken our Tableau Prep class: hey, I can’t change the name. That’s right; you actually have to click Browse, and when you click Browse, then you can change the name, and that’ll get reflected here in the name field.
We also can specify the location and, most importantly, the output type. Do we want it to go out as a CSV? A .hyper file, which works in anything from 10.5 or later? A TDE gives us the flexibility of using it in any version of Tableau Desktop, but CSV gives us the most flexibility.
Not only can I use it in Tableau Desktop, but if I needed to send this off to somebody else, or use it in another application, I’ve got that available to me. The final piece of the puzzle: I need to run the flow, which will actually create my output file. There’s a Run Flow button down at the bottom here.
There’s also a little run icon up here in my flow diagram; they do the exact same thing, so I can click either one.
There it is. Now I’ve got a CSV file. I can also save these steps to a file, which would document all of my different changes, or publish this to a server so that, if you have Tableau Prep Conductor, you can schedule it, set it up, and automate it to run on a regular basis.
Now in this directory wherever the heck I put it I’ve got a CSV file.
Under My Tableau Prep Repository, Data Sources: there are my best sellers, all ready to go.
So that’s an introduction to it. We’ve got some time here, so I’d like to show you one more demo, something that comes up for a lot of people: your data isn’t always nice and perfect and clean, especially if it’s coming from Excel.
A lot of times data needs to be pivoted. So I want to do a pivot step and I’m going to show you.
Here we’ve got our good old Sample Superstore, but this is a Sample Superstore that’s been deconstructed, if you will, to keep our meal analogy going. (And to the scheduling question: you can create a schedule if you have Tableau Prep Conductor, so that’s something you’ve also got to have. This is Builder, the desktop piece; Conductor is your server piece.)
Here we see our products and our products have actually been split off into separate sheets. So I’m going to bring over office supplies and I see that I have a column called office supplies.
Well, if I want to union that to my technology or my furniture, I need to turn this into something else. For those of you familiar with Sample Superstore, this would be our Category field. So I’m going to take this and add a pivot step.
And here I can pivot on that office supplies column.
which creates a Pivot1 Names and a Pivot1 Values field. We’re going to rename these to Category and Product Name.
So I pivoted my data.
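The columns-to-rows pivot step corresponds to what pandas calls a melt. A rough sketch with toy data (hypothetical product names, not the actual Sample Superstore rows):

```python
import pandas as pd

# Wide layout: the category arrived as its own column of product names
wide = pd.DataFrame({"Office Supplies": ["Stapler", "Binder"]})

# Columns-to-rows pivot: the column header becomes the Category value,
# the cell contents become Product Name
long = wide.melt(var_name="Category", value_name="Product Name")
print(long)
```

Either way, the payoff is the same: once the category lives in a column instead of a header, the sheets can be unioned and filtered like any other rows.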
Sometimes I’m not going to have it so easy. Let’s take a look at Technology - Phones: here I’ve got two instances that have to be pivoted, technology and phones.
Not a problem: I can do a multi-pivot, a pivot group that lets me pivot on two different things. First we’ll pivot on technology; that gives us our first one.
Then I can also pivot on the second one, phones, and I can create a calculated field off of this. Here’s my Pivot2 Values; when I select that, I have the option of creating a calculated field.
And I’m just going to do a very simple field here; I’m going to call it Subcategory, to line up with my other one.
And this is nothing fancy.
I’m doing a straight assignment; I only have a single value in there, so I’m just going to make the subcategory simply “Phones.” Then I would probably add a clean step to get rid of some of these additional columns and names, but now I’ve got subcategory.
I’ve got category.
And I’ve got product name.
With that, I can take all of these and bring them into a union. So if I take my Pivot 2 and drop it on Pivot 1, there’s my union.
I’ve got my office supplies, my technology, my products, and my subcategories, including phones. So again, I can do an output step.
I have an output step of products. Of course, I would take this and join it to the rest to create your standard Sample Superstore, which is what we do in our class: we break all these apart just to put them back together again. I can preview this in Desktop if I want to see how it looks.
It generated 6,000 rows for me.
There’s my category. There’s my sub category.
Of course, we see that there are things that still need cleansing, and we don’t have any of our facts in here yet, but we were able to pivot these into a single file that lets the analyst see the data in the way they need to actually do some work on it.
So that’s my demo for the day. Before I switch it back to Mike, I want to talk about some of the things we’ve seen coming, hopefully soon, given the world situation. We’ve been a beta tester for both Tableau Desktop and Tableau Prep since very early on, even back when this was called Project Maestro.
We also do a lot of ETL work for our clients, and we know this isn’t one-size-fits-all. Being beta testers, we’re able to see what’s coming and how close it is to being a more full-blown ETL type of solution. Okay, in the next release, which is projected for summer:
All things being relative, first is incremental data refresh. If you have data that grows and you don’t want to run the flow for all the data, you can run it for just the new data. This is in the 2020.2 beta, where you can specify a field and only run the flow for new data in that field.
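The idea behind incremental refresh is a high-water mark: compare the source against what the last run already produced, using the chosen field, and process only the rows that are new. A toy sketch (hypothetical column names, not the actual Prep feature):

```python
import pandas as pd

# Output from the previous run, and the full (grown) source
existing = pd.DataFrame({"order_id": [1, 2], "amount": [10.0, 20.0]})
source = pd.DataFrame({"order_id": [1, 2, 3], "amount": [10.0, 20.0, 30.0]})

# Only rows whose key field exceeds the high-water mark get processed
high_water = existing["order_id"].max()
new_rows = source[source["order_id"] > high_water]

refreshed = pd.concat([existing, new_rows], ignore_index=True)
print(refreshed)
```

The win is the same whichever tool does it: runtime scales with the new rows rather than the whole history.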
The second big thing is the Salesforce data connector. Now that Tableau is part of the Salesforce family, it’s no surprise that it’s going to provide a direct connection. You still need to know how the data in the Salesforce tables connect to each other, obviously; you need to know how Opportunity and Account work, and be careful not to get the wrong data. But it’s one step closer to being a full solution for folks who are Salesforce shops.
At the conference last November there were some additional things announced, though there's no date set on these.
Because Salesforce and things like that are all web-based, it's no surprise that we'll see a web-based version of Tableau Prep at some point.
We'll probably expect to see that in a major release in 2021 or 2022; it all depends on how things are going. The other thing that was announced without a date was the ability to write to a database. It's great to build extracts and CSVs, but what do you do when you want to write it back to a database source?
That will be coming at some point. This makes it more like Alteryx or another type of product that allows you to build new tables.
This means better management: we don't have CSV files or extracts floating around; it's all going back to a centralized database, a centralized data source.
I hope we see that one sooner rather than later, but I would say that's probably also another major release, probably looking at 2021 or later. These are exciting new things coming. Prep has come a long way since its first introduction; it feels like it's really turned into a great product that people of all skill sets can use, and it's doing some wonderful things for everybody. That's all I have. I want to switch it back to Mike.
He can talk about some of the licensing requirements and other items. Thanks, Patrick. And everyone, please stick around and get your questions into the question panel. We have some good ones, even though Patrick's been kind enough to answer a lot of those inline. Several of them were about the licensing requirements and whether Tableau Prep is offered as a standalone, and the answer is that Prep is not available as a standalone product.
It comes as part of the Tableau Creator license. Since 2018, Tableau licensing has been bucketed into three broad areas: creating, exploring, and viewing. Again, the Tableau Creator is the person who can create data sources.
So they use Tableau Desktop, and it's bundled with Tableau Prep as well as a server license, and you can use Tableau Desktop or Tableau Prep to create data sources, a.k.a. metadata, for Tableau. The screenshot at the bottom there shows you Tableau Prep Conductor, which comes with the Data Management add-on, currently priced at five dollars and fifty cents per user per month. Without the add-on, Prep flows need to be run manually. So, for those of you asking, I knew there were some questions in the pane about connecting to Tableau Server.
You can connect to Tableau Server as a data source in Prep, as Patrick pointed out, and the other connection point to Tableau Server is the ability to publish flows from Prep Builder, much like you publish a data source or workbook from Tableau Desktop; then you can manage, run, and schedule the flow through Tableau Server. If you need more information about that, we have a great blog listed at the bottom of that slide. As a Tableau reseller, as well as a services provider and a technology partner, we can help with all of that if you have questions or need help with your Tableau licensing.
On the next slide, I'm pleased to announce that we have a new Tableau Prep class that will be instructor-led online, led by the author of the class and our presenter today, Mr. Patrick Powers.
So if you want to hear more of his dulcet voice and wise musings, please join us for that. We're pleased to add that class to our comprehensive lineup of Tableau classes, from fundamentals all the way up to expert. We also have another fabulous new class called Complex Aggregations that focuses on table calculations and level-of-detail calculations, all taught by our world-class instructors. So please visit us at the link below to check out that class and others, and again, get those questions into the question pane and we'll get to those in just a few more slides. A few quick slides about Senturus: we are the authority in business intelligence. All we do, all day, every day, is business intelligence, with a depth of knowledge across the entire BI stack, whether that's planning, design, and implementation of data sources and data lakes; creation of BI content; upgrades; migrations; performance optimization and tuning; through training and project management.
On the next slide: our clients know us for providing clarity from the chaos of complex business requirements, disparate data sources, and constantly moving targets. We've made a name for ourselves based on our strength in bridging the gap between IT and business users.
We deliver solutions that give you access to reliable, analysis-ready data across the organization, so you can quickly and easily get answers at the point of impact, in the form of the decisions you make and the actions you take. Our consultants are leading experts in the field of analytics, with years of pragmatic, real-world expertise and experience advancing the state of the art. You can see some of the broad areas that we support there. We are so confident in our team and methodology that we back our projects with a 100% money-back guarantee that is unique in the industry. And lastly, we've been doing this for a while: coming up on two decades focused exclusively on business intelligence.
We work across the spectrum, from Fortune 500 to mid-market, and you'll almost certainly recognize many of those nameplates. We're solving business problems across different industries and functional areas, ranging from the office of finance to sales and marketing, manufacturing, operations, HR, and IT. Our team here at Senturus is both large enough to meet your business objectives and yet small enough to provide personalized attention. Here are a couple of great free assets that you should take a look at and potentially bookmark.
We have hundreds of resources on our website, ranging from webinars such as this one on all things BI to our fabulous, up-to-the-minute, easily consumable blogs. On the next slide we've got a couple of great upcoming events you should definitely take a look at, the first one being Power BI row-level security, on Thursday, May 21st. All of our webinars pretty much fall on Thursdays between 11 and 12 Pacific. We have another one the week after that on accelerating BI migrations; I'll be running that one, talking about some of our migration technologies. And the last one we have in the near term is Cognos Framework Manager versus data modules; we're doing a really fascinating comparison of those two technologies.
So go ahead to Events and sign up for that. And I'd be remiss if I didn't mention our complete BI training: we provide training in the top three major BI platforms, IBM Cognos Analytics, Power BI, and Tableau. It's ideal for organizations, especially those with multiple of those platforms or those moving from one to the other, based upon the breadth and depth of knowledge our trainers have in multiple platforms. We offer tailored group sessions, one-to-one or one-to-few mentoring, instructor-led online courses, as well as self-paced e-learning, so we can provide those trainings in many modes and mix and match to suit your business needs. And lastly, some additional resources: we provide hundreds of free resources on our website, and we've been committed to sharing our BI expertise for over a decade. For more on this topic, we've got some great links in the deck; you'll find this up on senturus.com, where you can find information about Tableau licensing and some other great topics. And then that brings us to the Q&A, so go ahead and continue to get your questions in there. Patrick, I don't know if you got a chance to look through those; you did answer a lot of them inline.
There's one that I noticed here where a gentleman asked about modularizing some of the steps in a flow so they can be reused. Do you have any comments on that one? Well, any step can be saved and documented along the way so you can reuse it.
If you have the Conductor piece, that's where you're going to be able to really save these things, as you publish them up to your Tableau Server. One question, Mike, that you might want to answer was on the Senturus Analytics Connector.
So in Cognos you have many governed data sources: cubes, framework packages, dynamic cubes. Can you explain what data sources the Senturus Analytics Connector will allow us to connect to? Thanks, Trevor. We do offer a product, which he mentions here, called the Senturus Analytics Connector.
And that is a tool that specifically allows Tableau and Power BI users to access Cognos data sources and metadata (Framework Manager packages, data modules, and/or reports) through those tools, so you get reuse of all that great metadata and those data assets you have in Cognos, where you've got security modeled and all that sort of stuff. It just gets passed through to the data sources that Cognos accesses. You mentioned PowerPlay cubes, framework packages, dynamic cubes; those are all supported sources.
So we support relational sources, PowerPlay cubes, dynamic cubes, dimensionally modeled relational, as well as TM1. The only ones we really don't support are things like Hyperion and SSAS cubes.
So if you're interested in that, and if you're familiar with the Gartner term bimodal BI, where you have a kind of coexistence strategy between tools like Cognos and/or Tableau and/or Power BI, the Connector might be a good solution for you to leverage all that great metadata.
One more I'll take care of real quick here, from Sonya, on adding more data to an Excel sheet. So yes, if you don't have the Conductor piece, you would have to rerun the flow when the new data comes in; that's the big advantage of having Conductor connected up with your server: you can schedule these flows.
So you wouldn't need to worry about republishing; it all becomes automatic. If the Excel file updates, if your database updates, that all happens automatically as part of the tool. With the standalone Prep Builder, without Conductor, I would have to actually rerun the flow to get the new data into my output file.
And I think, Mike, we actually got to almost all the questions. Yeah, it looks like it, and I saw the one question about how we document the actions in the clean steps. I know you talked about how Tableau Prep does a great job with that: when you click on the step there, you can see what all the different steps were. Yeah, there you go. So you can output that, save that to a file, so there is some capability there.
So I can publish this out. I can even change the color if I really wanted to, but I could save this out to a file and have all those steps. And again, we see the steps whenever we've got one; let me switch to the other one.
I can see right here exactly what happened; I can see everything. It all depends on what I want to do with it: if I want to save these steps as a flow in a file, or if I want to add a description here, I can. So I could document it by simply copying and pasting, or by adding a description. There are a lot of different ways to take care of that. And then, I'm not sure.
Do you know, Tableau has introduced a lot of tools because they, along with tools like Power BI, were lagging in areas where Cognos is very strong, in terms of lineage and governance; that's been a big push for companies like Tableau.
And so they've introduced Tableau Catalog, which is kind of a lineage functionality, and I'm not sure, maybe you are, Patrick, what the hooks are in between: if you publish a flow, does that provide a rich set of information to Catalog? I don't know; I don't have it. Yeah, I'm not sure either, so I would expect those hooks, if they don't exist already, to get better. You get better visualization or insight into where that data came from, and of course you can certify data sets and things like that. Those are things Tableau added in there that are nice.
So you don't have the wild west of everybody publishing random data sets out there on Tableau Server. And then there's Tableau Explain Data, which actually goes in and adds some kind of plain-English descriptions of what you're looking at in your visualizations. I think having some of that metadata helps with that.
So, let's see, just kind of scrolling through. Patrick, you've got some fans on there saying hi. I think those are all the questions that we had for today. So unless anything pops up online here: first of all, I want to send a sincere thank-you to our presenter today, Patrick, who picked up the ball and ran with the Prep demonstration. It was certainly very informative and interesting, and I thank all of you for joining us today. We look forward to seeing you soon at another one of our Knowledge Series events. Thank you all, and have a great rest of your day.