Our Favorite New Features in Tableau 2019.3

Hey there Tableau fans! Tableau 2019.3 gave the platform a much-needed boost in its data and server management capabilities. Improvements include Explain Data, which uses AI capabilities to let you easily tap into the why behind your data, and Tableau Catalog, which provides lineage and usage statistics for your data. Tableau Server administrators also got a bunch of new toys, with improved performance and server monitoring tools, content promotion capabilities and encryption-at-rest settings.

In this on-demand webinar we review and walk through these updates and give you insight into our favorite new features. The topics covered include:

  • Explain Data
  • Tableau Catalog
  • Encryption at rest
  • PDF attachment subscriptions
  • Tableau Server management
  • Resource monitoring
  • Deployments
  • External repository
  • Workload optimization

Presenter

Todd Schuman
Practice Lead – Installations, Upgrades and Performance Tuning
Senturus, Inc.

Todd is a dyed-in-the-wool data nerd with 17+ years of analytics experience across multiple industries. Multilingual in BI tools, Todd applies his Cognos, Tableau and Power BI expertise wherever it is needed, and his diverse subject matter expertise means he wears many hats at Senturus. Todd takes turns running our Install, Upgrade and Performance practice, developing BI reports and dashboards, mentoring clients on BI best practices, and training the next generation of BI users.

Machine transcript

0:07
Greetings everyone, and welcome to this latest installment in the Senturus Knowledge Series. Today we're excited to bring you the topic of our favorite new features in Tableau 2019.3.

0:19
This is arguably one of their biggest releases in recent memory, where data and server management in particular get a big boost. First, some housekeeping items: those of you joining us through GoToWebinar can minimize and restore the control panel via the orange arrow at the top, and while we have all the microphones muted, we do encourage you to submit your questions via the questions pane. We try to answer all the relevant questions

0:52
live during the webinar. Anything we're unable to answer for whatever reason, we will respond to via a complete question log document that we post on senturus.com along with the deck and the recording of the webinar. Which brings us to the question we invariably get at the beginning and throughout the presentation: can I get the presentation slide deck? And the answer is an unqualified yes. Head on up to the URL you see there, senturus.com/resources. You can find the deck today, the question log as I mentioned, and we'll post the video of this in a few weeks. We highly recommend that you bookmark this site, as it's an extensive free library of all our past webinars,

1:44
demos, white papers, presentations and helpful hints. It's a great free resource for you to come back to frequently.

1:53
Our agenda today: after some quick introductions, I'll hand it over to our presenter to cover some of the key features in 2019.3, including Explain Data, the Tableau Catalog, at-rest encryption of extracts and PDF attachment subscriptions. Then we'll dive into the new Tableau resource management functionality, including resource monitoring, deployments, external repository and workload optimization. And then, since the release schedule for these products tends to be fairly fast-paced,

2:23
we'll get into 2019.4 and give you a little bit of a peek at the roadmap there, before we talk about some additional resources and get into Q&A.

2:37
I'm pleased to be joined today by Todd Schuman, who heads up our practice on installations, upgrades and optimizations. My name is Mike Weinhauer; I'm a practice area director and solutions architect here at Senturus, pleased to be hosting today's webinar. We always try to get a finger on the pulse of our attendees, and our first poll today that we'd like your input on is: what version of Tableau Server are you currently running? You can only choose one of those in this poll, so please pick one and get your votes in. We usually let this run until we get about 75 to 80 percent of people or 60 seconds passes.

3:19
So go ahead and weigh in, democracy in action. We've got about two-thirds of people here; we'll give people a few more seconds to weigh in and get those votes in.

3:45
People are sitting on the sidelines today; they don't want to vote. All right, we'll close it out and share. We got about two-thirds there. We are showing about two-thirds running some version of 2019.x. There's about 15 percent still on 2018.2, and a chunk, about 20%, somewhere below that. That's interesting to see, and I guess good, in that

4:09
we kind of see a lot of people sitting back on 2018.1 because of the introduction of TSM, Tableau Services Manager, in 2018.2, which is a tricky upgrade, not for the faint of heart. And then we have one more poll here that we're going to put up for you, asking: what are the biggest challenges you are finding in your Tableau environment? We'll launch that one; this is a select-all-that-apply. So is it setting up trusted authentication and some of the challenges around that?

4:44
Is it performance challenges? Is it enabling self-service, security, enabling data governance? Is it installations and upgrades? Or is your biggest challenge user adoption?

4:59
So I'll let this thing go for a little bit; about half of y'all are weighing in so far. Go ahead and make your selections.

5:14
Let this go a few more seconds. We're almost at two-thirds here.

5:23
All right, I'll close it out and share the results. So yeah, not too surprising there that user adoption is leading the pack at close to 60 percent; there's still a relatively low adoption rate for BI projects in general, it's a really tough hurdle. And then self-service, security and data governance: we hear a lot of that in kind of mode-two, you know, Tableau-type environments, and then kind of a smattering of

5:52
other issues around performance and things like that. So great insights. Thank you for weighing in and letting us know where you stand in your organization. And with that, I think, Todd, you can advance a couple of slides here.

6:08
And I'll hand the floor over to you to get into the meat of the presentation. Great. Thank you, Mike. Okay, we've got a lot of topics to cover today, so I'm going to jump right in here with some of the new features that I was excited to see when they rolled out 2019.3 a couple of months ago, the first one being Explain Data.

6:29
So this is a new feature that is very easy to use and can really help you dig into your data. It answers that age-old question: why, though? You know, you're looking at some of these visualizations, they're beautiful sometimes, and you can really do a lot of cool things with Tableau, but sometimes you're just staring at this data and trying to figure out: what does this mean? Why is this the way it is? So this new feature, Explain Data, adds additional insights into your visualizations. It can then, from that insight, provide sort of one-off additional worksheets and dashboards based

7:08
on what you're seeing, and you can dive into that, and if that's got more insights on it, you can keep going further and further down the wormhole. So I'm going to show you exactly what that looks like. These are a couple of samples from Tableau. In the top two charts, the first one on the left is, I think, a bike rental company, and you see there's a spike in August and you're saying, hey, why is that? Why is that so high? What would cause that?

7:38
You can click on it; it's just a little light bulb icon that holds the insights, and when you click it, it brings up the little pop-up window that you see on the right. It says, hey, this number was higher; let's take a look at the weather conditions, and you can see that we didn't have any rain or snow, it was overcast, it wasn't too hot. So it's a good time to go ride a bike; it makes sense, and it just pulls that information. Now, this is only going to be as good as your data sets. If you've got really rich data sets that have lots of different

8:08
indicators and additional information in there, you can really get a lot out of this. If you have just a pretty basic CSV file with some dates and products and numbers, you may not get so much out of it. So the usual phrase, garbage in, garbage out, applies here: if you've got a lot of good stuff in there, you'll probably get a lot more insights out of it. The second one, another type of insight it provided out of the box, without really having to tell it what to look for, is again that same sort of bike data, put on a map.

8:37
I saw this big red dot, and this is looking at average rental times, you know, how long people typically rent a bike. There's this huge red dot and I said, hey, why is that so large? You click on it and there appears to be just one rental in there that was 619 minutes, which is much higher than usual, and it said, hey, if you remove that, it's going to adjust the average from 619 to 155, which is a more realistic estimate; it matches pretty much all the other numbers.

9:04
So it recommends removing that, excluding it from your data set so the numbers don't skew. Just some nice little features that are baked in, and clicking on a little light bulb will do this stuff for you. And again, as I mentioned, it's only as good as your data, so make sure your data has everything in it that you can think of, and possibly even some things you don't think would have relevance; just throw it all in there and see what you get if you're looking to get some more insights out of this data.

9:33
Hopefully Tableau can find something that maybe you weren't aware of or weren't thinking of. So this is a new feature that they are already building upon; we have just a quick slide at the end, but in 2019.4 they've continued to build on this, so we'll continue to follow it and show you new features and new things you can do with it. It's a little bit basic and premature right now, but I expect to see this feature continue to get built upon in future releases.

10:01
Another new feature that they added is the Tableau Catalog. This is sort of an indexing-type service that you can run across your Tableau environment. It's going to go through and catalog and collect metadata about all your workbooks, your data sources, your worksheets, your flows, and it's going to map all that back to the source tables. So you'll get things like lineage, and you'll get information about the database tables and where they came from,

10:30
who created them, and whether there are any issues or things you need to be concerned about in the data. So it provides a whole lot of information about what's behind the scenes. You really never knew sometimes where this data was coming from; you got a data source that was published, or you published it, and people didn't know much about it. With this new Catalog feature you're able to expose that back-end information and allow people to see it, and the good news is it's very easy to turn on.

10:56
It's just a command-line function. If you're on 2019 you've already converted to TSM, as Mike referred to, so the tabadmin features are gone and everything is TSM-based now. You just need to enable the metadata services from TSM and then you can go ahead and use this.
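For reference, a minimal sketch of what that looks like from the TSM command line; verify against the tsm help output on your own server, since options can shift between versions:

    # Enable the metadata services that power Tableau Catalog (kicks off the initial ingestion)
    tsm maintenance metadata-services enable

    # Turn it back off if you need to
    tsm maintenance metadata-services disable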

11:16
One thing to be aware of is that this is a very resource-intensive, time-consuming process to run. Tableau provides sort of a benchmark based on a sample size to help you figure out what you're looking at, and theirs assumed you have about 12,000 total workbooks, data sources and flows in your environment.

11:37
And again, this runs with a default allocation of threads and RAM; using two threads and 64 gigs of RAM, it took six hours to index and catalog that entire set of 12,000 workbooks and so on. I included a screenshot at the top there of what happens when you turn it on: it is even going to tell you that some of the functionality is going to be unavailable and it's going to really slow things down. So you want to run this at an off-peak time. If you have the resources, you can allocate additional threads and RAM to throw more at this and get it done quicker, and try to figure out when a good time is.

12:15
Don't kick this off at 9 a.m. Monday morning; you're going to get a lot of phone calls from people complaining about performance issues and things like that. So find a good time to do this off-hours, see how many additional resources you can throw at it, and try to get that ingestion time down to a reasonable number. If you've got more than 12,000 items, you're really going to have to figure out the best way to break this up and get it through before people start coming back into the system.

12:43
So definitely don't just kick this off in the morning and walk away, because it's going to cause a lot of problems. It is, like I said, a very resource-intensive, time-consuming process.

12:55
One of the things about the Catalog is that it is license-specific; there are new licenses that you need. This is tied to the Data Management add-on license. So if you don't have the Catalog available and you're wondering why you don't see this information, you need to go into your licenses and see if you have this checkbox, like I showed there, highlighted in the red box.

13:16
If you don't have a checkbox for that Data Management add-on, you either need to enter your key or purchase a key to have that capability. We're also going to get into some of the other features that are tied to the other one there, the Server Management add-on. These are new things that you need to get set up to be able to use, so obviously they won't work if you don't have them; just check here first if you think you should have it and it's not working, and make sure you have checkboxes there. One other thing to check, if you don't see it, is under the new settings, under General, toward the middle or end of the list, there's a section for Tableau Catalog.

13:53
You can still turn this on, but you can actually uncheck it so users can't see it if you want to. So again, if you do run it and you want to disable it, or you want to take away people's ability to see that data for some reason, you can just uncheck that box. And if you're looking for it and it's not there, make sure the box is checked and that it's visible to users. I think I was messing around with it at one point and didn't see it, and I didn't realize I had it unchecked, so just save yourself some time and check these two areas first if you don't see it.

14:22
At-rest encryption is another huge feature. This has been something that has been missing, in my opinion, for several years now. You can do several different things to secure your Tableau Server environment, but one thing that was always a little bit of a risk was that the extracts themselves live on the server.

14:45
So your database could have security in it, and you may have security that you log into Tableau Server with, but once those extracts are running, that data is all getting pulled out of those secure tools and is now sitting in a file on the server. So that data was never encryptable in the past. With this new version, administrators can now enforce the level of encryption on these extracts. You have three options, and depending on the data you work with and your user base, this may not be a big issue, but you can continue to just leave it disabled

15:22
and have no encryption at all. You can enable it to give the users who are creating and running these extracts the responsibility to say, hey, this is something sensitive, I should encrypt this extract, and this one doesn't need to be encrypted; let them do it. Or you can be the complete micromanaging person, which in some cases you might want to be, especially if you know your data and know your users aren't going to remember this: just enforce it right from the get-go, so all extracts

15:52
published and stored are now encrypted. So again, something that's going to vary from customer to customer, but if there's any risk and you know there's a chance that your users might not do it, you may want to just enforce it completely. Not something you could do in the past, and a very welcome feature for administrators. One thing to know: this is for .hyper extracts only.

16:14
So, we talked about people who still haven't upgraded to the later versions of Tableau; if you're using TDE files, you're going to have to convert and upgrade them to .hyper files before they can be encrypted.

16:30
You might even get better performance with Hyper, so there's not really any reason, other than just being tied to an older version, not to upgrade to Hyper at this point. Other things that you can't encrypt: temp files and cache files; the workbooks and data sources themselves are not able to be encrypted; and if you've got Excel or JSON files, unless you extract them into a .hyper file, they will not be encrypted. And obviously this one would always have been true: anything you download from the server to your local machine, like a downloaded extract, is no longer attached to the server and can't be encrypted there, since it's on the local user's machine. So if you do allow downloads, just make sure users are aware of that.
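For reference, the Disable/Enable/Enforce choice itself lives in the site settings page, but a couple of related command-line pieces are worth knowing about. These are written from memory rather than taken from the webinar, so treat the exact names as assumptions and confirm them in the documentation for your version:

    # Check the key management system status that backs extract encryption at rest
    tsm security kms status

    # After changing a site's encryption setting, re-encrypt the extracts already published to it
    # (assumes tabcmd is installed and you are signed in to the site)
    tabcmd reencryptextracts "MySite"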

17:19
PDF attachment subscriptions was another one. This was a request we got often, and there was sort of a hack way to do it in the past that was a pain, and I was always surprised this wasn't something just built into Tableau. There are a lot of things you can already do with subscriptions; this one example just was not there for some reason, so they finally introduced it in 2019.3.

17:45
So now you can create subscriptions, map them to parameterized reports, and have them come out as PDFs, so you can have it deliver individual customized reports to each recipient, which is something a lot of BI tools have had for a long time. It's very nice that we can finally do that. It has a new wizard when you're creating new subscriptions; you can see here you can pick image, PDF, or image and PDF, put in a message, and

18:18
set the frequency of the delivery. If you have existing subscriptions, you can go in now and just change the format; it brings up the same screen and you can change it from the image you had before to PDF, or image and PDF. So again, just a little thing that was, in my opinion, long overdue and very welcome to see. Not that much to say about it, but a welcome, long-overdue feature.

18:49
The real meat of the 2019.3 upgrade lies around this Tableau resource management tooling, which we're going to get into; there are a lot of features in here as well. What this does is allow you to get next-level insights into your servers and your environment and be proactive about what's going on in Tableau.

19:15
We saw based on the poll that there was a lot of concern; a good 40 or 50 percent, I think I saw, had performance issues. These types of tools are now provided out of the box with Tableau. You could do some of this stuff in the past with the repository, building reports on top of that, and it had some out-of-the-box status reports where you could look at load times and extract refresh times, things like that. Now it's all in one place, and it's got its own dedicated repository.

19:48
And it comes with its own tool to tap around and see things in a much more user-friendly way. So we'll look at that, and we're also going to look at some of the things it provides insights into, like which views are running the slowest and which extracts are taking the longest; you don't have to dive down and dig through all the data, it exposes it and bubbles it up to the top for you.

20:09
This is just one of the screens of what you can get: things like server performance, you know, how are the processors doing, how is the RAM, the disk I/O, the network speeds, the concurrent user load; is it getting hammered at certain times of day? Now it's very easy to see, just thrown on this line chart here, when it is spiking. For the Tableau processes, you can set up individual monitoring for the backgrounders, the VizQL workers, all the different processes you

20:39
have, as well as background tasks, seeing what's running behind the scenes. The insights I mentioned, I pasted at the bottom there, but it'll show you, hey, here are your five slowest views and how long they take, so you can go back and see whether there's anything you can do to speed them up. Same thing with your extracts: you can see there's an employee extract that takes about 51 seconds, which in the grand scheme of things isn't too bad.

21:02
I've seen much worse, but anything you can do to target the ones that are slowing things down is going to make your users happier and free up resources for other users as well. So that's the end result of what you get from setting up this new tool. How the tool works is similar to some other monitoring tools you may have had experience with in the past. What it actually does is this: you stand up a master server, which collects the information and has the presentation tools that show those dashboards I showed here.

21:40
Then each server you want to monitor needs to have an agent running on it, and those agents transfer data to the master server from the log files, from the Tableau repository, and from the APIs; they pass all this information back to the main server, which, as I mentioned, has its own PostgreSQL database. So the big difference here is that we used to just report directly off the Tableau Server repository; now that data is being piped into its own PostgreSQL database on this master server. So we're going to have very specific data

22:17
here; we don't have all the other stuff that we don't necessarily need. It's going to be just hardware usage, performance metrics, and the data structures themselves. So a little bit of a break from what we've had up until now. There are three steps to get this thing running, and it's a bit of a beast CPU-wise.

22:39
For the master server itself, the recommendation from Tableau is eight CPUs and 32 gigs of RAM, which is a lot. This is new, so we haven't had a whole bunch of time to benchmark and see what the performance looks like on a lesser server, but these are the specs from Tableau. Either in a blog post or a future update I'll try to get more information on whether that's really necessary, because it does seem like a pretty high number, but we're using that for now. For the prerequisites, there are actually three installation files if you want to

23:17
use this: a prerequisites file, a master file and an agent file. The prerequisites are just additional software that you most likely don't have today: it uses the messaging service RabbitMQ, the Erlang runtime it depends on, and the PostgreSQL database. For most Tableau users, when you're installing Tableau it installs that for you. So instead of having to go through and install all these things on your own, you just run the Tableau prerequisites installation file; it does all that for you and makes it very quick and easy to

23:47
get all that stuff checked off so you can install the other pieces, which are the master, which again is the central repository for this Resource Monitoring Tool, and then the agents themselves, which is a pretty small install. I think it's maybe a fifty or a hundred megabyte file, which is pretty small these days as far as software goes, and it's pretty quick to install those agents.

24:11
For the prerequisites, at the very end of installing them, it pops up a text box and gives you this sort of scary Inspector Gadget-type message: please review this information and store it in a secure password manager; this file will be deleted automatically when closed. It reminds me of the old Inspector Gadget "this message will self-destruct," and it's all in caps, so it's very serious. But these are all the user IDs and passwords that you're going to need to set this thing

24:39
up. And it's actually true: if you close out that text box without saving it, taking a screenshot, or copying and pasting it somewhere, it's pretty much gone. So make sure you copy it down, get this information and save it somewhere, because otherwise you're going to have a lot of trouble getting this thing up and running if you lose the initial setup. The good news is, once you've done that and you run the master, it plugs in a lot of this information for you

25:10
automatically. So if you look at the previous screen, it had the RabbitMQ user ID and password, the PostgreSQL superuser username and password, and the read-only username and password; it takes a lot of that information and plugs it in. The server gets populated, the database gets populated, the message queue gets populated, most of these being localhost and generic ports and values. But if you don't have this information, you're going to have to manually plug it in, and if you ever need to come back here and modify it, hopefully you have that file

25:39
saved somewhere so you know what to change. So again, it's a bit of a bear if you don't have these, but if you go through the steps it makes it pretty easy and plugs in most of the stuff for you. And then finally, once you have the master installed, you'll get the window on the left here, the pop-up, and it has the ability to download a bootstrap file. This can then be imported on all the agents, and it carries all the configuration

26:09
the agents need to transmit to the master. So again, the agent installation is very quick; it's a very small file, and basically as soon as you're done it brings up that screen on the right, registering that server as a specific agent, and it says, hey, do you have the bootstrap file from the master? If you do, just browse to it and load it in and it'll configure everything for you. If not, next to that import button there's a skip button; you can manually plug that in, and it's going to ask you for a lot of information, like the server

26:39
URL; it's going to need the certificate thumbprint, this really long code up here. I find that sometimes, working with multiple servers, it doesn't like to copy and paste, and that's just a brutal code to have to manually fat-finger in; most likely you're going to plug in a typo. So just use the bootstrap file and import it; it will make your life much easier.

27:02
And then once you have all that done, you'll be able to run the tool and get these reports and information. Again, there's a whole bunch; I don't have the tabs on here for some reason, unfortunately, but you can look at it individually, server by server, or at the environment as a whole, and target where the issues and peaks are that are causing performance slowdowns, and figure out what you can do to address them. So again, something that was sort of possible with the old repository-style reporting, but much more efficient and streamlined,

27:39
And easier to use now with this monitoring tool.

27:47
Another huge piece of this is the ability to do deployments. This was another thing that, in my opinion, Tableau was severely missing: if you had multiple servers, like a dev, test, production environment, there was no good way to build something in dev, move it through the test environment, get it validated, and then move it to production. It was either buy a third-party tool, go to Tableau's

28:14
TabMigrate open-source software and configure that thing, or the old-fashioned way: use Tableau Desktop to download stuff from one and republish it to the other. All of them, in my opinion, not really good ways to do it. It should be possible within the Tableau software you're given, and now it is, and it's actually really well done.

28:34
This is another installation file that you can get from Tableau; it's their content migration tool, and essentially it allows you to create a plan, and the good news is these plans are reusable. You essentially just pick the source and destination and say, hey, this is my dev server, this is my QA/test server, I want to log in with this account on each server. Then you go ahead and select the projects you want; you can select all the projects or individual projects, and you can go into the projects and pick individual workbooks.

29:08
You can pick individual data sources, one or more, mix and match. Once you have all those, there are a whole bunch of transformation options. So if you have a table in your dev environment that's called dev_products and in your test environment it's called test_products, you can replace that table or schema name in the transformations.

29:33
You can also modify custom SQL. If you have extracts in the project or the workbook that are tied to data that only corresponds to the dev environment, you can remove the extracts from the projects or workbooks you've selected; these are all transformation options. There are more than I list here, but it's pretty well thought out and should address almost any type of issue or concern, things that used to be a pain when you do migrations and that you'd have to go back in and modify after you publish.

30:09
So it does all that for you as part of this plan, and then the publish options also have a bunch of settings. Do you want to reset the dashboard selections? Do you want to overwrite newer workbooks? Do you want to copy the permissions? That's a big one: if you've put a bunch of time into securing and setting permissions on the project and you want them migrated over so you don't have to redo that, you can check a box and say, hey, I want to copy these permissions. Even the extract refresh schedules: if you want that schedule applied to the target destination environment, you can

30:44
do that. So go ahead and look at the options. It's definitely a nice tool, long overdue and really well thought out; I think it's really well done. If you have multiple servers, multiple environments, and you need to move stuff, it's going to make your life much simpler. And again, these plans are reusable, so there's a little bit of setup time to initially get all the bells and whistles and the checkboxes set up.

31:11
But then once you have it, you can rerun these plans daily, weekly, monthly, as often as you need to, and it can almost be automated to a degree.

31:23
There are a couple of other things about the deployments I wanted to note. The first is limitations: there are a few things that are not supported, that you can't tag as something you want to deploy, those being data-driven alerts, custom views, and the thumbnails on your visualizations that show users what they look like before they run them. Those things are not currently things you can

31:51
deploy, but in my opinion that's pretty minor, not a huge deal; I'm just happy to have the tool, and they may be able to add some of those things in future releases. For the first release of this migration tool, it's pretty robust. Other really nice features of this tool are, one, the ability to consolidate sites. We see a lot of people with sites;

32:21
they may have used those instead of projects, and they're fine and all, but they also make it very hard to move and share things between sites, which is sort of the point of creating sites; it's almost multi-tenancy, you know, I don't want site one to see anything site two is seeing, or even be aware that there's stuff out there. However, if you do need to consolidate, say you have three sites and you want to move them into one, this will let you do that. You can automate it and say I want to move everything from site one and site

32:51
two and site three and consolidate those sites. That's been something that's been time-consuming or difficult to do; you can now do it with the deployment tool. And finally, another huge feature is the ability to roll back.

33:06
So if you happen to move something into production, a new feature, and there was just something unexpected that broke the dashboard, or security was not set up correctly in the development version and it was missed, and you say, oh my god, we've got to get this fixed immediately: instead of having to go fix it and republish it, there's a rollback option to say, hey, here are the changes that went in, I want to roll this back, and it can completely undo the last migration that you did. So it is, in my opinion, a real lifesaver that can save you an entire evening or weekend, depending on when you do your deployments.

33:51
Those things happen, and instead of working all weekend, you just click a button, roll it back, get it fixed, and you're in much better shape. So I'm just glad they thought to put that in there, and I guarantee that if you have to do migrations, it's going to save your neck one of these days. And then finally, if you are a command-line guru and you like to script things, there's a whole set of command-line features as well, so you can automate and script a lot of this and have it run on schedules.
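As a rough illustration of that scripting, assuming the Content Migration Tool's command-line runner is installed on the machine doing the work; the executable name, plan-file extension and path below are from memory and purely illustrative, so check them against the tool's documentation before relying on them:

    # Run a saved migration plan unattended, e.g. from Windows Task Scheduler
    tabcmt-runner.exe "C:\migration-plans\dev-to-prod.tcmx"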

34:21
So again, they really took a lot of time and put a lot of thought into this tool, and I'm very excited to have it in my tool belt for Tableau Server administration. Another feature, and again this is just such a full-featured upgrade in my opinion, and something that has been long overdue, is the ability to use an external repository.

34:47
If you've ever done a Tableau install, or if you have any idea how it actually works, the server installs its own Postgres database that it uses internally. It's a database just about Tableau information: your users, your projects, the dashboards themselves, data sources, things like that; it's all stored in this repository. And you had no options really in the past if you wanted to do something else or scale that database a bit; it was limited. So now they have introduced the ability to use Amazon's

35:21
relational database service, RDS, and this being an Amazon product provided in AWS, it gives you capabilities that just weren't possible with the old local Postgres database: things like scalability, reliability and high availability.

35:38
You can build in additional security, just nice features that were limited before. There are a couple of gotchas on this one, the big one being you have to be on AWS. So for on-prem users, or if you're using Azure, Google or some other cloud environment, unfortunately this, as of today, is not available outside of Amazon Web Services. It kind of makes sense, since it's an Amazon product, but I was hoping when I first heard about this that it would be available to anybody as long as you could access some type of Amazon RDS; that's currently not the case. Obviously this is a new feature, so you have to be on 2019.3 or later. You also

36:21
must have that Server Management add-on key activated; I showed that screen and the licensing earlier. If you don't have that, you can't use it. And then finally, you're going to be responsible for this new RDS database, so it's up to you to set it up, manage it and tweak it; all the capabilities we mentioned earlier about why you'd want to do this, you need to know how to use. So if you want to be hands-off and you don't really care and just want to use the built-in Postgres, then again, you

36:52
don't really touch it; it's sort of no-maintenance for the most part. But if you had issues in the past and you want to speed things up, really optimize and do some additional things to it, you're going to need to get up to speed on RDS and know how to work with it. Another nice thing they did here, so as not to screw anybody over, is that it allows you to move from one to the other.

37:14
So obviously, if you've been on Tableau before this, you're running the local Postgres database, and they created tools and the ability to move from that local database to the Amazon instance. However, you can also move from Amazon RDS back to local. So if you happen to be on Amazon today and you want to move on-prem or move to a different cloud hosting provider, they're not going to lock you in and screw you over; you can actually convert back. Again, it's just good thought that went into it; I feel like other vendors may not have done this, but they did, so it's very helpful.
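For reference, both directions of that move are driven from TSM. A hedged sketch, since the exact options may differ by version; the JSON configuration file and SSL certificate names below are placeholders you would create for your own RDS instance:

    # Point the Tableau Server repository at an external Amazon RDS PostgreSQL instance
    tsm topology external-services repository enable -f rds-connection.json -c rds-ssl-ca.pem

    # Move the repository back to a local node later if you change your mind
    tsm topology external-services repository disable -n node1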

37:51
It's going to save you; you don't have to sign your life away to make this change. So again, just a pretty cool thing that they did here. Another feature is workload optimization. If you've been on other Tableau performance tuning webinars that we've done, we've talked about spinning up additional nodes and having dedicated helpers to offload some of the things that can bog down a Tableau server.

38:21
The big one, the number one culprit when tuning Tableau Server, is the extract refreshes. Someone comes to me and says, hey, we're having really slow performance in Tableau; I'm going to look at your extract refreshes, and I'd say eight times out of 10

38:39
it's going to be that there are just long extracts running all day long, duplicates, and things like the database getting refreshed at midnight but someone refreshing the extract five times a day. Those are the things you could offload to dedicated backgrounders, so you could free up the primary server for users who just want the VizQL service to view their dashboards and visualizations. That's still in play here, but now we can get down to an even smaller level of granularity. We can decide where background tasks actually go: do we want this server to do the extracts, do we want it to do subscriptions, you can have one do flows, run any type of thing that's being an issue, from the Tableau command

39:21
line. So it's going to give us a whole bunch of flexibility that we didn't have before. It also obviously requires a multi-node environment; you're not going to be able to move things to another server if you only have one server. So you need to have these nodes set up, and then you can specify what goes where. What that means is, here are some of the options you have: you can set an individual server in your environment

39:51
to have only certain jobs; you can deny or allow specific tasks. So for example, I could say this server does everything, all jobs, or I could say this one just does flows, or I don't want any flows on it; extract refreshes, subscriptions, extract refreshes and subscriptions, or likewise no extract refreshes, no subscriptions.

40:15
So you can really get down to fine-tuning, if you've got multiple servers, as to which servers do what: have a dedicated subscription server if you're sending out lots of subscriptions, or just have one that's doing extracts and subscriptions. You can mix and match and find the best way to do this. Again, this wasn't possible before; other tools have routing rules and things like that, and that level of granularity was missing in Tableau.

40:44
Now they offer this feature, and you can go ahead and set these up, define how work and traffic are going to flow in your environment, and hopefully this can help balance out some of those overworked servers and isolate the things that were bogging down one of your workers.
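For reference, a minimal sketch of how that node-role assignment looks from the command line; the node ID here (node2) is just an example for a dedicated backgrounder machine, and the available role names can vary a bit by version:

    # List the nodes and their processes so you know which node ID to target
    tsm topology list-nodes -v

    # Limit node2's backgrounders to extract refreshes and subscriptions only
    tsm topology set-node-role -n node2 -r extract-refreshes-and-subscriptions

    # Apply the change (this restarts server processes, so do it in a maintenance window)
    tsm pending-changes apply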

41:08
And then finally, as Mike mentioned earlier, we had this date on the calendar for a while and didn't want to change the title, but 2019.4 came out a little while ago.

41:20
Maybe some of you got to go to the Tableau Conference last week and see some of the new features, but for 2019.4, here are some of the features I would have talked about if I had done this for that version. Again, I mentioned Explain Data; they've already made enhancements to it, more data types, more information coming from Explain Data. If you're using automated workflows, there are now webhooks so you can automate those, which is another nice feature. The ability to create extracts from the browser is something that, again, has been long overdue in my opinion as well; that was something that had to be done in Desktop.

41:55
They're making a good push here to make almost everything available through the web and remove the limitation that it had to be done with the desktop tool, as well as table improvements in the browser, with more data and information you can bring in now that was limited in the past. And then from the conference, just some of the things they teased; we don't have exact dates yet, and we'll continue to update you via webinars and our blog, but for 2020 and beyond, big features like being able to write back to the database from the Tableau Prep tool.

42:33
Tableau Prep is apparently supposed to be 100% browser-based in the future, so there shouldn't be any features missing in the browser that you can do in the tool. So again, as I mentioned, a big push to have browser support and not be missing any functionality, which I'm excited to see as well, plus other cool features like dynamic parameters and visualization animations. There's a whole long list; I can send it to you if you're interested.

43:01
You can reach out to us, but it's a very exciting 2020 for Tableau Server and Tableau features. So keep a lookout for future webinars and blog posts, and we'll keep you updated on all those exciting new things.

43:17
That's great stuff, Todd. I think when they switched to these year.quarter release names, they took away all the marketing splash you could make with these things, because it used to be that if you had a big version like v10 or v11 or something like that, you'd know they'd put a bunch of stuff into it. But 2019.3 to me really feels like a very major release across all these different fronts.

43:45
So thanks for some great information there. Stick around for the Q&A; get your questions into the questions pane and we will get to those in just a couple of minutes. As you noticed from the presentation, Todd presented a wide array of substantial improvements in Tableau that increase its ease of use but also its complexity, as you are now able to deploy to more users, scale it out to the enterprise, monitor the performance, and do things

44:16
like resource management and so on, which takes a dedicated server. So there's a lot of complexity behind the scenes there, and that's where we can really come to bear to help you, either on the front end or the back end. We have an entire team of Tableau experts that can fill in as needed in your organization to help you create very impactful dashboards, as well as folks like Todd and folks in his practice who can really help you leverage and harness the new features of Tableau,

44:46
getting you up to the latest versions and migrating smoothly to stable current versions. We can give you everything from high-level strategic guidance to roll-up-our-sleeves, boots-on-the-ground assistance and everything in between. So reach out to us if you have any need for that. Again, stick around for the Q&A; I'll just give you a couple of quick slides on Senturus and who we are. We are an analytics consulting firm providing a full spectrum of consulting

45:16
and training. Our clients know us for providing clarity from the chaos of complex business requirements, myriad and ever-expanding disparate data sources, and constantly moving targets and changing regulatory environments. We've made a name for ourselves because of our strength at bridging the gap between IT and business users in organizations. We deliver solutions that give you access to reliable, analysis-ready data across your organization,

45:40
so you can quickly and easily get answers at the point of impact, in the form of the decisions you make and the actions you take. As I mentioned, we are a full-spectrum consulting company for business intelligence. Our consultants are leading experts in the field of analytics with years of pragmatic, hands-on, real-world experience. In addition, we are migration specialists, helping you get from various tools, whether it's Cognos to Tableau, Cognos to Power BI, Tableau to Power BI, wherever you need to go based upon your business requirements and the needs of your

46:17
organization; we can help you get there. In fact, we're so confident in our team and our methodology that we back our projects with an industry-unique 100% money-back guarantee. Add to that that we've been doing it for a while, going on two decades here, with over 1,300 clients ranging from the Fortune 500 down to the mid-market.

46:35
I'm sure you'll recognize many of the names here on our NASCAR slide. With 2,500-plus successful projects under our belt, ranging from sales and marketing to finance, human resources, IT and other lines of business, we've answered a lot of business analytics questions. So the next time you have an analytics project or require training in those areas, please pick up the phone or reach out to us and let us bring our expertise to bear for you. Now, some great additional free resources here. Continuing on to the next slide, our upcoming events: if you head over to senturus.com/events, we're doing a webinar on Power BI DAX.

47:17
We'll do an overview and talk about the mechanics of DAX, otherwise known as Data Analysis Expressions, and that's on December 5th at the same time you're sitting here today. Same thing for December 12th: we'll be doing Tableau Dashboards from Default to Dazzling; our Tableau expert Monica Van Loon will be presenting on how to build better-looking charts using custom formatting versus just the straight out-of-the-box stuff. So join us for that. We also have, if you head over to the senturus.com resources page,

47:47
another page you should definitely bookmark and visit frequently.

47:51
It has all of our upcoming events like the ones I just mentioned, in addition to the resource library where, again, you can find our past webinars, including this one, along with the recording, the slide deck and the question log, and our fabulous blog that has nice, bite-sized, easily consumable information about what's top of mind at Senturus and the latest and greatest in the industry. And I'd be remiss if I didn't talk about our awesome training. If you head over to senturus.com/training, we offer a full spectrum of training for Cognos, Tableau and Power BI, whether it's corporate training across all these different modalities for a large organization, all the way to self-paced learning, mentoring to address specific business problems, and instructor-led online courses taught by industry-leading, world-class instructors. Along those lines,

48:44
we have our very special annual cyber sale, where we're offering 50% off of our instructor-led training; that kicks off on December 2nd and ends on Friday, December 6th. So look for that over at senturus.com/training; it's a great deal, once a year. And then the last thing about training: we have some great new self-paced Tableau training, a complete array of Tableau courses, everything from fundamentals all the way through to expert Tableau development, again led by fabulous, world-class

49:17
instructors, and we have a great new all-access pass for a full year for only $499. I'd add to that, we also have a multi-platform pass where you can get access to all of our Cognos courses and all of our Tableau courses for a nice little price of $699 across both platforms. So if your organization is bimodal, that's a great way to go.

49:40
All right, and with that we will come back here to the questions. Now, Todd, hopefully you had a chance to look through those; I flagged the first couple there, and it's a good question about the data management tool: does that require the Server Management add-on?

50:01
I believe it does; I'd have to check the licensing on that. I don't know if anybody on the line knows for sure; if any of our panelists know off the top of your head, you can chime in. Otherwise we can put it in the question log and publish it along with the rest of the answers to the questions.

50:26
Then there's a question about whether a dedicated server is recommended to run the Server Management add-on in a two-node environment. Yes, according to what they're saying, it should have its own dedicated server. It's going to need its own Postgres database and it's going to be fed from the agents that will be running, so I believe it's, I guess,

50:55
possible to share, but recommended to have its own dedicated server for the server management

51:02
resource monitoring, if that's what you're asking, Mike.

51:07
Yeah, I think that's what they're asking, and I think it's kind of a function of, I mean, generally you're saying that's recommended, and it sort of depends on what your two-node environment looks like in terms of the load on it and things like that, right? Yeah, assuming it's the standard setup, either a clone of the original server that's just load-balanced, or you have dedicated backgrounders on the second server.

51:32
I don't know what your environment looks like, but assuming you want to monitor both of those, you probably want to have agents on the two nodes and then a separate dedicated master that's collecting that information and providing those resource monitoring capabilities.

51:51
Also, we got a bunch of questions about the licensing on the migration tool. I will definitely confirm a hundred percent on that in the Q&A,

52:01
unless someone on the call knows for sure, but I believe it does require it. Again, I'll confirm in the Q&A notes that we post with the slide deck and the video. Sounds good. The only other thing I'd mention: one of these questions is about the upcoming "noodle" data model feature that they mentioned at the conference, and that's a roadmap item.

52:23
And that's really a huge one in my eyes, because that's around handling multiple different grains of data so you can handle multi-fact joins, and I think it will open up so many more use cases that you'll be able to model in Tableau. I've always said that the modeling capabilities in Tableau as it stands, in the data source pane, are somewhat limited, so it forced you to model your data further upstream. With the noodle capability and the ability for Tableau to identify and let you handle those types of complex multi-fact scenarios and data at different

53:04
grains, it really opens up a world of possibilities. So that'll be an exciting feature that we're expecting to see in 2020.

53:12
Yeah, hoping we don't have to wait till 2020.3 to get that. I have a hunch, because it's non-trivial; I think there's a reason it hasn't been in the product to this point. Although I kind of thought when they created federation and handled it in v10 that this would be close on its heels, and it's been a few years since that came out.

53:34
Great. Well, I'm trying to think if there are any other questions, folks; please get them into the question log. I think we've depleted the questions that are here in the log so far, so Todd, I guess your presentation was so thorough and clear that it answered all of their questions and erased all doubt; they are a hundred percent clear. And one of our panelists has just said that the migration tool does indeed require the Server Management add-on, so I'll put that in the Q&A follow-up that we'll attach

54:04
for anybody who may have signed off early. But if you're still on the line, yes, you do need that in order to use the migration tool. And the Server Management add-on, I believe, is user-based and it's kind of all-or-nothing, right? I think all the users have to get it; you can't just pick it up for a chunk of users. So it definitely adds a lot of value, but it is something you want to consider from a cost-benefit perspective.

54:34
Great. Well, if there are no further questions, then first of all, thanks to Todd for putting on another great presentation; there are a lot of great features here. Hopefully you all learned a lot today, and thank you all for joining us on this webinar. We hope to see you at an event coming up soon, and again, reach out to us for any of your analytics consulting, training or software needs. We look forward to seeing you at an upcoming event. Thanks and have a great rest of your day. Bye now.

Connect with Senturus

Sign up to be notified about our upcoming events
