Tableau Permissions: Automated

Here’s the slide deck from the Tableau Admin group meeting. Please let me know if there are questions. A few things to note:

1.) There is a complete ‘User’ module that does the following:

  • add/remove users via local auth
  • add/remove users via AD
  • add/remove groups

2.) These are meant to be used together. For example, you’d first add a group, then run the Set-TsPermission function. Oh, and it also removes the ‘All Users’ group permissions.
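If you want a feel for the flow before opening the deck, here’s a minimal sketch. Only Set-TsPermission is named in the module; Add-TsGroup and all parameter names here are placeholders for whatever your version exposes:

```powershell
# Rough sketch of the intended flow. Only Set-TsPermission comes from the
# module; Add-TsGroup and the parameter names are placeholders.

# 1. Create the group (local auth or AD, per the User module)
Add-TsGroup -Name 'Finance-Analysts'

# 2. Grant that group explicit permissions on a project; per the deck,
#    the function also strips the default 'All Users' permissions so
#    nothing is granted implicitly
Set-TsPermission -Group 'Finance-Analysts' -Project 'Finance' -Role 'Interactor'
```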

And, again, part of permissions is the monitoring aspect: if you’re not aware of what’s happening, it’s very hard to adjust and grant permissions effectively.

 

Use Tableau dashboards and write data to a database

That’s right! Enter any value in your Parameter field(s) and have data written to a database (choose your flavor).  Keep history, study filter trends (and timing), leverage it as a form-enabler and/or data analytics tool. Now Tableau *can* take user input and write that input to a database.

I’ll let that soak in…Filter/Parameter values being written back to a database.


Okay, here’s how to do it.

Step 1

Make a Dashboard and toss some Parameters on it (any data type works). NOTE: you can also use filters because that data is also logged. But parameters are fun because so many people think they aren’t dynamic. It’s time to prove they are, in fact, very dynamic.

Step 2

Use the logs.

Every event performed on a dashboard/workbook is logged. It’s just a matter of finding the right key/value pair, parsing it, and then adding it to a database. For example, if I have a free-form parameter and I enter ‘2,000,000’, it is logged along with the parameter number (note this is different from your parameter caption) as well as user, session, request, and a bunch of other stuff.

This means you can keep a history of who changed the parameter, the old value, the new value, and a whole bunch of other stuff.
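As a rough sketch, here’s what pulling those events out of a vizqlserver JSON log might look like in PowerShell. The log path varies by Tableau version, and the ‘set-parameter-value’ event key is an assumption; check a line from your own logs for the exact key your version emits:

```powershell
# Sketch: filter parameter-change events out of a vizqlserver JSON log.
# The path and the event key are assumptions -- verify against your logs.
$logPath = 'C:\ProgramData\Tableau\Tableau Server\data\tabsvc\vizqlserver\Logs\vizql-0.log'

Get-Content $logPath |
    ForEach-Object { $_ | ConvertFrom-Json } |          # one JSON object per line
    Where-Object { $_.k -eq 'set-parameter-value' } |   # the event key of interest
    Select-Object ts, user, sess, req, @{ n = 'value'; e = { $_.v } }
```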

Step 3

Create some tables in your database. (NOTE: you only do this once.)

This is simple if you just want to keep a log of the changing parameter values. If, on the other hand, you want to keep performance metrics or track which workbooks/sheets are being used, then I’d recommend you also parse and analyze the Apache logs. I’ve talked about this a lot here, here, here and here.
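Here’s a minimal sketch of that one-time setup. The schema is just a suggestion, and Invoke-Sqlcmd (from the SqlServer module) plus the server/database names are assumptions:

```powershell
# One-time setup: a minimal history table for parameter changes.
# Schema, server, and database names are placeholders.
$ddl = @"
CREATE TABLE dbo.ParameterHistory (
    Id        INT IDENTITY(1,1) PRIMARY KEY,
    EventTime DATETIME2     NOT NULL,
    UserName  NVARCHAR(100) NOT NULL,
    SessionId NVARCHAR(100) NULL,
    ParamName NVARCHAR(200) NOT NULL,
    OldValue  NVARCHAR(MAX) NULL,
    NewValue  NVARCHAR(MAX) NULL
);
"@
Invoke-Sqlcmd -ServerInstance 'YourSqlServer' -Database 'TableauUsage' -Query $ddl
```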

Step 4

Parse the logs and add to the tables.

This is simple since the main log is in JSON format. With some PowerShell you can do it in a simple function/module: one function for pulling the log data and one for adding it to a database.

If you have a log analytics tool in place, this becomes insanely easy since all your logs are centralized (they are, right?!). If not, you can just parse the log file in the default directory.
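A minimal sketch of that two-function module, under the same assumptions as the Step 2 snippet (the event key and the name/value shape of the `v` payload are things to verify against your own logs, and in production you’d want a parameterized INSERT rather than string-building):

```powershell
# Sketch of the two-function module. Verify the event key and payload
# shape ($_.v) against your own logs; use parameterized SQL in production.
function Get-TsParameterEvent {
    param([string]$LogPath)
    Get-Content $LogPath |
        ForEach-Object { $_ | ConvertFrom-Json } |
        Where-Object { $_.k -eq 'set-parameter-value' }
}

function Add-TsParameterEvent {
    param([Parameter(ValueFromPipeline = $true)] $Event)
    process {
        $insert = "INSERT INTO dbo.ParameterHistory
                   (EventTime, UserName, SessionId, ParamName, NewValue)
                   VALUES ('$($Event.ts)', '$($Event.user)', '$($Event.sess)',
                           '$($Event.v.name)', '$($Event.v.value)')"
        Invoke-Sqlcmd -ServerInstance 'YourSqlServer' -Database 'TableauUsage' -Query $insert
    }
}

# One pipeline: pull the events, load the table ($logPath from Step 2)
Get-TsParameterEvent -LogPath $logPath | Add-TsParameterEvent
```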

Step 5

Enjoy!

Use Cases

  • As a light-CRM
  • Data ‘picker’ tool for template dashboards (for example: choose between a line or bar chart, the choice gets written to the database, and another script picks it up and builds the workbook for you based on those choices).
  • Replacement for embedded forms
  • Full cycle analytics (from data entry to data analysis, quantified self stuff)
  • Triggers (for example: User A picks Parameter B and a CSV and/or PNG file is delivered)
  • Usage data about dashboards (for example: if you create 15 parameters and users only ever use 2, then you can remove the unnecessary stuff).
  • Permission approval form (for example: User A requests permission to Dashboard B)
  • Subscription forms
  • Form for filtered queries (for example: use fields to help analysts who might not know SQL to create custom SQL based on set fields/values)
  • And more!

What is Analytics Engineering?!

“It is easier to ship recipes than cakes and biscuits.” -John Maynard Keynes


As an Analytics Engineer, you may hear this a lot: ‘Wait, so you just make dashboards and reports, right?’ It’s common, but it’s often based on a deep misunderstanding of the analytics process. Let me attempt to clarify.

As an Analytics Engineer, you are responsible for moving the analytics and data processes into the layer above the data warehouse. Essentially, your task is to take a basic set of data and shape it into something both actionable and useful. Naturally, this task demands a skill set that is both broad and deep. And that is probably why people are so quick to assume you just ‘make reports and dashboards.’ They see the result of the work, not the process that created that report or dashboard. They did not see you toil with a dataset too wide and at the wrong granularity. They did not see you automate the ETL of said table into an aggregated table with security, alerting, and logging baked into the process. And, after all that, they did not see you worry about the font, color, and styling of the dashboard, because the effects of a terrible design are catastrophic for your message.

An Analytics Engineer must be, primarily, a technologist and a thinker. With the sheer volume of options available (throughout the Analytics and Data layers), to say there is constant context and task shifting is an understatement. It’s common to spend most of the day at the data (SQL) layer while spending a smaller portion of time creating and implementing a dashboard tool. And it’s that balance of time which allows for functional dashboards.

One vital piece that is worth mentioning (and possibly elaborating on) is that this field is not locked into one technology or stack (for example: the LAMP stack, cloud vendors, or analytics tools); rather, this field should be able to use all the tools and should leverage its own, more abstract, stack unrelated to any one technology. So, today it might be MySQL, but tomorrow it could be Redshift or the like. Therein rests the challenge: being able to adapt to the tool at hand without missing a beat.

 


Adding Tableau Server users

If you’re leveraging local authentication for Tableau Server, you might be wondering if there’s an automated / ad hoc way to remove or update users on your server and its Sites (where appropriate). Sure, you can do it manually. But who wants to do that? No one. If you have a lot of activity, these one-offs are bound to add up and eat away at both time and security.

Here’s a quick PowerShell function that will do that for you. This is designed to update users and assumes you already have an automated way of provisioning your new users and removing your old ones (please say you do). Again, by ‘update’, I mean removing the ability for a user to just go back to the site and select ‘Forgot your password’ after they have been terminated. That would be bad, especially if they have a valid email…
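The linked function below is the real thing; purely to illustrate the shape of the approach, here’s a minimal sketch built on tabcmd (which ships with Tableau Server). The server URL, admin account, and CSV path are placeholders:

```powershell
# Illustrative sketch only. Assumes tabcmd is on your PATH and a CSV
# with one username per line; tabcmd prompts for the admin password.
function Remove-TsUser {
    param(
        [string]$Server,
        [string]$AdminUser,
        [string]$UserCsv   # one username per line
    )
    tabcmd login -s $Server -u $AdminUser
    tabcmd deletesiteusers $UserCsv
    tabcmd logout
}

Remove-TsUser -Server 'https://tableau.example.com' -AdminUser 'admin' -UserCsv '.\terminated.csv'
```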

Get the code here.

Let me know if there are questions!

-Mike

Farmington Tableau User Group

We’re excited to bring Tableau to Farmington!

Our first meeting is Thursday, February 23 @ 3:30PM MST. Please attend if you’re able. We’re planning on gearing these meetings around the applied, real-world use of Tableau so they should be a blast.

Here are the details:

https://www.eventbrite.com/e/farmington-tableau-user-group-tickets-31849873830

See you there!

-Mike

Analytics as a Service: Tableau Rule 2 – Data


 

Forget about perfection

Your data (information) is a set of free-flowing, dynamic instructions about how business (or whatever) is understood. It will never be perfect. In fact, you don’t want it to be; if it’s perfect (which, again, is impossible) there’s no room for improvement or self-reflection. What’s more, it will lack creative impulse: you don’t think freely if something is ‘perfect’ and done.

Adapt with data

Data is fluid

The goal should be to bring the certification/governance process *to* the data. If you must wait, collect, meet, agree, and on and on, there is a critical piece missing: data should be certified by what it produces (or its many derivations). How does this work? How can you certify a CSV file? Simple: Alert, Integrate, and Monitor your Analytics Infrastructure.

Essentially, if nothing is created from the data, why would there be a need to certify it? Once something is created, then you relentlessly certify, in flight, what is being produced. It’s a sort of fact-checking, data-alerting mechanism that’s completely possible with the right framework. Which leads to the next point…

Collect data about patterns of usage

If you’re not analyzing usage patterns, you’re missing valuable data. With all analytics, there are reasons why (1) a specific set of data is selected and (2) a user attempts to do something with that data. You can easily keep this metadata in AWS S3 (with a good lifecycle policy) or store it somewhere else for potential later use. The point is that if you don’t understand the *why*, then you are only seeing one side of the coin.

Keep everything, then figure out what to do with it and how to certify it.

More data

Analytics as a Service: Tableau Rule 1 – Infrastructure

 


Leverage the cloud

Elasticity

Don’t be constrained or afraid to combine pieces of cloud technologies to serve the Analytics structure.

Become durable / resilient

Just in case

Even though cloud providers quote very high monthly uptime percentages, be prepared for something to break. If you do that, you’ll have even *more* creative freedom (crazy, huh?).

Choose to: (1) scale laterally or (2) scale vertically

This is all about re-framing the question around Projects vs Sites.

Why would you have Sites over Projects, or Projects over Sites? And that can’t be the only choice, right? (Hint: it’s not the only choice.)

I’ve seen benefits to both, but the extra work involved with Sites makes scaling laterally (Sites) much more difficult than vertically (Projects), not to mention the challenges of stepping into the compliance realm.

Eliminate waste

Storage

Remove all the pieces from your base install that can be done elsewhere (e.g., collect the ‘garbage’ but store it on AWS S3 with a good lifecycle policy). That way, your Analytics infra is light and fast.
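As one example of what that sweep could look like, here’s a minimal sketch. It assumes the AWS CLI is installed; the log directory, retention window, and bucket name are all placeholders:

```powershell
# Sketch: archive aged Tableau logs to S3, where a lifecycle policy on the
# bucket (configured separately) ages them out. Paths/bucket are placeholders.
$logDir  = 'C:\ProgramData\Tableau\Tableau Server\data\tabsvc\logs'
$archive = "ts-logs-$(Get-Date -Format 'yyyyMMdd').zip"

# Grab anything older than 30 days and zip it up
$old = Get-ChildItem $logDir -Recurse -File |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-30) }
Compress-Archive -Path $old.FullName -DestinationPath $archive

# Ship it to S3 and clear the local copy
aws s3 cp $archive "s3://your-analytics-garbage/tableau/$archive"
Remove-Item $archive
```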

I challenge you to think *bigger* with Tableau.  How can you provide more fluid access to insight than anything else?

-Mike

 

2017: Analytics or Reporting

I’m calling it: 2017 will not confuse Analytics with Reporting

We’ve got too much technology, tooling, and componentry to mix the two realms (hint: they’ve never been related… the ‘self-service’ myth hasn’t really separated them quite yet).

Analytics has depth and is fluid. Reporting is rigid and superficial.

Here is a small example of what I mean. Your #Fitbit is more than a report. Think about that and shake your, er, data-maker 🙂

Look for more on this and other tech bits this year.

Happy New Year!

-Mike

 

Preserve Order (CSV Order in Tableau)

Ever created a wonderful Tableau dashboard with the added ‘Export to CSV’ functionality? We all have. Click the super-sleek Excel icon and, voilà, the download begins. Send the file, walk away and think: ‘my, was that cool.’

[Screenshot ts-csv-twb: Nice and ordered.]

But wait. You get an email complaining about column order. For some reason, the columns you’ve added, perfectly, are all messed up. In fact, some would say they’re in alphabetical order. What the?!

[Screenshot ts-csv-file: Oh come on.]

Anyway, here’s an easy PowerShell function that will fix that and send the email with the columns in the correct order.
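The gist of it is a minimal sketch like the following; the column names, file paths, and SMTP details are all placeholders:

```powershell
# Sketch: re-impose the column order you built in Tableau, then mail the
# result. Column names, paths, and SMTP details are placeholders.
$columnOrder = 'Region', 'Category', 'Sales', 'Profit'

Import-Csv '.\export.csv' |
    Select-Object $columnOrder |       # reorder columns explicitly
    Export-Csv '.\export-ordered.csv' -NoTypeInformation

Send-MailMessage -From 'tableau@example.com' -To 'analyst@example.com' `
    -Subject 'Your export (columns in the right order this time)' `
    -Attachments '.\export-ordered.csv' -SmtpServer 'smtp.example.com'
```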