Use Tableau dashboards and write data to a database

That’s right! Enter any value in your Parameter field(s) and have that data written to a database (choose your flavor). Keep history, study filter trends (and their timing), and leverage it as a form enabler and/or a data-analytics tool. Yes, Tableau *can* take user input and write that input to a database.

I’ll let that soak in…Filter/Parameter values being written back to a database.


Okay, here’s how to do it.

Step 1

Make a dashboard and toss some Parameters on it (they can be any data type). NOTE: you can also use filters, because that data is logged too. But parameters are fun because so many people think they aren’t dynamic. It’s time to prove that they are, in fact, very dynamic.

Step 2

Use the logs.

Every event performed on a dashboard/workbook is logged. It’s just a matter of finding the right key/value pair, parsing it, and then adding it to a database. For example, if I have a free-form parameter and I enter ‘2,000,000’, that value is logged along with the parameter number (note: this is different from your parameter caption) as well as the user, session, request, and a bunch of other stuff.

This means you can keep a history of who changed the parameter, the old value, the new value, and a whole bunch of other stuff.
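For context, the vizqlserver logs are written as one JSON object per line. A parameter-change entry looks roughly like this (the field names and the `set-parameter-value` command are from memory of a 10.x-era install, and every value here is made up, so verify against your own logs):

```json
{"ts":"2016-11-20T10:15:02.123","pid":8344,"tid":"1a2b","sev":"info",
 "req":"WBdEadBeEfAAQrSGkk1wAAAPc","sess":"6D99C1B0ABC123",
 "site":"Default","user":"mike","k":"command-pre",
 "v":{"args":"tabdoc:set-parameter-value \"Parameter 1\" \"2,000,000\""}}
```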

Step 3

Create some tables in your database. (NOTE: you only do this once.)

This is simple if you just want to keep a log of the changing parameter values. If, on the other hand, you want to keep performance metrics or track which workbooks/sheets are being used, then I’d recommend you also parse and analyze the Apache logs. I’ve talked about this a lot here, here, here and here.
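As a sketch, the one-time setup might look like this (assuming SQL Server; the connection string, table name, and columns are hypothetical, so adjust for your flavor):

```powershell
# One-time setup: create a table to hold parameter-change events.
# The connection string, table name, and columns are hypothetical -- adjust for your database.
$connectionString = 'Server=myDbServer;Database=TableauLogs;Integrated Security=True'

$createTable = @"
CREATE TABLE dbo.tableau_param_log (
    id          INT IDENTITY(1,1) PRIMARY KEY,
    event_time  DATETIME2     NOT NULL,
    site_name   NVARCHAR(255) NULL,
    user_name   NVARCHAR(255) NULL,
    session_id  NVARCHAR(255) NULL,
    param_name  NVARCHAR(255) NULL,
    param_value NVARCHAR(MAX) NULL
);
"@

$conn = New-Object System.Data.SqlClient.SqlConnection $connectionString
$conn.Open()
$cmd = $conn.CreateCommand()
$cmd.CommandText = $createTable
[void]$cmd.ExecuteNonQuery()
$conn.Close()
```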

Step 4

Parse the logs and add to the tables.

This is simple since the main log is in JSON format. With some PowerShell, you can do this with two simple functions (or one module): one for pulling the log data and one for adding it to the database.

If you have a log analytics tool in place, this becomes insanely easy since all your logs are centralized (they are, right?!). If not, you can just parse the log file in the default directory.
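Here’s a minimal sketch of those two pieces (the default log path and the k/v field names are assumptions based on a 10.x-era install; the table is the one from Step 3):

```powershell
# Sketch: pull parameter-change events out of the vizqlserver JSON logs.
# The default path and the k/v field names vary by version -- verify against your install.
function Get-TableauParamEvents {
    param(
        [string]$LogPath = 'C:\ProgramData\Tableau\Tableau Server\data\tabsvc\vizqlserver\Logs\*.txt'
    )
    Get-Content $LogPath |
        ForEach-Object { $_ | ConvertFrom-Json } |
        Where-Object { "$($_.v)" -match 'set-parameter-value' } |
        Select-Object ts, site, user, sess, k, v
}

# Sketch: add each event to the table from Step 3 (k/v are mapped loosely onto name/value).
function Add-TableauParamEvent {
    param(
        [Parameter(ValueFromPipeline)] $Event,
        [string]$ConnectionString = 'Server=myDbServer;Database=TableauLogs;Integrated Security=True'
    )
    begin {
        $conn = New-Object System.Data.SqlClient.SqlConnection $ConnectionString
        $conn.Open()
    }
    process {
        $cmd = $conn.CreateCommand()
        $cmd.CommandText = 'INSERT INTO dbo.tableau_param_log
            (event_time, site_name, user_name, session_id, param_name, param_value)
            VALUES (@ts, @site, @user, @sess, @k, @v)'
        [void]$cmd.Parameters.AddWithValue('@ts',   [string]$Event.ts)
        [void]$cmd.Parameters.AddWithValue('@site', [string]$Event.site)
        [void]$cmd.Parameters.AddWithValue('@user', [string]$Event.user)
        [void]$cmd.Parameters.AddWithValue('@sess', [string]$Event.sess)
        [void]$cmd.Parameters.AddWithValue('@k',    [string]$Event.k)
        [void]$cmd.Parameters.AddWithValue('@v',    ($Event.v | ConvertTo-Json -Compress))
        [void]$cmd.ExecuteNonQuery()
    }
    end { $conn.Close() }
}
```

Wire them together with `Get-TableauParamEvents | Add-TableauParamEvent` and schedule it however you like.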

Step 5

Enjoy!

Use Cases

  • A lightweight CRM
  • Data ‘picker’ tool for template dashboards (for example: choose between a line or bar chart, that choice gets written to the database, and another script picks it up and builds the workbook for you based on those choices)
  • Replacement for embedded forms
  • Full-cycle analytics (from data entry to data analysis, quantified-self stuff)
  • Triggers (for example: User A picks Parameter B and a CSV and/or PNG file is delivered)
  • Usage data about dashboards (for example: if you create 15 parameters and users only ever use 2, you can remove the unused ones)
  • Permission approval form (for example: User A requests permission to Dashboard B)
  • Subscription forms
  • Form for filtered queries (for example: use set fields/values to help analysts who might not know SQL build custom SQL)
  • And more!

Adding Tableau Server users

If you’re leveraging local authentication for Tableau Server, you might be wondering if there’s an automated/ad hoc way to remove or update users on your server and its sites (where appropriate). Sure, you can do it manually. But who wants to do that? No one. If you have a lot of activity, these one-offs are bound to add up and eat away at both time and security.

Here’s a quick PowerShell function that will do that for you. It’s designed to update users and assumes you already have an automated way of provisioning new users and removing old ones (please say you do). By ‘update’, I mean removing the ability for users to just go back to the site and select ‘Forgot your password’ after they’ve been terminated. That would be bad, especially if they still have a valid email…

Get the code here.
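If you just want the gist, here’s a minimal sketch of the idea. The sign-in and update-user endpoints are Tableau’s REST API; the API version, the LUID parameters, and the password-scramble approach are my assumptions, so lean on the linked code for the real thing:

```powershell
# Sketch: scramble a local-auth user's password via the Tableau Server REST API
# so a terminated user can't use 'Forgot your password'. The API version and the
# LUID parameters are assumptions -- see the linked code for the full version.
function Update-TableauUser {
    param(
        [string]$Server = 'https://tableau.example.com',
        [string]$ApiVer = '2.3',
        [Parameter(Mandatory)][string]$SiteId,   # site LUID
        [Parameter(Mandatory)][string]$UserId,   # user LUID
        [Parameter(Mandatory)][string]$AdminUser,
        [Parameter(Mandatory)][string]$AdminPassword
    )
    # Sign in and grab an auth token
    $signInBody = @"
<tsRequest>
  <credentials name="$AdminUser" password="$AdminPassword">
    <site contentUrl="" />
  </credentials>
</tsRequest>
"@
    $auth  = Invoke-RestMethod -Uri "$Server/api/$ApiVer/auth/signin" -Method Post -Body $signInBody
    $token = $auth.tsResponse.credentials.token

    # Reset the password to a random GUID nobody will ever guess
    $newPassword = [guid]::NewGuid().ToString()
    $updateBody  = "<tsRequest><user password=`"$newPassword`" /></tsRequest>"
    Invoke-RestMethod -Uri "$Server/api/$ApiVer/sites/$SiteId/users/$UserId" `
        -Method Put -Body $updateBody -Headers @{ 'X-Tableau-Auth' = $token }
}
```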

Let me know if there are questions!

-Mike

Preserve Order (CSV Order in Tableau)

Ever created a wonderful Tableau dashboard with the added ‘Export to CSV’ functionality? We all have. Click the super-sleek Excel icon and, voilà, the download begins. Send the file, walk away, and think: ‘my, was that cool.’

[Image: the columns in the workbook. Nice and ordered.]

But wait. You get an email complaining about column order. For some reason, the columns you so carefully arranged are all messed up. In fact, some would say they’re in alphabetical order. What the?!

[Image: the exported CSV, columns now alphabetical. Oh come on.]

Anyway, here’s an easy PowerShell function that will fix that and send the email with the columns in the correct order.
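Something along these lines does the trick (the column order, file paths, and mail settings below are placeholders):

```powershell
# Re-order the exported CSV's columns and email the result.
# Column names, paths, and SMTP details are placeholders -- adjust to taste.
$desiredOrder = 'Region', 'Category', 'Sales', 'Profit'

Import-Csv 'C:\Exports\dashboard.csv' |
    Select-Object $desiredOrder |
    Export-Csv 'C:\Exports\dashboard_ordered.csv' -NoTypeInformation

Send-MailMessage -From 'tableau@example.com' -To 'analyst@example.com' `
    -Subject 'Your export, columns in the right order this time' `
    -Attachments 'C:\Exports\dashboard_ordered.csv' `
    -SmtpServer 'smtp.example.com'
```

`Select-Object` keeps whatever order you hand it, so the export preserves your column order instead of the alphabetical one.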

 

A.I.M. for Success with Logentries and Tableau

#Data16 may be over and the Server Admin session may have ended, but don’t let the fun stop there. Continuing the recommendation (and urgency) of monitoring your Tableau/analytics infrastructure, Logentries and I have teamed up on a whitepaper covering all things Alerting, Integrating, and Monitoring.

You’ll find a very thorough analysis of *why* it’s important to have a strategy in place, as well as tips/tricks and recommendations for further reading. What’s more, you’ll find out how easy it is to get a variety of log data back into Tableau for deeper analysis.

So, get the Whitepaper and spend Thanksgiving implementing it. Just kidding. Take a break for Thanksgiving and then do this 🙂

Best,

Mike

 

Hello Analytics Engineering

I’ve talked a lot about Analytics and how it must be re-imagined for today’s often frantic pace of innovation in both technology and theory. What’s typically missing is the other side of the Analytics coin: Engineering. Most people tend to forget that before anyone can explore or view any sort of analytics, a lot of work has to go into preparing that data. And that doesn’t even include ‘Self-Service’ Analytics, which is another story in and of itself!

You often hear: “Well, I just want all the data, I don’t care how hard it is.” Which translates to: “I don’t know what I want but tons of potentially useless data might get me an answer.”

Enter Analytics Engineers.

The data world is modular and in constant flux. One must be able to adapt, move data, and present tools in the most efficient and scalable way possible.

Enter Analytics Ops.

In order to do that, there has to be a ‘glue’ that can hold all the pieces of the Analytics/Data world together. That glue is #PowerShell, and I’m thrilled to say that I’ve been selected as a speaker at the PowerShell + DevOps Global Summit 2017. Twice!

I’ll be speaking about a different use for PowerShell and one that I’m pretty excited to share with the world. Basically, PowerShell and Data go very well together.

So, here are the two sessions I’ll be speaking at:

Session 1

Time:

Tuesday at 3pm PST

Title:

Operation Float the Bloat: Use PowerShell for Tableau Server QA and Alerting

Abstract:

Business Intelligence and, generally speaking, the data landscape is a mixture of moving parts, sometimes so many that it’s hard to keep track of which process does what. An Enterprise BI platform is just that, a platform. Within it, we’re dealing with data from APIs, databases (relational and non-relational), text files, and more. Missing from most analytics infrastructures, however, is a proper log-analytics and QA strategy. How do you know your platform is performing as it should? How do you know your data is secure? How do you streamline analytics so users are left with correct and fast data? How do you ensure users are publishing quality content at scale? In this session, we’ll show you how to do all that and more with Tableau Server and PowerShell by focusing on three pillars: Alert, Integrate, and Monitor. We’ll use PowerShell and custom functions to ‘Garbage Collect’ old content and archive it on Amazon S3, and we’ll leverage log files from both the BI platform and our servers (Windows and Linux) to monitor and maintain the health of the analytics infrastructure. We’ll use PowerShell to easily convert the analytics data (worksheets and views) into a medium in which we can change anything. We’ll also use PowerShell to dynamically create content in Tableau based on a configuration file. Oh, and all of this is automated, because PowerShell can make it so.

Session 2

Time:

Wednesday at 10am PST

Title:

Using PowerShell for Analytics Engineering (or why PowerShell is the glue for data, big or small).

Abstract:

While there are numerous and exceptional benefits to using PowerShell as an IT Pro or Developer, the hidden gem is its capability for Business Intelligence and Analytics Engineering. In simple terms, it is the linchpin of data and analytics. In this session, I’ll demonstrate how PowerShell is used to load, query, and aggregate data for Business Intelligence platforms, specifically Tableau Server. What’s more, we’ll automate everything from AWS EC2 instance provisioning to report generation, report delivery, and log analytics. Want integration too?! We’ll show you how to reach into pretty much anything via APIs using tools like Chocolatey, cURL, WMI, Git, and Remoting. Oh, one more thing: we’ll use PowerShell classes along with scheduled jobs to make platform administration simple and stable. In the end, we’ll have modules, functions, and custom scripts. Adding PowerShell to your data toolbox will provide enormous benefits, not the least of which is adaptability in the rapidly changing data landscape.

 

 

Hope to see you there!

-Mike