[sws_grey_box box_size="690"]SUMMARY – The last few days I've been immersed in the annual Tableau Data Conference. It's been a real eye-opener. [/sws_grey_box]
Regular readers will know that I have been following Tableau for a while, largely because I am increasingly hearing good things from customers, which in turn is reflected in rapidly accelerating revenue. For those unfamiliar with the company, they are in what some call the data visualization space. I prefer to think of them as representative of the 21st century business intelligence and analysis space. If that sounds shocking then read on.
Solving 21st century problems
Tableau users are solving 21st century problems. Examples: I've already referred to the Seattle Sounders looking at aligning player performance to fitness regimes and the data needed to get that done. You won't find that data in any conventional application software. At Wells Fargo, their customer groups are looking at what some call 'low value' weblogs in order to understand basic customer activity patterns. Again, you won't find that data inside a conventional enterprise application. And at Seattle Children's Hospital, they have built a high-speed database that ingests the 200-300 daily measurements from some 250 patients to start improving patient care and times to treatment. There is no application source for that data.
Tableau, along with data cleansing and transformation tools like Alteryx, allows people to easily visualize that data in new and interesting ways. And it can be done very quickly. Let me explain how.
I took a 15,000-row database that covers how the GCloud has performed since 2012. There are not many columns to it, and UK Gov has already built a set of graphs showing selected versions of that data. The representations are OK. They're moderately informative, but it is hard to get a good overall picture and you can't timeslice the data. Neither can you figure out who the biggest spenders are or where they are spending. I took that dataset and gave it to a Tableau product marketer and asked him to show me how to build a dashboard that represents that data in a reasonably useful form.
Here is the result.
What you see here is only a screenshot but I have published the full result in an interactive dashboard at this URL.
What you see above took 45 minutes. It hasn’t been sanitized in any way. Very little has been done to style the dashboard and you can argue that it misses some data like who the money is going to. But what you can instantly see is that the Home Office is the biggest spender and that ‘specialist cloud services’ is where the vast majority of the money is going across all buyers. You can also see that relatively speaking, SME suppliers are enjoying just about the same amount of spend coming their way as their larger brethren. GCloud fans will be pleased to see that 😉
If you head over to the URL, you will be able to slide the timeline to find different spend figures at different points on the continuum. You can also click on any of the blocks and bars at the top to discover who is spending what and where.
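For readers who want to sanity-check the kinds of roll-ups the dashboard surfaces outside of Tableau, here is a minimal sketch in Python using pandas. The column names and the toy data are hypothetical, since the actual GCloud dataset schema isn't reproduced here; the point is only to show that the dashboard's headline views are simple group-by aggregations:

```python
# Hypothetical sketch of the dashboard's roll-ups using pandas.
# Column names and figures are assumptions, not the real GCloud schema.
import pandas as pd

# Toy stand-in for the GCloud spend dataset
data = pd.DataFrame({
    "buyer": ["Home Office", "Home Office", "MOD", "HMRC"],
    "lot": ["Specialist Cloud Services", "IaaS",
            "Specialist Cloud Services", "Specialist Cloud Services"],
    "supplier_type": ["SME", "Large", "SME", "Large"],
    "spend": [120000, 30000, 45000, 60000],
})

# Biggest spenders, total spend by lot, SME vs large supplier split
by_buyer = data.groupby("buyer")["spend"].sum().sort_values(ascending=False)
by_lot = data.groupby("lot")["spend"].sum()
supplier_split = data.groupby("supplier_type")["spend"].sum()

print(by_buyer.index[0])   # top-spending buyer
print(by_lot.idxmax())     # lot attracting the most spend
```

What Tableau adds over a script like this, of course, is the interactive layer: the timeline slider and clickable blocks that let a reader slice the same aggregates without writing any code.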
I must stress this is only a first cut and I was coming to this completely fresh, with no understanding of the product whatsoever other than a few quick views of some videos. Even so, I believe that what you see here is more useful than what is already available in the public domain, although that is not to say the government has done a poor job. They haven't.
This tool opens up all sorts of possibilities for us at diginomica and for anyone who needs to heft data into a form that both makes sense and is actionable. The fact I can publish out to a shareable URL is a huge deal because now I can freely distribute information to whoever needs it, into the public domain, or I can keep it private.
From a training perspective, I expect to put in a good 30-40 hours to ensure I fully understand the basics and a bit more besides. To me, the benefits far outweigh the costs. You will see us making much more use of data in our work going forward. So why am I so excited?
I’ve been attending events for more than 20 years. In all that time I have never been an attendee with a partial agenda that includes reviewing a piece of software as a potential buyer. There hasn’t been the need. But our business is changing rapidly and we have to respond the same as any other.
Our business-critical data doesn't live inside neat packages but inside public records, weblogs, clickstreams, Google, Facebook, Twitter and many other places. It is incredibly messy. But inside those data lie the insights that help us shape content, deliver better insights to those who read our stuff, and help our partners build better content.
In short, used correctly, Tableau will help make us smarter with data that today is hard for us to parse.
No brainer pricing
For us, Tableau is a no-brainer purchase. For the first time, I am seeing a solution that ANY business can afford in order to get started. That fits very well with the company’s stated objective of ‘land and expand’ where, as use cases increase, you buy more licences. In large companies, Tableau expects to do $100,000 deals with increasing regularity. Us? Not anytime soon.
Back in the day, as in 1997, there was no way you could do that. Solutions like Cognos, BusinessObjects, Hyperion and so on were well beyond the scope of the small business. Even today, enterprise grade BI tools are expensive, although price points are coming down. Even if you invest in the tools mentioned, it is questionable whether they will help you solve 21st century problems. That's because the data you really need doesn't live in any system of record, and even though the tools are advanced, they come with the baggage of being built for a different time and a different purpose.
Is Tableau a cure-all? No. If you have heavy computational requirements then it is not the right solution because Tableau doesn't have those capabilities. In those cases, you'd be better off looking at something like Anaplan, Adaptive Insights or similar. Maybe even Excel with its 'free' Power add-ins. In the latter case, make sure you've got someone on hand who can sense-test your logic. Excel doesn't do that for you.
Storytelling with data
My final word on Tableau from this conference is about storytelling. When I saw the agenda I was surprised to see Neil deGrasse Tyson and Michael Lewis as keynoters. On reflection it makes perfect sense. During his keynote, Christian Chabot, Tableau's CEO, repeatedly talked about the notion of data as the jumping-off point for storytelling.
As I met with customers, they consistently said that their biggest challenge is communicating in a language that their users understand. That's all about storytelling. When Tyson and Lewis were on stage, each gave their version of the storytelling idea; Tyson from the scientific standpoint, and Lewis in the context of Moneyball. To loosely quote Lewis: explaining data is hard, but if you can find a way to do so that your mother would understand then you've probably got it right.
Tableau goes a long way towards doing that, with the added benefit of allowing users to explore the data for themselves. It is that interactive quality that makes it so much more useful and accessible as a discussion and decision-making tool.
Can you tell? I love it.
Disclosure: Tableau funded most of my travel and expenses for the conference and have comp’d me a single user licence of Tableau Desktop for the next year. We will likely licence further Tableau solutions.