Offline Admin Views

Ok, you have a great set of admin views for your Tableau Server(s), but how useful are they if your server has gone down and all of your monitoring or log analysis has gone down with it? Do you go old school and go through the logs manually or do you use your offline admin views?

This post, although brief, is based on a question I asked myself recently: do I have admin view redundancy?

You and your team have worked hard building your admin views and, like my team, have published a number of data sources to the server(s) to reuse on multiple dashboards, etc., as any conscientious Tableau publisher should! However, by only having your dashboards and data sources available on the server – especially when you have included component or log analysis for the purpose of problem solving – you may have inadvertently created a dependency: you are as reliant on the service as your users.

The first part of the answer is simple: make sure you save a copy of your admin views on your network or your local computer (a network share is my preference, as multiple team members can access it). It really is very easy to forget to save a copy somewhere that everyone can get to.

Your data is a different story. If you’re using Postgres, you’re going to lose it unless you have a mirrored version of the db located off the server, and your extracted data sources are likely to be out of date in your backups too. In the event your actual server(s) is/are inaccessible for a period of time, your logs (even the OS logs) are going to be unavailable as well (cringe). BUT – if you’re using Splunk and have a nice little instance collating your logs, you’re in luck.

[Image: offline admin views]

So… let’s recap: your servers are down and you have a copy of your admin views available on your network. Your live Splunk data sources in your admin views are actually still working fine, giving you a fully searchable set of views with performance metrics, errors from the logs and even a list of events from the OS, right up to the last second. How great is that?! No guessing, no assumptions about what could have gone wrong… It’s all there, right in front of you.

Of course, you should always have third-party applications and servers such as SCOM monitoring and Netcool doing their bits, but having this admin view redundancy in place gave me a feeling of relief and security – especially as a major part of my problem solving now revolves around the analysis and output from these views.

Overview of what Splunk gives you out of the box for your admin views:

  • Log collation – all of the Tableau Server logs in one place and one format.
  • Perfmon data – key perfmon data such as memory, CPU, disk and network usage.
  • Windows Events – system, application and security logs from the OS giving you everything that happened outside of the application.

With this data alone, I reckon, anyone can figure out what happened to their servers; even a non-technical user can look at the views, if they are portrayed well, and pick out the pertinent errors and problems.

I would be interested to hear about your monitoring and admin view redundancy. Do you use admin views to carry out these types of debugging tasks? Feel free to message me via Twitter or something to let me know.

I hope this wasn’t too boring and pointless and you can sleep easy tonight knowing that when that call comes you have the data to hand.

Have a great day!

Event History Audit

You’re sitting at your desk and you get an angry phone call: “Someone has deleted my workbook! Who was it?!”

A call most of us have had at some point in our Tableau career, I am sure.

Let’s go one worse, a call from the legal department: “We need to see what ‘Joe Bloggs’ has been doing on Tableau Server – Now!” – It has now become a nightmare and it may be time sensitive as well.

Although these seem like straightforward questions, they are not always easy to answer unless you have built the functionality to do so. This is where your custom admin views come into their own with Event History.

Using Postgres we can interrogate the internal Tableau database to answer a number of questions. These questions are generally Who, What and When – but what particular events can we track?

Here are some common events that you can monitor:

Access:
  • Logout
  • Login
  • Download Workbook
  • Download Data Source
  • Access View
  • Access Data Source

Create:
  • Create Workbook Task
  • Create System User
  • Create Site User
  • Create Schedule
  • Create Project
  • Create Group
  • Create Data Source Task
  • Add Comment

Delete:
  • Delete Workbook Task
  • Delete Workbook
  • Delete View
  • Delete System User
  • Delete Site User
  • Delete Schedule
  • Delete Project
  • Delete Group
  • Delete Data Source Task
  • Delete Data Source
  • Delete Comment

Publish:
  • Publish Workbook
  • Publish View
  • Publish Data Source

Send E-Mail:
  • Send Subscription E-Mail For Workbook
  • Send Subscription E-Mail For View

Update:
  • Update System User Full Name
  • Update System User Email
  • Update Site
  • Update Schedule
  • Update Project
  • Update Data Source Task
  • Update Data Source
  • Update Comment

Replace:
  • Replace Data Source Extract

Refresh:
  • Refresh Workbook Extract
  • Refresh Data Source Extract

Move:
  • Move Workbook To
  • Move Workbook Task to Schedule
  • Move Workbook Task from Schedule
  • Move Workbook From
  • Move Datasource To
  • Move Datasource From
  • Move Data Source Task to Schedule
  • Move Data Source Task from Schedule

Increment:
  • Increment Workbook Extract
  • Increment Data Source Extract

Enable:
  • Enable Schedule

Disable:
  • Disable Schedule

Change:
  • Change Workbook Ownership From
  • Change Workbook Ownership To
  • Change Datasource Ownership To
  • Change Datasource Ownership From

Append:
  • Append to Data Source Extract

Something to consider here is that the ‘login’ and ‘logout’ events will not be accurate when you have single sign-on, etc. You can use bootstrap session information to track this accurately instead.

You can also monitor other things with API calls, etc., but for now let’s focus on these common questions.

We are going to work on the premise that you already have some working knowledge of accessing your Postgres database, so I am not going to cover opening that up in this blog post.

When you’re connected to Postgres, you will want to connect to the following tables using these joins:

[Image: event history table joins]

You will see I have included the joins for the tables. The bottom left box shows an example of the joins required to obtain the relevant event information. It is good to note the relationship between ‘Actor’ and ‘Target’: these refer to the person who is carrying out the event (Actor) and the person it affects (Target). Target is only populated for events like creating and deleting users, not the owner of the content.

You can also join the main views to the ‘hist_’ tables to get extra information on workbooks, etc., but this will increase the volume of data, so I do not bring that back for my dashboard in order to keep performance reasonable.

You can also pick up failing extracts from this if you bring back the “Details” column from the Historical_Events table. The “Details” column is not always populated but is important, so don’t be tempted to leave it out due to it being a wide column.
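
To make this concrete, below is a minimal sketch of the kind of query I mean – the table and column names (historical_events, historical_event_types, hist_users and their keys) are as they appear in my readonly schema, so verify them against your own server version:

-- who did what, to whom, and when, plus the all-important Details column
SELECT het.action_type,
       het.name      AS event_type,
       he.created_at AS event_timestamp,
       actor.name    AS actor,
       target.name   AS target,
       he.details
FROM historical_events he
JOIN historical_event_types het ON het.type_id = he.historical_event_type_id
LEFT JOIN hist_users actor  ON actor.id  = he.hist_actor_user_id
LEFT JOIN hist_users target ON target.id = he.hist_target_user_id;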

Once you have this as a data source you can then create a dashboard that looks at it. This is down to your particular requirements, but I have a list of events, a timeline and a dreaded crosstab list (for event details – and yes, it is important, so I control it with filter actions). Filters are important here as well. I have a free text search parameter that flicks between names, details, etc., and filters for the Event Timestamp, Event Type and Action Type. Filtering will allow you to quickly search for people and content.

Example part of an Event History Dashboard:

[Image: Event History dashboard example]

That is all I have time for at the moment. I hope this goes some way to answer these questions for you. I will do some more posts on this moving forward as time allows. In the meantime I will try to answer your questions if you have any. A special shout out to the CAP admins who requested this info.

Thanks!

Jake

Splunk your Tableau

If you are a Tableau Server admin then I am sure you know what I mean by the “great log search”: the process by which you go wading through the millions of folders, files, data types and rows to identify that error and take appropriate action on it.

Wouldn’t it be great if you only had to go to one place and type a username, IP address or other keyword to identify that error? Better yet, what about having a dashboard that shows you what errors have happened and when, and being able to actually use all of the log data for some proper analysis? That’s not possible, right? … Well, actually it is, and it is easier to set up than you think. The answer is Splunk.

What is Splunk?

  Splunk is actually another analysis tool; however, the purpose of this post is not to analyse its visualisation capabilities, but to show that leveraging its indexing and database engine together with Tableau makes for a formidable admin tool.


Once you have indexed your logs, Splunk will continue to read them at a set sample rate, picking up any changes from the server logs and adding them to the index. The data is then available for you to “enrich”, which basically means identifying columns in your data based on a sample set. After you have done this you can generate a “report” (which, for our purposes, is an ODBC data source). Connect Tableau Desktop to your Splunk server using the Splunk drivers and you will then have a live feed from your logs, available to analyse.

The Implementation

So, I have skimmed over the basics so far and a few of you may be wondering what steps need to be taken to implement this yourself. I have outlined them below. There will be other posts soon which show greater detail in searches, etc…

Here are some prerequisites:

  • Tableau Server (and admin rights)
  • Tableau Desktop
  • Splunk Enterprise
  • Splunk Forwarder
  • Splunk ODBC connector

Assuming Tableau Server is set up, is running and that you have a copy of Tableau Desktop I will continue.

1. Install Splunk Forwarder. 

To start using Splunk you will need to install the forwarder onto your Tableau Server. This can be found on the Splunk website, and the instructions on how to do this can be found here: Install Forwarder. You will need to enter the name of your Splunk server and select the file location of the logs for Tableau Server; these reside in the ‘data’ directory of Tableau Server (see my other posts for details). I would also select the performance monitoring data, which is an option in the install wizard, as this means you can turn off the performance monitoring that I blogged about previously. This is fairly simple, although you may need the assistance of your Splunk administrator to set up your Splunk index if you do not have access to do so yourself. The Splunk index is the location that your logs will be recorded to on the Splunk server; a good name for your index is “Tableau”. I am not a Splunk administrator so I had help with this bit. When my Splunk administrator confirmed the index was set up, I completed the install.
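
For illustration, the monitoring stanza the forwarder ends up with looks something like the following in its inputs.conf – the log path here is from my environment and the sourcetype name is my own, so substitute your actual ‘data’ directory and naming conventions:

[monitor://C:\ProgramData\Tableau\Tableau Server\data\tabsvc\logs]
# send everything under the Tableau logs directory to the "tableau" index
index = tableau
sourcetype = tableau_server
disabled = 0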

2. Search & Reports.

Once your forwarder is picking up data you will need to go to the Splunk search in your browser. This is the URL for your Splunk server. You will be initially given a search window. This window is the driver for all Splunk queries. It will allow you to search through your data. It will also allow you to create reports in the “Save As” option.

This is a complicated bit and is the basis of all of your data sources, so time needs to be taken to make sure it is set up right. In the search window, start by entering “index=tableau” (assuming your index is called Tableau). This will start returning your data. If you have set this up on multiple servers or still can’t find it, enter “host=[your tableau server]”. This should sort out any issues with index names, etc. As a start, save these results as your first report (e.g. “Tableau – All”) and we will proceed.
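
To give a flavour of where the search can go next, here are a couple of example searches – the host name is a placeholder and the fields are Splunk defaults:

index=tableau host=mytableauserver (error OR warn OR fatal)
index=tableau (error OR fatal) | stats count by host, source

The first narrows the index down to the interesting events on one server; the second gives a quick count of errors by server and log file, which makes a handy sanity check before you build anything in Tableau.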

3. Setup the Client connection. 

Assuming that my lack of detail in step 1 hasn’t stumped you too much, you will need to install the ODBC connector on your server (so you can publish your data source) and on your desktop (so you can build your new Splunk data source). This install will need the name of your server, the port (8089) and your user credentials for Splunk. Again, it is a wizard, so it should be fairly straightforward.

4. Connect to your new Splunk data. 

Once your ODBC connector is set up you will be good to go. Open Tableau Desktop and create a new data source. In your server list will be Splunk; select this and continue. Enter your server details as you would for a normal server: server name, port (8089), user and password. You will now be able to see the reports. If you have done step 2 then you should find a report called “Tableau – All”. This will contain all of the data from the index and can be used as a table-style data source. There can be issues with the raw event data not returning; in this event, create a new column in your search dialog box. I will do a follow-up on this soon to go into more detail.

The results 

So you have created a datasource… It’s now time for your creative Tableau side to shine. Create a few extra fields in Tableau that use the “contains” function and look for “error”, “warn” or “fatal” within an IF statement. These will be of interest to you as an admin.
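
As a sketch, such a field might look like the one below. I am assuming your Splunk report exposes the raw event text as a field – Splunk calls it _raw by default – so adjust the field name to whatever your report returns:

// Hypothetical "Log Severity" calculated field
IF CONTAINS(LOWER([_raw]), "fatal") THEN "Fatal"
ELSEIF CONTAINS(LOWER([_raw]), "error") THEN "Error"
ELSEIF CONTAINS(LOWER([_raw]), "warn") THEN "Warning"
ELSE "Info"
END

Drop that on colour in your views and the problem rows jump straight out at you.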

I will post some examples of Splunk searches and some dashboards as and when I can get sign off to do so, so in the meantime, have fun!

Implementing Kerberos for Tableau

If you’re not looking to implement Kerberos for Tableau Server and you’re happy with NTLM for Active Directory then you’re probably not going to read this post to the end, but in the event that you are interested, here are a few little questions and answers which may help your implementation. Kerberos is available in Tableau Server 8.3 onwards.

Why Kerberos?

Kerberos, named after the three-headed guard dog of Greek mythology, is a method of authentication for Active Directory. It is generally accepted as a more secure method of authentication because of its encrypted ticketing process, which makes it harder to impersonate users. Because of this extra security it is sometimes an IT requirement when dealing with sensitive data.

Domain Pre-check 

Your domain (for what we are interested in) is the network on which your server resides; however, it may not be the URL used to access your servers. Your Tableau server may be accessed from https://mytableauserver.mycompany.com but it may actually be located on domain mysubdomain.net, with DNS routing traffic via an alias. This means that your server is actually mytableauserver.mysubdomain.net – this is also known as your FQDN or Fully Qualified Domain Name, and it needs to be noted for security (not only for Kerberos but for certain SSL applications). If this is the case you could face issues when setting up Kerberos, so my suggestion is to test this first. It takes a moment and can save a lot of troubleshooting later.

Reverse Lookups

If you are not sure of your FQDN then there is a simple test that can be run from your desktop computer: a reverse lookup. This is a simple process that can usually be run without any special permissions.

To run this test, open Command Prompt and run the following (changing the server name):

nslookup mytableauserver.mycompany.com

Find the non-authoritative IP address (usually the second part of the response) and then run another lookup on that IP address to get the server name.

nslookup 16.132.168.15

Again, the non-authoritative response is what you want and will contain the FQDN with the actual domain your server is sitting on.

In a perfect world, you have just done that and the result of the reverse lookup confirms the FQDN you already knew and have been using all along (it’s alright for some!).

If you have just discovered that your server is actually sitting on another domain, you will need to contact your DNS support and have a change made to your Infoblox entry to match the assumed (original) FQDN, making sure that the domains have a full trust implemented. If they don’t, you may find yourself raising a Tableau case when you try to implement Kerberos.

Assuming everything has gone well thus far, it’s onto the next (and documented) steps…

Opening your Tableau Configuration window on the server, you will find a tab called “Kerberos”. This is going to be what generates the main scripts for your Kerberos implementation.

After ticking the “Enable Kerberos” box you can then select the button to generate the Kerberos batch file. This file contains the commands to set up the Kerberos handshake. You will now need to employ the help of your Active Directory admin as there are some things not even a Tableau admin can do!

The generated batch file will contain:

  • Your service account password parameter
  • Your keytab output location 
  • Relevant set SPN commands 
  • Your ktpass command

So what is an SPN? 

An SPN is a Service Principal Name. These are the URLs from which you get your incoming Tableau traffic, e.g. mytableauserver.mycompany.com.

An example of a command is:

setspn -s HTTP/mytableauserver.mycompany.com mycompany\my_service_acc

You may wish to add more SPNs based on your environment. You may have particular DNS settings directing traffic in the event of an outage, or something similar; adding them now will mean you don’t need to make any future changes. The addition of SPNs has to be done by an Active Directory admin.

Ktpass and keytab

Finally, the ktpass command. You can pass this to your Active Directory admin and they can run this for you as part of the batch file or you can run this yourself if you have passed the setspn commands separately. You only need to be an Active Directory user to run this command so if you want to keep your service account password top secret, this may be the best option.

If you are running this yourself, you will want to replace the password parameter with your actual password and set the output location to somewhere you can access. My suggestion is to create a folder in the ‘Tableau Server’ directory called “Kerberos” to contain the file.
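
For reference, the ktpass command takes roughly this shape – a sketch using the same placeholder server and service account as the setspn example above, so treat your generated batch file as the authority:

ktpass /princ HTTP/mytableauserver.mycompany.com@MYCOMPANY.COM /ptype KRB5_NT_PRINCIPAL /crypto ALL /mapuser mycompany\my_service_acc /pass [password] /out "C:\Program Files\Tableau Server\Kerberos\kerberos.keytab"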

When you run the command you may receive a warning relating to the service account not being mapped, this warning can be ignored as the keytab file will still be created.

Testing

Once you have set the SPNs and generated your keytab file, you can go back to the configuration window in Tableau, select the keytab location and select the test button. You should get an “Ok” message; take heed of any other messages, but you should not need to set up any delegations unless your domain admin states otherwise. If you need to add delegation, see the Tableau KB article http://kb.tableau.com/articles/knowledgebase/kerberos-delegation-sql

Once you have started your Tableau service you can then test Kerberos by accessing Tableau Server from both your browser and Desktop. It is important to test both, as Tableau Desktop can sometimes revert to NTLM in the event that there is an underlying issue, and this may not be reported clearly so it can go undetected. The behaviour should be that no credentials are needed when logging into Tableau Server via either method. You can also look at the httpd error logs to identify errors or warnings relating to the Kerberos authentication.

If the expected behaviour is seen and no errors are returned in the logs then it should be safe to assume that Kerberos is active for Tableau Server. For confirmation you can use klist commands to display your personal Kerberos tickets, and you can ask your Active Directory admin to check for issued tickets.
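
If you have not used klist before, it is run from Command Prompt: on its own it lists the tickets held by your current session, and with the purge argument it clears them so fresh tickets are requested at your next login – handy when testing.

klist
klist purge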

Tableau Win32 Error

Ok, so 1 of 3 things has happened for you to be reading this blog post (or you just like to read my ramblings!).

1. Your auto licensing check has failed and your Tableau Server is now unlicensed (yet you know you have a license).

2. You’re uninstalling Tableau Server and you have received an error stating that not all components have been uninstalled.

3. You’re installing/reinstalling Tableau Server and it will not initialise (giving you a Win32 error).

This is a frustrating issue that can cause outages for your service and a lot of headaches whilst searching for the cause. All is not lost though: there is a cause and, I believe, a solution. As with most of my posts, this hasn’t been verified by Tableau and is just my own opinion…

Cause

Tableau has a process where it checks your license validity between your cold storage (on your server) and Tableau’s online service. This check runs every 15 minutes and is entirely autonomous (usually). If something interrupts this process it can cause a file called “program” to become locked. This file is located in the root directory of your Tableau Server install (e.g. C:\Program Files\Tableau Server\) and contains the handshake for the license check. A cause (I have found) for this issue is an admin remaining logged in (even in a disconnected state) and using tabadmin whilst the service runs a restart under a different account.

Solution

If you have experienced this issue, you can delete the “program” file and restart the Tableau services (or reinstall, if that is the process you are undertaking) to resolve the immediate problem; however, there is still a root cause that needs to be addressed. The solution is to make sure your maintenance windows are well defined and that admins are completely logged off of the server, not just in a disconnected state. It can be all too easy for someone to close their RDP session instead of selecting log off, so it needs to be ingrained into your normal administration processes.
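
As a rough sketch, the immediate fix boils down to the following from an elevated Command Prompt on the server – the path assumes a default install, and tabadmin is run from your Tableau Server bin directory:

rem stop the service, remove the locked handshake file, then restart
tabadmin stop
del "C:\Program Files\Tableau Server\program"
tabadmin start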

Tableau Timeouts and the V9 change

Ok, it has been another long time since a post so I thought I would ramble on about what today brought. You may have also noticed that I have moved to WordPress. This is because Webr who originally hosted my blog decided to up their fees (thanks guys).

Back to the Tableau work at hand…

Whilst testing the implementation plan for our Tableau Server V9 rollout I was configuring our custom settings. One of these settings is the Apache timeout value, which (for us) was increased to cope with the network latency for our overseas users. This required a small change to the HTTPD template file (\\Tableau Server\[version]\Templates\httpd.conf.templ) by adding a keepalive limit above 5 seconds (which is the default).
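
For anyone wanting to replicate this, the change amounts to a couple of Apache directives along these lines in the template file – 10 is the value we settled on, via the testing described next:

# keep connections open for slow overseas clients (the Apache default is 5)
KeepAlive On
KeepAliveTimeout 10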

Without going into too much detail: after some careful analysis of the number of timeouts (during which I increased the timeout value by one second per working day), 10 seconds was selected as the best value to use, as it saw the best drop in timeouts without increasing memory use too much (be aware that increasing it too far will harm your service). If you think you are experiencing timeouts then you can find out by looking at the error.log in the HTTPD folder within the logs – that’s my good deed done for today!

…This implementation plan test went well but I noticed a small change that I didn’t really expect… Tableau has (after all of their quirky little messages in the template file) voluntarily upped the timeout value to 8 seconds. This is odd as Tableau (as far as I know) has employed a different technique to stop timeouts by allowing sessions to go into an idle state. Was this part of the plan or was this accidentally left in there, who knows!?

Either way, after some deliberation it was decided to keep the 10-second timeout (not the new default of 8), even though Tableau has had a number of performance increases, as the value represented our network performance – and if it’s not broken, don’t fix it!

Tonight I will do a little research into the timeouts as I am not one to leave a question unanswered…

I hope everyone else had a fun-filled Tableau day.

Monitoring Tableau Server

I wanted to post something on monitoring without stepping on toes, so I thought I would go over system performance monitoring. This is a really good way of getting valuable information on your server’s actual performance and Tableau’s consumption of resources.

Apologies if the steps aren’t extremely clear – they are meant to be guidelines, so you may need to look before you click 🙂

If you manage a Tableau instance then I am sure you are familiar with the admin views. These are great, but they don’t contain all of the information that you may need. You can go to the PostgreSQL DB and pull data from there, but there is a wealth of information to be obtained elsewhere…

Now, if you are a windows server admin then I am probably teaching you to suck eggs and you can probably skip this blog post… But if you aren’t, continue to read and I hope you learn something that you can use to benefit your Tableau service.

The Performance Monitor

In the Performance Monitor you can set data collectors to output to flat (.csv) files based on what you would like to monitor on your server. This is a very small system overhead for very valuable information. These collectors are used for general Windows reporting, but they also contain Tableau-relevant information – information that you and I need to manage a service properly. To get at this information you will need to follow the roughly described steps below.

To create your “Tableau” data collector (in windows server), go to:
Start>Programs>Administrative Tools and select Performance Monitor.

On the left of the window you will need to expand “Data collector sets” and right click on “user defined”. Then go to New>Data Collector Set.
Call your data collector set something meaningful… TabPerformance is always a good one.
You will want to manually select your options, so ensure “create manually (advanced)” is selected. Click next to continue.

When you are on the “what type of data do you want to include?” page, select “Performance Counter” as it will contain the best performance monitoring information. You can also collect event information which can be good if you want to create monitoring for errors.. But for this example we aren’t interested. Click next to continue.

Add your performance counters. This is where you choose what to monitor: it may be CPU, memory, I/O, space… the list goes on. In this example we are going to monitor the Tableau-specific processes to check on their memory.

Select “Add” and then scroll down the counters list to “Process”. Expand this item and select “Private Bytes”.

Below this window you will see that process names appear. We are only interested in the main Tableau processes, so select the following via the search:
VizQLServer, Backgrounder, Dataserver, TDEServer64, Tabprotosrv, WGServer, Postgres, Tableau, Tabrepo and Tabsvcmonitor. These should give you a really good indication of what is doing what in the Tableau world.
Once these are added, select “OK” and continue. You can select your sample rate, which you may want to set to every 30 seconds, but it is up to you.
Click next and select the root directory. Place it in a drive that you can access as a data source with Desktop. If you have Desktop installed on the server it will make things easier; if not, try a shared drive. You may want to create a folder specifically for your collectors. Select next to continue and then finish without starting… We have some more tinkering to do before we start monitoring…

You should now have a new collector under the “User Defined” collectors. Right click your lovely new collector and go to “Properties”.
Go to Schedule and add a schedule that runs every day at a specified time; before your working day is always a good option. You do not want to have an expiry, so don’t click that!

Go to the Stop Condition tab and set a maximum size limit; 50–100 MB is best in my opinion, as it will capture a few days’ worth of data before it overwrites. Allow the collector to restart when it hits this limit – we do not want it continually consuming disk space.

After enabling these, hit “OK” and then select the file for the collector (shown in the large window on the right when your collector is selected).

Right click on the file and then go to Properties again. In here you will be able to see everything you are collecting. Change the Log Format to Comma Separated and then change the log name to the name you want your data source to be. Allow “overwrite” and “circular” in the File tab before applying and closing via the “OK” button.

You are now ready to start your monitoring… Right click on your collector in the left menu and select “Start”.

This will now start pulling data into your log file for the memory consumption of Tableau processes.
From here you will now want to create a datasource in Tableau Desktop, using this log file (as a live connection) and then publish it to your server.
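
If you prefer the command line to clicking through the wizard, the same collector can be sketched with logman – the counter paths, interval and output location below are illustrative, so swap in the processes and drive you chose above:

rem create a counter collector sampling Private Bytes every 30 seconds to CSV
logman create counter TabPerformance -c "\Process(vizqlserver)\Private Bytes" "\Process(backgrounder)\Private Bytes" -si 30 -f csv -o "D:\PerfLogs\TabPerformance"
logman start TabPerformance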

Have fun with your new Tableau monitoring…

Tableau Public in your business

If you know of Tableau then you will probably know of Tableau Public – the free, online community where you can create your great visualisations and share them with the masses.

What you may not be aware of is the implication for Tableau Server users within your company.
Tableau Public can be published to from the same Tableau Desktop client application that your users are publishing from, and it is surprisingly easy for a green developer to make an uninformed mistake.

By going to ‘Server’ > ‘Tableau Public’> ‘Save to the web’, you have fundamentally just breached every data rule for your company by publishing potentially confidential data online.

I know what a lot of you will be thinking “there is no way someone would do that!”… And you’re probably right that it is a remote chance that the user has created or already has a public account and hasn’t noticed the “public” logo when signing in… But there is a risk – and risks need to be addressed.

Looking online, there is one solution provided by Tableau where you can request to change your keys to versions that cannot be used to publish to Tableau Public. This is great, but is not always feasible for a larger organisation that has a number of keys already in use.

An alternative is to ban the Tableau Public site completely… but then you lose all of the potential of having that material available for your users to get ideas from.

After looking at the problem for a bit I found that simply banning “public.tableausoftware.com/manual” means that your users can still consume the content on Tableau Public, but they will not be able to log in via Tableau Desktop within your company. This solution was found by looking at Wireshark captures of HTTP traffic whilst authenticating via Desktop and a web browser: where the web uses “.com/auth”, Desktop authentication follows a slightly different route of having “.com/manual”.
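
How you implement the ban depends on your proxy product. As a sketch, on a Squid proxy the rule might look like the lines below (the ACL name is my own):

# block Desktop publishing to Tableau Public while leaving the site browsable
acl tableau_public_publish url_regex public\.tableausoftware\.com/manual
http_access deny tableau_public_publish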

This has been tested for 8.1 and 8.2.

Support your Tableau users

When you are providing a managed service within Tableau, the implementation of support for your users is an important consideration and potentially a costly one! You may choose to send all of your users off for training and foot the bill, or you may train internally by means of something like Tableau Dr sessions and internal training sessions.
The way I see it, you are offering a self-service, on-the-fly tool to empower your business, and the support should mirror that. No matter how good your relationship is with your preferred trainer, they won’t be able to organise a course as and when a user has a question. However, formalised training is still important, so those who have more BI-developer-focused roles and will be using the tool as a core part of their working day should be considered for it.

In my experience, the implementation of Tableau Dr sessions fills the gap quite nicely between users who are starting out on their journey with Tableau and the more ‘hard core’ Tableau’ee, whilst keeping your team engaged with user content. Remember, these sessions will actually help keep your support team informed of the current events on your servers and will ultimately allow you to manage the server more carefully – beyond admin views and server statistics.

What is a Tableau Dr Session?

The format of these sessions (for me) is 15-30 minute slots (on average) where an internal Tableau expert from your team can answer a user’s development questions or provide insight into Tableau best practices. The best practice sessions can also be pro-active and initiated by the ‘Tableau Dr’, reaching out to users with poorly designed and slow-performing workbooks and data sources.
I recommend booking time (probably an hour a day) into your diary as your time slot for Tableau Dr sessions as users will want more and more time when they realise that there is someone to help.

Interestingly, you can also find that users will be more adventurous, knowing that there is someone there to help, which ultimately leads to better visualisations.

The top 5 questions I get asked when I run a session:

  1. Why is my dashboard taking so long to load?
  2. I can do this in Excel, why can’t I do it in Tableau?
  3. How do I blend data sources?
  4. How can I get it so that certain users can only see certain things?
  5. How do I do year on year or month on month?

Keep note of all of your sessions and create your own knowledge base. This information will help you see user progression and will be a good source of information for the team going forward.

I would like to hear about your Tableau support, so if you wanted to message me, I can be found on Twitter.

Thanks!

In the beginning..

This is the start of my Tableau and BI blog. I have been in the BI industry for over 8 years, working for the MoD, various public services, and the insurance, media and banking sectors, along with other interesting industries.

I have used lots of different BI visualisation tools in my time, including Cognos, SSRS, Microstrategy, QlikView, Omniscope, Business Objects, etc., meaning I had a good idea of what I liked and didn’t like.

I have used Tableau exclusively for the past two years, as I felt a good connection to the GUI and the architecture as a whole. Leveraging data from a number of different source systems, including MS SQL (by far my favourite, as I was previously an MS SQL DW developer), DB2, Oracle, Hadoop and some Teradata, I have produced many suites of visualisations.

When I first started (many moons ago) I was, as some bright spark put it, a “spreadsheet monkey”, as a core part of my toolkit was a mixture of Excel and MS Access reporting. This allowed me to easily meet my customers’ requirements thanks to its flexibility, but the visual element and performance were lacking. This then evolved as I turned my hand to the other tools.

I now consult on Tableau-related projects across Europe as an architect.
In the course of my blogs I will add links to my Tableau Public efforts and things I find interesting relating to the subject.