Apache Kafka is an open-source stream-processing platform developed by the Apache Software Foundation and written in Scala and Java. The Kafka event streaming platform is used by thousands of companies for high-performance data pipelines, streaming analytics, data integration, and mission-critical applications.
Score: N/A
Pricing: N/A

Tableau Server
Score: 7.6 out of 10
Tableau Server allows Tableau Desktop users to publish dashboards to a central server to be shared across their organizations. The product is designed to facilitate collaboration across the organization. It can be deployed on a server in the data center, or it can be deployed on a public cloud.
Pricing: $12 Per User Per Month
Pricing
Editions & Modules
Apache Kafka: No answers on this topic
Tableau Server:
Viewer: $12.00 Per User Per Month
Explorer: $35.00 Per User Per Month
Creator: $70.00 Per User Per Month
Pricing Offerings
Free Trial: Apache Kafka No; Tableau Server Yes
Free/Freemium Version: Apache Kafka No; Tableau Server No
Premium Consulting/Integration Services: Apache Kafka No; Tableau Server Yes
Entry-level Setup Fee: Apache Kafka No setup fee; Tableau Server No setup fee
Additional Details: none provided for either product
Features

BI Standard Reporting
Comparison of BI Standard Reporting features of Apache Kafka and Tableau Server
Apache Kafka: not rated
Tableau Server: 8.4 (95 Ratings), 3% above category average
Pixel Perfect reports: Apache Kafka 0 Ratings; Tableau Server 9.1 (29 Ratings)
Customizable dashboards: Apache Kafka 0 Ratings; Tableau Server 7.1 (94 Ratings)
Report Formatting Templates: Apache Kafka 0 Ratings; Tableau Server 9.0 (81 Ratings)

Ad-hoc Reporting
Comparison of Ad-hoc Reporting features of Apache Kafka and Tableau Server
Apache Kafka: not rated
Tableau Server: 7.8 (95 Ratings), 3% below category average
Drill-down analysis: Apache Kafka 0 Ratings; Tableau Server 8.0 (95 Ratings)
Formatting capabilities: Apache Kafka 0 Ratings; Tableau Server 8.0 (93 Ratings)
Integration with R or other statistical packages: Apache Kafka 0 Ratings; Tableau Server 8.0 (59 Ratings)
Report sharing and collaboration: Apache Kafka 0 Ratings; Tableau Server 7.1 (89 Ratings)

Report Output and Scheduling
Comparison of Report Output and Scheduling features of Apache Kafka and Tableau Server
Apache Kafka: not rated
Tableau Server: 7.3 (91 Ratings), 13% below category average
Publish to Web: Apache Kafka 0 Ratings; Tableau Server 8.1 (85 Ratings)
Publish to PDF: Apache Kafka 0 Ratings; Tableau Server 7.1 (84 Ratings)
Report Versioning: Apache Kafka 0 Ratings; Tableau Server 8.0 (70 Ratings)
Report Delivery Scheduling: Apache Kafka 0 Ratings; Tableau Server 8.0 (77 Ratings)
Delivery to Remote Servers: Apache Kafka 0 Ratings; Tableau Server 5.1 (9 Ratings)
Apache Kafka is well-suited for most data-streaming use cases. Unless you have a specific use case that calls for a cloud PaaS such as Amazon Kinesis or Azure Event Hubs (for example, for your data lakes), Apache Kafka, once set up well, will take care of everything else in the background. Azure Event Hubs is good for cross-cloud use cases; with Amazon Kinesis I have no real-world experience, but I believe it is much the same.
Whole-funnel and specific-channel performance, from upper- to lower-funnel metrics. The ability to view full channel performance over a period of time, such as weekly, monthly, or quarterly, has truly been monumental in how my team optimizes specific channels and campaigns. Daily performance tracking is a bit overwhelming, with long load times and having to refresh specific live views; it can be challenging at times, as extensive dashboards take much longer to load.
Really easy to configure. I've used other message brokers such as RabbitMQ, and compared to them, Kafka's configurations are very easy to understand and tweak.
Very scalable: easily configured to run on multiple nodes, allowing for ease of parallelism (assuming your queues/topics don't have to be consumed in the exact same order the messages were delivered); see the consumer sketch after this list.
Not exactly a feature, but I trust Kafka will be around for at least another decade, because active development has remained strong and there is plenty of financial backing from Confluent, LinkedIn, and probably many of the other companies using it (which, anecdotally, is a lot).
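To make the parallelism point above concrete, here is a minimal sketch of a Java consumer that joins a consumer group; running several copies of it lets Kafka split a topic's partitions across them. The broker address localhost:9092, the group id example-group, and the topic name events are assumed placeholders, not details from the review.

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.time.Duration;
import java.util.List;
import java.util.Properties;

public class GroupedConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed local broker
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "example-group");           // consumers with the same group id share the partitions
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");       // start from the beginning when no offset is stored

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("events")); // hypothetical topic
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // ordering is guaranteed only within a partition, matching the caveat above
                    System.out.printf("partition=%d offset=%d value=%s%n",
                            record.partition(), record.offset(), record.value());
                }
            }
        }
    }
}

Each additional instance started with the same group id picks up a share of the partitions, which is what makes the multi-node scaling described above straightforward.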
It's good at doing what it is designed for: accessing visualizations without having to download and open a workbook in Tableau Desktop. The latter would be a very inefficient method for sharing our metrics, so I am glad that we have Tableau Server to serve this function.
Publishing to Tableau Server is quick and easy. Just a few clicks from Tableau Desktop and a few seconds of publishing through an average speed network, and the new visualizations are live!
Seeing details on who has viewed a visualization and when. This is particularly useful to me for trying to drive adoption of some new pages, so I really appreciate the granularity provided in Tableau Server.
Sometimes it becomes difficult to monitor our Kafka deployments. We've been able to overcome this largely by using AWS MSK, a managed service for Apache Kafka, but a separate monitoring dashboard would have been great.
Simplify the process for local deployment of Kafka and provide a user interface to get visibility into the different topics and the messages being processed.
The learning curve around creating brokers and topics could be simplified (a sketch of scripting topic creation follows this list).
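On the topic-creation learning curve mentioned above, one way to reduce it is to script topic creation with Kafka's AdminClient instead of the shell tools. This is a rough sketch that assumes a local broker at localhost:9092, a hypothetical topic named events, and placeholder partition and replication settings.

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

import java.util.List;
import java.util.Properties;

public class CreateTopicExample {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed local broker

        try (AdminClient admin = AdminClient.create(props)) {
            // 6 partitions, replication factor 1: placeholder values suited to a single-node local setup
            NewTopic topic = new NewTopic("events", 6, (short) 1);
            admin.createTopics(List.of(topic)).all().get(); // waits until the broker confirms creation
        }
    }
}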
Tableau Server has had some issues handling some of our larger data sets. Our extract refreshes fail intermittently with no obvious error that we can fix.
Tableau Server was hard to work with before they launched their new REST API, which is itself a little tricky to work with.
It simply is used all the time by more and more people. Migrating to something else would involve lots of work and lots of training. Since the renewal fee is fair, it simply isn't worth migrating to a different tool for now.
Apache Kafka is highly recommended for developing loosely coupled, real-time processing applications. Apache Kafka also provides property-based configuration: the producer, consumer, and broker each have their own separate property file (a minimal sketch follows).
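As a minimal sketch of that property-based configuration (the broker address, topic, key, and value below are assumed placeholders rather than details from the review), a producer is built from its own Properties object, kept separate from the consumer and broker settings:

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class PropertyConfiguredProducer {
    public static void main(String[] args) {
        // Producer-only settings; the consumer and broker each keep their own property files
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed local broker
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.ACKS_CONFIG, "all"); // wait for all in-sync replicas to acknowledge

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("events", "order-42", "created")); // hypothetical topic, key, value
            producer.flush();
        }
    }
}

The same settings could equally be loaded from a standalone producer.properties file, which is the separation the review describes.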
Tableau Server takes training and experience in order to unlock the application's full potential. This is best handled by a qualified data scientist or data analytics manager. Tableau user interface layout, nomenclature, and command structure take time and training to become proficient with. Integration and connectivity require proper IT developer support.
Our instance of Tableau Server was hosted on premises (I believe all instances are), so if there were any outages it was normally due to scheduled maintenance on our end. If the Tableau Server ever went down, a quick restart solved most issues.
While there are definitely cases where a user can do things that will make a particular worksheet or dashboard run slowly, overall the performance is extremely fast. The user experience of exploratory analysis particularly shines; there's nothing out there with the polish of Tableau.
Paid support for Apache Kafka is available from Confluent, which includes the same team that created Kafka at LinkedIn, so they know this software inside and out. Moreover, Apache Kafka is well known, and best-practice documents and deployment scenarios are easily available for download, for example from eBay, LinkedIn, Uber, and the NYTimes.
We have consistently had highly satisfactory results every time we've reached out for help. Our contractor, used for Tableau Server maintenance and dashboard development, is very technically skilled. When he hits a roadblock on how to do something with Tableau, the support staff have provided timely and useful guidance. He frequently compares it to Cognos and says that while Cognos has capabilities Tableau doesn't, the bottom-line value for us is a no-brainer.
In our case, they hired a private third-party consultant to train our department. It was extremely boring and felt like it dragged on; everything I learned was self-taught, so I was not really paying attention. But I do think that you could easily spend a week on the tool and go over every nook and cranny. We only had the consultant in for a day or two.
The Tableau website is full of videos that you can follow at your own pace. As a very small company with a Tableau install, access to these free resources was incredibly useful in allowing me to implement Tableau to its potential in a reasonable and proportionate manner.
Implementation was over the phone with the vendor and did not go particularly well. Again, I think this was our fault, as our integration and IT oversight was poor and we made errors. Would those errors have happened had the vendor been on site? Not sure, probably not, but we probably wouldn't have paid for that either.
I used other messaging/queue solutions that are a lot more basic than Confluent Kafka, as well as another solution that is no longer on the market called Xively, which was bought and "buried" by Google. In comparison, these solutions offer far fewer functionalities and address different needs.
Today, if my shop is largely Microsoft-centric, I would be hard pressed to choose a product other than Power BI. Tableau was the visualization leader for years, but Microsoft has caught up with them in many areas, and surpassed them in some. Its ability to source, transform, and model data is superior to Tableau. Tableau still has the lead in some visualizations, but Power BI's rise is evidenced by its ever-increasing position in the leadership section of the Gartner Magic Quadrant.
Positive: we got a quick and reliable pub/sub model implemented, and data flows easily across components.
Positive: it's scalable, so we can develop small and scale up for real-world scenarios.
Negative: it's easy to get into a confusing situation if you are not experienced yet or something strange has happened (rare, but it does happen). Troubleshooting such situations can take time and effort.
Tableau does take dedicated FTE to create and analyze the data. It's too complex (and powerful) a product not to have someone dedicated to developing with it.
There is some significant setup for the server product.
Once server setup is complete, it's largely "fire and forget" until an update is necessary. The server update process is cumbersome.