JMP® is statistical analysis software whose capabilities span from data access to advanced statistical techniques, with one-click sharing of results. The software is interactive, visual, and statistically deep, letting users see and explore their data.
$1,320
per year per user
Tableau Desktop
Score 8.4 out of 10
N/A
Tableau Desktop is a data visualization product from Tableau. It connects to a variety of data sources, combining disparate sources without coding. It provides tools for discovering patterns and insights, performing calculations and forecasts, and producing statistical summaries and visual stories.
$1,380
per year (purchased via a Creator license)
Tableau Server
Score 7.6 out of 10
N/A
Tableau Server allows Tableau Desktop users to publish dashboards to a central server to be shared across their organizations. The product is designed to facilitate collaboration across the organization. It can be deployed on a server in the data center, or it can be deployed on a public cloud.
$12
per user per month
Pricing
JMP
Tableau Desktop
Tableau Server
Editions & Modules
JMP
$1,320
per year per user
Tableau Creator License
$115
per month (billed annually) per user
Viewer
$12.00
per user per month
Explorer
$35.00
per user per month
Creator
$70.00
per user per month
Offerings
Pricing Offerings
JMP
Tableau Desktop
Tableau Server
Free Trial
Yes
No
Yes
Free/Freemium Version
No
Yes
No
Premium Consulting/Integration Services
No
Yes
Yes
Entry-level Setup Fee
No setup fee
No setup fee
No setup fee
Additional Details
Bulk discounts available.
All pricing plans are billed annually. A Creator license includes Tableau Desktop, Tableau Prep Builder, and Tableau Pulse. Volume discounts are sometimes available.
Compared to other, similar programs, JMP is outstanding in ease of use and ability to be used by almost anyone across an organization. It is more fluid, user friendly, and, most importantly, requires no coding experience. The only two areas where it is not as good as …
For me, JMP is the best and easy way to run regressions. I wouldn't use it for other more advanced models. I decided to use it because we got it for free since we are technically an academic institution.
When we first looked at getting a visualization software for analytics we looked into two options Microsoft Power BI and Tableau Desktop, and even though Power BI is more cost-effective we decided to go with Tableau Desktop as it had more options that we are looking for such as …
Microsoft Power BI could potentially be a better fit for organizations on Office 365; it's a close call, though. Google Data Studio has potential but is still far behind Tableau on the "user-friendly" factor. Tableau still seems to dominate as the "recommended" analytics tool, …
It is very easy to use; we can create many kinds of charts with it, which I think other tools lack. There are lots of online communities that have provided solutions to the basic issues. Its ODS (output delivery system) is also very effective. We can use SQL in it for …
Tableau is a next-generation tool, whereas the other two are traditional BI tools. The other tools are very slow and difficult to use, and require a lot of technical expertise. Tableau's look and feel is much nicer than those two.
Against the usual incumbents within the pharmaceutical industry, Tableau has much better and faster access to database data especially stored in the Oracle database, without needing any interim transformations or data universe needing to be created. Also it has comparatively …
If any changes had to be made to existing visualizations when we used QlikView, a lot of security constraints existed and I had to run to the IT team for every change I had to implement.
Tableau gives easy security change rights to the developer environment.
I feel like Tableau is easier to use and offers a greater selection of visualizations. I feel that the dashboards are easier to put together and offer a great amount of flexibility for the end-user. Tableau has an excellent user support group. I find the community to be …
My current work environment uses Tableau Online, MicroStrategy, and SSRS in parallel. Tableau is much closer to SSRS as a visualization tool, whereas MicroStrategy is an enterprise data modeling and reporting tool.
Based on the use case we use different tools. Here …
I have used SSRS, Crystal Reports, Microsoft Excel, and Business Objects. Tableau offers more functionality than the rest and is pretty intuitive. I think SSRS is the easiest to use. Query speed is excellent with SSRS (at least when you are connected to SQL Server). Microsoft …
In comparison to Tableau, the other dashboarding/BI tools I've used feel clunky, are very slow to develop in, and seem to lack features of a more modernized tool like Tableau. In Pentaho Analyzer, for instance, trying to include multiple worksheets or reports in a single …
As far as I know, we do not currently use Domo; however, I've seen some demos of their product. Domo is very good with cloud-based software, and it also incorporates social media data. Domo is also good at cloud-based Excel file building vs. building spreadsheets on my …
Renowned digital analytics consultant, innovator, speaker, thought leader
Chose Tableau Desktop
I haven't used other tools for a number of years - when I made the selection my criteria were ease of use (including, slicing & dicing data at will), connectivity to various data sources (especially REST API - which Tableau doesn't support natively but now has a way to use …
Cass evaluated Domo, QlikView and Birst prior to selecting Tableau. It came down to cost (and by a significant margin); the others have relatively high implementation, hosting and other costs. Additionally, based on a recent Gartner "Magic Quadrant", Tableau exceeds all others …
I had the trial version of Tableau Desktop downloaded, installed, configured and was creating meaningful dashboards in almost 15 minutes. While other software we used had great features, none of them were able to compare with this trial experience. Tableau's user forums were …
Verified User
Engineer
Chose Tableau Desktop
Python is a programming tool, while Tableau is an easy-to-use drag-and-drop data visualization tool. This may not be an apples-to-apples comparison. Compared to Excel, Tableau is way over the top when it comes to data visualization.
Tableau is by far the superior product when it comes to analysis, ease of use, and end user experience. People are usually more familiar with Excel so it can be difficult to break them out of their comfort zone. Lastly, when it comes to subscriptions, SSRS is the tool I prefer. …
The primary factors for choosing Tableau were the licensing costs; ability to view data from multiple data sources; the ease of infrastructure to setup; and ability for users to create and maintain their own worksheets without the need for IT assistance.
Verified User
Analyst
Chose Tableau Desktop
Tableau Desktop needs no to minimal coding experience. It easily integrates with various data sources. It is very easy to create usable smart reports.
In full disclosure, we evaluated QlikView and Tableau for a Fortune 500 corporation 15 months ago. To be all too brief, we found QlikView to be a very good tool, but more IT-dependent than Tableau. Newer features and functions may offset some of this one significant difference.
Tableau is the most powerful and easy to use of the alternatives, as long as the data sources are properly connected. None of the other tools have allowed us to connect and integrate data into one report in the way that Tableau's data connectivity allows us to. Then the ability …
We have evaluated QlikView as well. QlikView is another tool in the "Self-Service BI" world. However, its focus is mainly on creating ad-hoc data models and using these models to make visualizations. Tableau, on the other hand, will take an existing data model and make much …
The choice to use Tableau Server is really made for you if you already have adopted Tableau Desktop. If you're focused on an on-premise solution, Tableau is probably the way that you'll have to go. Looker and Mode are cloud-based (so is Tableau Online) and offer a true …
There were a lot of reasons why we chose Tableau and the least is the cost but also the way Tableau stores data in the columnar fashion instead of in Cubes. We went through a painstaking selection process and at the end, came down to a couple of vendors and we ended up with the …
We still use Microsoft Excel for much of the lighter, day-to-day pivot tables or calculations. We see Tableau as the future however and are slowly tying more and more of our standard work with Tableau. Smartsheet isn't a 1:1 example, but it was considered for importing …
I did not choose Tableau for my organization, but did choose my organization in part because they use Tableau! Fantastic flexibility combined with relative ease of visualization.
A comprehensive proof-of-concept study was done. We evaluated different vendors and also considered strategic reports (like Gartner's) to make a decision. Tableau was the winner. The developers especially liked it, because integrating it into the existing system was very easy.
Sisense was another tool I came across, but I chose Tableau over Sisense as an end-to-end tool for data visualisation and BI. Tableau is the complete data visualisation tool, which is what I was looking for. So, I chose Tableau. Plus, it's easy to use and there are no complex …
Tableau is better than Splunk in analyzing the unstructured data and displaying all relevant information to the user. I have used Splunk but it does not provide the information of every component of a system, it just drills down to log analytics. Tableau is beyond Splunk, as …
QlikView, Tibco Spotfire, SAS, and SAP. At the time, all cost more than Tableau for our (small) needs, SAS and SAP were in some ways overqualified in terms of breadth, and none of them had the ease of use of Tableau.
Tableau Server has many competitors, two primary ones would be SAP Business Objects and Microsoft PowerView through Tabular Analysis Services. I have worked with all three products. First and foremost, in terms of data visualization Tableau is the best by far. However there are …
This search turned up a number of candidates. I think the main alternative considered was SiSense. Tableau Server with Tableau Desktop was the most expensive solution but I was convinced it actually represented the best value.
Tableau by far has the most intuitive interface and best out of the box looks for presentation. The speed of development and ease of development is unbeatable.
QlikView can't connect to live data (in general). Licensing costs of QlikView and Cognos are expensive. Cognos doesn't have excellent graphics embedded within the tool.
I've personally used a vast majority of the business intelligence products on the marketplace. I've used all of the Oracle products over the past few years. I've used all of the products in the Microsoft stack, along with Cognos, QlikView, etc. Each is effective if your …
Sr. Data Analyst and Tableau SME for North America
Chose Tableau Server
We also looked at Spotfire and QlikView.
Verified User
Administrator
Chose Tableau Server
Three "self service" BI tools were looked at: Tableau, Spotfire & Qlikview. To put it very simply, Spotfire had a lot of overlap with a tool that was already present at the bank, SAS. QlikView's biggest negative was that everything was brought in via RAM, and there are gigantic …
Vice President of Product Management & Engineering
Chose Tableau Server
We evaluated Tableau Server against all the major players out there. We had a bad experience with one of the major players and switched them out for Tableau. It was one of the best business decisions we have made due to our experience with Tableau and their team. Tableau offers …
It is perfectly suited for statistical analyses, but I would not recommend JMP for users who do not have a statistical background. As previously stated, the learning curve is exceptionally steep, and I think that it would prove too steep for those without statistical background or knowledge.
The best scenario is definitely to collect data from several sources and create dedicated dashboards for specific recipients. However, I miss the possibility of explaining these reports in more detail. Sometimes, we order a report, and after half a year, we don't remember the meaning of some data (I know it's our fault as an organization, but the tool could force better practices).
Whole-funnel and specific channel performance, from upper- to lower-funnel metrics. The ability to view full channel performance over a period, such as weekly, monthly, or quarterly, has truly been monumental in how my team optimizes specific channels and campaigns. Daily performance tracking is a bit overwhelming, with load times and having to refresh specific live views over time. It can be challenging at times, as extensive dashboards take much longer to load.
JMP is designed from the ground up to be a tool for analysts who do not have PhDs in statistics, without in any way "dumbing down" the level of statistical analysis applied. In fact, JMP operationalizes the most advanced statistical methods. JMP's design is centered on the JMP data table and dialog boxes. It is data-focused, not jargon-focused. So, unlike other software where you must choose the correct statistical method (e.g., contingency, ANOVA, linear regression), with JMP you simply assign the columns in a dialog to roles in the analysis, and it chooses the correct statistical method. It's a small thing, but it reflects the thinking of the developers: analysts know their data and should only have to think about their data. Analyses should flow from there.
JMP makes most things interactive and visual. This makes analyses dynamic and engaging, and obviates complete dependence on understanding p-values and other statistical concepts (though they are all there) that are often found to be foreign or intimidating.
One of the best examples of this is JMP's profiler. Rather than looking at static figures in a spreadsheet, or a series of formulas, JMP profiles the formulas interactively. You can monitor the effect of changing factors (Xs) and see how they interact with other factors and the responses. You can also specify desirability (maximize, minimize, match target) and the factors' relative importance to find settings that are optimal. I have spent many lengthy meetings working with the profiler to review design and process options, with never a dull moment.
The design of experiments (DOE) platform is simply outstanding and, in fact, the principal developers of it have won several awards. Over the last 15 years, using methods broadly known as an "exchange algorithm," JMP can create designs that are far more flexible than conventional designs. This means, for example, that you can create a design with just the interactions that are of interest; you can selectively choose those interactions that are not of interest and drop collecting their associated combinations.
Classical designs are rigid. For example, a Box-Behnken or other response surface design can have only continuous factors. What if you want to investigate these continuous factors along with categorical factors, such as different materials or different furnace designs, and look at the interactions among all factors? This common scenario cannot be handled with conventional designs but is easily accommodated with JMP's Custom DOE platform.
The whole point of DOE is to be able to look at multiple effects comprehensively but determine each one's influence in near or complete isolation. The custom design platform, because it produces unique designs, provides the means to evaluate just how isolated the effects are. This can be done before collecting data, because this important property of the DOE is a function of the design, not the data. By evaluating graphical reports of the quality of the design, the analyst can make adjustments, adding or removing runs, to optimize cost, effort, and expected learnings.
Over the last several releases of JMP, which now appear about every 18 months, they have moved beyond dialog boxes to direct, drag-and-drop analyses for building graphs and tables as well as statistical process control charts. Interactivity such as this allows analysts to "be in the moment." As with all aspects of JMP, analysts can think about their subject matter without the cumbersomeness of having to think about statistical methods. It's rather like a CEO thinking about growing the business without having to think about every nuance and intricacy of accounting. The statistical thinking is built into the design of JMP.
Without data, analysis is not possible. Getting data into a state where it can be analyzed can be a major hassle. JMP can pull data from a variety of sources, including Excel spreadsheets, CSV files, direct data feeds, and databases via ODBC. Once the data is in JMP, it has all the expected data manipulation capabilities to shape it for analysis.
Back in 2000, JMP added a scripting language (JMP Scripting Language, or JSL for short). With JSL you can automate routine analyses so end users don't have to code, add specific analyses that JMP does not do out of the box, and create entire analytical systems and workflows. We have done all three. For example, one consumer products company we are working with now needs a variant of a popular non-parametric analysis that they have employed for years. This method will appear in one of the menus as if it were part of JMP to begin with. As for large systems, we have written some that run to tens of thousands of lines and take the form of virtual labs and process control systems, among others.
JSL applications can be bundled and distributed as JMP add-ins, which makes it really easy for users to extend their JMP installation. All they need to do is double-click the add-in file and it's installed. Pharmaceutical companies and others who are regulated, or who simply want to control the JMP environment, can lock down JMP's installation and prevent users from adding or changing functionality. Here, add-ins can be distributed to users worldwide from a central location that is authorized and protected.
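For readers unfamiliar with JSL, a minimal sketch gives the flavor of the scripting described above. This is an illustrative example only, using a sample table that ships with JMP; it is not taken from the reviewer's actual scripts:

```jsl
// Minimal JSL sketch (illustrative example, not from the review)
Names Default To Here( 1 );

// Open one of the sample data tables that ships with JMP
dt = Open( "$SAMPLE_DATA/Big Class.jmp" );

// Assign columns to roles; JMP launches the appropriate analysis platform
dt << Bivariate( Y( :height ), X( :weight ), Fit Line( 1 ) );
```

A script like this can be bundled into an add-in so that users install and run it with a double-click, as the paragraph above describes.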
JMP's technical support is second to none. They take questions by phone and email. I usually send email knowing that I'll get an informed response within 24 hours, and if they cannot resolve a problem right away, they proactively keep you informed about what is being done to resolve the issue or answer your question.
An excellent tool for data visualization, it presents information in an appealing visual format—an exceptional platform for storing and analyzing data in any size organization.
Through interactive parameters, it enables real-time interaction with the user and is easy to learn and get support from the community.
It's good at doing what it is designed for: accessing visualizations without having to download and open a workbook in Tableau Desktop. The latter would be a very inefficient method for sharing our metrics, so I am glad that we have Tableau Server to serve this function.
Publishing to Tableau Server is quick and easy. Just a few clicks from Tableau Desktop and a few seconds of publishing through an average speed network, and the new visualizations are live!
Seeing details on who has viewed a visualization and when. This is particularly useful to me for trying to drive adoption of some new pages, so I really appreciate the granularity provided in Tableau Server.
In general JMP is much better fit for a general "data mining" type application. If you want a specific statistics based toolbox, (meaning you just want to run some predetermined test, like testing for a different proportion) then JMP works, but is not the best. JMP is much more suited to taking a data set and starting from "square 1" and exploring it through a range of analytics.
The CPK (process capability) module output is shockingly poor in JMP. This sticks out because, while as a rule everything in JMP is very visual and presentable, the CPK graph is a single line on a grey background. It is not intuitive and really doesn't tell the story. (This is in contrast with a capability graph in Minitab, which is intuitive and tells a story right off.) This is also the case with the "gauge study" output, used for multi-vari analysis in a Six Sigma project. It is not intuitive, and you need to do a lot of tweaking to make the graph tell you the story right off. I have given this feedback to JMP, and it is possible that it will be addressed in future versions.
I've never heard of JMP allowing floating licenses in a company. This will ALWAYS be a huge sticking point for small to mid-size companies that don't have teams of people dedicated to analytics all day. If every person who does problem solving needs his or her own seat, the cost can be prohibitive. (It gets cheaper per seat as you add licenses, but for a small company that might get no more than 5 users, it is still a hard sell.)
Tableau Server has had some issues handling some of our larger data sets. Our extract refreshes fail intermittently with no obvious error that we can fix.
Tableau Server was hard to work with before they launched their new REST API, which is itself a little tricky to work with.
JMP has been good at releasing updates and adding new features, and their support is good. Analytics is quick, and you don't need scripting/programming experience. It has been used organization-wide and works well in that respect. Open source means that there are concerns regarding timely support. Cheap licensing and easy to maintain.
Our use of Tableau Desktop is still fairly low, but it will continue to grow over time. The only real concern is the cost of the licenses; I have mentioned this to Tableau and fully expect the development of more sensible models for our industry. This will remove any impediment to expanding our use.
It simply is used all the time by more and more people. Migrating to something else would involve lots of work and lots of training. The renewal fee being fair, it simply isn't worth migrating to a different tool for now.
The GUI makes it easier to generate plots and find statistics without having to write code. JSL scripting has a bit of a steep learning curve but gives you more ability to customize your analysis. Overall, I would recommend JMP as a good product for overall usability.
Tableau Desktop has proven to be a lifesaver in many situations. Once we've completed the initial setup, it's simple to use. It has all of the features we need to quickly and efficiently synthesize our data. Tableau Desktop has advanced capabilities to improve our company's data structure and enable self-service for our employees.
Tableau Server takes training and experience in order to unlock the application's full potential. This is best handled by a qualified data scientist or data analytics manager. Tableau's user interface layout, nomenclature, and command structure take time and training to become proficient with. Integration and connectivity require proper IT developer support.
When used as a stand-alone tool, Tableau Desktop has unlimited uptime, which is always nice. When used in conjunction with Tableau Server, this tool has as much uptime as your server admins are willing to give it. All in all, I've never had an issue with Tableau's availability.
Our instance of Tableau Server was hosted on premises (I believe all instances are), so if there were any outages it was normally due to scheduled maintenance on our end. If the Tableau Server ever went down, a quick restart solved most issues.
Tableau Desktop's performance is solid. You can really dig into a large dataset in the form of a spreadsheet, and it exhibits similarly good performance when accessing a moderately sized Oracle database. I noticed that with Tableau Desktop 9.3, performance using a spreadsheet started to slow at around 75K rows by about 60 columns. This was easily remedied by creating an extract and pushing it to Tableau Server, where performance became lightning fast.
While there are definitely cases where a user can do things that will make a particular worksheet or dashboard run slowly, overall the performance is extremely fast. The user experience of exploratory analysis particularly shines; there's nothing out there with the polish of Tableau.
Support is great: ease of contact, rapid response, and willingness to 'stick to the task' until resolution, or acknowledgement that the problem would have to be resolved in a future build. Basically, one gets the very real sense that another human being is sensitive to your problems, great or small.
Tableau support has been extremely responsive and willing to help with all of our requests. They have assisted with creating advanced analysis and many different types of custom icons, data formatting, formulas, and actions embedded into graphs. Tableau offers a weekly presentation of features and assists with internal company projects.
We have consistently had highly satisfactory results every time we've reached out for help. Our contractor, used for Tableau Server maintenance and dashboard development, is very technically skilled. When he hits a roadblock on how to do something with Tableau, the support staff have provided timely and useful guidance. He frequently compares it to Cognos and says that while Cognos has capabilities Tableau doesn't, the bottom-line value for us is a no-brainer.
It is admittedly hard to train a group of people with disparate levels of ability coming in, but the software is so easy to use that this is not a huge problem; anyone who can follow simple instructions can catch up pretty quickly.
In our case, they hired a private third-party consultant to train our department. It was extremely boring and felt like it dragged on. Everything I learned was self-taught, so I was not really paying attention. But I do think that you could easily spend a week on the tool and go over every nook and cranny. We only had the consultant in for a day or two.
I think the training was good overall, but it was maybe stating the obvious things that a tech-savvy young engineer would be able to pick up on their own. However, the example workbooks were good, and the Tableau web community has helped me with many problems.
The Tableau website is full of videos that you can follow at your own pace. As a very small company with a Tableau install, access to these free resources was incredibly useful to allowing me to implement Tableau to its potential in a reasonable and proportionate manner.
Again, training is the key and the company provides a lot of example videos that will help users discover use cases that will greatly assist their creation of original visualizations. As with any new software tool, productivity will decline for a period. In the case of Tableau, the decline period is short and the later gains are well worth it.
Implementation was over the phone with the vendor and did not go particularly well. Again, I think this was our fault, as our integration and IT oversight was poor and we made errors. Would they have happened had a vendor been onsite? Not sure, probably not, but we probably wouldn't have paid for that either.
MS Excel with the Analysis ToolPak provides a home-grown solution, but it requires a high degree of upkeep and is difficult to hand off. Minitab is the closest competitor, but JMP is better suited to the production environment, roughly equivalent in price, and has superior support.
I have used Power BI as well; the pricing is better, and training costs and certifications are not that high. Since there is Python integration in Power BI, I can use data cleaning and visualization libraries and also some machine learning models. I can import my Python scripts and create a visualization on the processed data.
Today, if my shop is largely Microsoft-centric, I would be hard pressed to choose a product other than Power BI. Tableau was the visualization leader for years, but Microsoft has caught up with them in many areas, and surpassed them in some. Its ability to source, transform, and model data is superior to Tableau. Tableau still has the lead in some visualizations, but Power BI's rise is evidenced by its ever-increasing position in the leadership section of the Gartner Magic Quadrant.
Tableau Desktop's scalability is really limited by the scale of your back-end data systems. If you want to pull down an extract and work quickly in memory, in my application it scaled to a few tens of millions of rows using the in-memory engine. Beyond that, it's really only limited by your back-end data store, provided you have or are willing to invest in an optimized SQL store or a purpose-built query engine like Vertica or Netezza or something similar.
ROI: Even if the cost can be high, the insights you get out of the tool are definitely much more valuable than the actual cost of the software. In my case, most of the results of our analysis were shown to the client, who was blown away, making the money spent well worth it for us.
Potential negative: If you are not sure your team will use it, there's a chance you will just waste money. Sometimes the IT department tries to deploy a better tool for the entire organization, but people keep using the old tool they are used to (most likely MS Excel).
Tableau was acquired years ago, and has provided good value with the content created.
Ongoing maintenance costs for the platform, both to maintain desktop and server licensing has made the continuing value questionable when compared to other offerings in the marketplace.
Users have largely been satisfied with the content, but not with the overall performance. This is due to a combination of factors including the performance of the Tableau engines as well as development deficiencies.
Tableau does take dedicated FTE to create and analyze the data. It's too complex (and powerful) a product not to have someone dedicated to developing with it.
There is some significant setup required for the server product.
Once server setup is complete, it's largely "fire and forget" until an update is necessary. The server update process is cumbersome.