JMP® is statistical analysis software whose capabilities span data access to advanced statistical techniques, with one-click sharing. The software is interactive, visual, and statistically deep, allowing users to see and explore data.
$1,320
per year per user
Tableau Cloud
Score 7.9 out of 10
N/A
Tableau Cloud (formerly Tableau Online) is a self-service analytics platform that is fully hosted in the cloud. Tableau Cloud enables users to publish dashboards and invite colleagues to explore hidden opportunities with interactive visualizations and accurate data, from any browser or mobile device.
$15
per month per user
Tableau Desktop
Score 8.4 out of 10
N/A
Tableau Desktop is a data visualization product from Tableau. It connects to a variety of data sources for combining disparate data sources without coding. It provides tools for discovering patterns and insights, data calculations, forecasts, and statistical summaries and visual storytelling.
$1,380
per year (purchased via a Creator license)
Pricing
JMP
Tableau Cloud
Tableau Desktop
Editions & Modules
JMP
$1,320
per year per user
Tableau Viewer
$15
per month billed annually per user
Enterprise Viewer
$35
per month billed annually per user
Tableau Explorer
$42
per month billed annually per user
Enterprise Explorer
$70
per month billed annually per user
Tableau Creator
$75
per month billed annually per user
Enterprise Creator
$115
per month billed annually per user
Tableau+
Contact Sales
Tableau Creator License
$115
per month (billed annually) per user
Offerings
Pricing Offerings
                                          JMP            Tableau Cloud   Tableau Desktop
Free Trial                                Yes            No              No
Free/Freemium Version                     No             No              Yes
Premium Consulting/Integration Services   No             No              Yes
Entry-level Setup Fee                     No setup fee   No setup fee    No setup fee
Additional Details
JMP: Bulk discounts available.
Tableau Cloud: —
Tableau Desktop: All pricing plans are billed annually. A Creator license includes Tableau Desktop, Tableau Prep Builder, and Tableau Pulse. Volume discounts are sometimes available.
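Since the Tableau rates above are quoted per month but billed annually, a quick sketch of the annual per-user arithmetic (prices taken from the table above, not authoritative) helps the comparison with JMP's flat $1,320/year:

```python
# Monthly "billed annually" per-user list rates from the comparison table above.
monthly_rates = {
    "Tableau Viewer": 15,
    "Enterprise Viewer": 35,
    "Tableau Explorer": 42,
    "Enterprise Explorer": 70,
    "Tableau Creator": 75,
    "Enterprise Creator": 115,
}

# Convert each monthly rate to an annual per-user cost.
annual = {edition: rate * 12 for edition, rate in monthly_rates.items()}
for edition, cost in annual.items():
    print(f"{edition}: ${cost:,}/year")
# Enterprise Creator works out to $1,380/year, the same figure listed above
# for Tableau Desktop via a Creator license; JMP is a flat $1,320/year.
```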
Compared to other, similar programs, JMP is outstanding in ease of use and ability to be used by almost anyone across an organization. It is more fluid, user friendly, and, most importantly, requires no coding experience. The only two areas where it is not as good as …
For me, JMP is the best and easy way to run regressions. I wouldn't use it for other more advanced models. I decided to use it because we got it for free since we are technically an academic institution.
I feel like Tableau is easier to use compared with the SAP Business Objects. Both have a bit of a learning curve but I felt that Tableau Online was still more intuitive and user friendly. Tableau Desktop is a powerhouse. It has a very steep learning curve but once you master …
Tableau Online is much simpler than other Business Intelligence tools such as SAS and SAP Lumira. While SAS allows you to create algorithms to display a set, Tableau Online provides a more friendly user interface for ease of access. Although it does not stack up too well with …
From an analyst point of view, Tableau is the most intuitive tool and it's really easy to use. It's simply the most convenient product and gives the biggest possibilities. Of course, it's more expensive and not all features are necessary for some users. I have chosen Tableau …
Tableau Online is much better at presenting and visualizing and manipulating your data. While Host Analytics is second to none in data consolidation, Tableau has much greater flexibility in exploring that data.
Tableau Desktop is great because it has much more extensive capabilities. Tableau Prep is great for ETL. It makes it easy to aggregate multiple data sources, union, clean, etc. It is easy to QA within Prep, and takes a lot of the guesswork out of troubleshooting issues with …
Like previously mentioned, Online and Desktop were eventually rolled together to be one offering the last time I checked. If you'll be sharing reports with other Tableau users then Desktop would be just fine.
Looker ended up as the winning product due to its ease of use and flexibility. It's easy for nontechnical stakeholders to learn how to create their Explores. But Tableau gives us more flexibility in creating highly customized visualizations, so analysts still rely on it.
When weighing the pros and cons of Tableau Online vs. SAP ERP, two key considerations emerged as clear winners. SAP ERP is a powerful data purification tool, but it doesn't measure up to the competition in terms of data presentation. When it came to data visualization and …
Sisense offers a powerful backend database, Elasticube, that integrates well with Web Service data sources. Tableau enables better visualization flexibility and functionality without having to write JavaScript.
Both Tableau Online and BI solutions provide visualizations. In Power BI we choose the visualization first, then drag the data into it. In Tableau, we select the data and switch between visualizations on the fly. It’s easier to jump between visualizations in Tableau. Power BI …
Verified User
Analyst
Chose Tableau Cloud
I think Tableau is better for a bigger firm with more data than MicroStrategy is. While MicroStrategy seems to be more user-friendly in terms of customization on the fly, and I find it a bit better organized - which is simply my preference of organizational style - my …
With Tableau Desktop, it's easy to create a report in the context quickly. It allows for the seamless management of the data sources, which is convenient for the data users. Because it is simple to use, it is …
Verified User
Analyst
Chose Tableau Desktop
The online and public versions are only good for the hobbyist because they are not secure enough for most business applications. Dapresy is a marketing tool that is supposed to give executives a snapshot of marketing results. It's not very customizable and the results are …
It is very easy to use; we can create any number of charts with it, which I think other tools lack. There are lots of online communities that have provided solutions to the basic issues. Its ODS (output delivery system) is also very effective. We can use SQL in it for …
We were interested in expedience at reasonable cost and so didn't do any sort of bakeoff, but tried Tableau first as a potential solution for moving beyond Excel for large scale data analytics. We picked it because it more than met our functional needs at a very reasonable …
Verified User
Analyst
Chose Tableau Desktop
I feel like Tableau is easier to use and offers a greater selection of visualizations. I feel that the dashboards are easier to put together and offer a great amount of flexibility for the end-user. Tableau has an excellent user support group. I find the community to be …
My current work environment uses Tableau Online, MicroStrategy, and SSRS in parallel. Tableau is much closer to SSRS as a visualization tool, whereas MicroStrategy is an enterprise data modeling and reporting tool.
Based on the use case we use different tools. Here …
Cass evaluated Domo, QlikView and Birst prior to selecting Tableau. It came down to cost (and by a significant margin); the others have relatively high implementation, hosting and other costs. Additionally, based on a recent Gartner "Magic Quadrant", Tableau exceeds all others …
It is perfectly suited for statistical analyses, but I would not recommend JMP for users who do not have a statistical background. As previously stated, the learning curve is exceptionally steep, and I think it would prove too steep for those without a statistical background or knowledge.
If you're using Tableau as the primary BI tool, then Tableau Cloud is well suited to publish and share the results with a wide(r) audience. It is well suited for various degrees of self-service proficiency, from pure consumers of analytical work to more advanced users who can use web editing for smaller or larger adjustments, and even for desktop power users who will publish their work to Tableau Cloud. It has many good ways to organize the content and make it easily accessible via search, favorites, folders, collections ("playlists for your data"), or history ("recents"). It might not be ideally suited if there are many on-prem sources to be used (even though there are options to connect them) or if you have very special requirements regarding custom server setup, which is limited in a shared cloud environment like Tableau Cloud.
The best scenario is definitely to collect data from several sources and create dedicated dashboards for specific recipients. However, I miss the possibility of explaining these reports in more detail. Sometimes, we order a report, and after half a year, we don't remember the meaning of some data (I know it's our fault as an organization, but the tool could force better practices).
JMP is designed from the ground up to be a tool for analysts who do not have PhDs in statistics, without in any way "dumbing down" the level of statistical analysis applied. In fact, JMP operationalizes the most advanced statistical methods. JMP's design is centered on the JMP data table and dialog boxes. It is data-focused, not jargon-focused. So, unlike other software where you must choose the correct statistical method (e.g., contingency, ANOVA, linear regression), with JMP you simply assign the columns in a dialog to roles in the analysis, and it chooses the correct statistical method. It's a small thing, but it reflects the thinking of the developers: analysts know their data and should only have to think about their data. Analyses should flow from there.
JMP makes most things interactive and visual. This makes analyses dynamic and engaging, and it obviates complete dependence on p-values and other statistical concepts (though they are all there) that are often found foreign or intimidating.
One of the best examples of this is JMP's profiler. Rather than looking at static figures in a spreadsheet, or a series of formulas, JMP profiles the formulas interactively. You can monitor the effect of changing factors (Xs) and see how they interact with other factors and the responses. You can also specify desirability goals (maximize, minimize, match target) and their relative importance to find factor settings that are optimal. I have spent many lengthy meetings working with the profiler to review design and process options with never a dull moment.
The design of experiments (DOE) platform is simply outstanding; in fact, its principal developers have won several awards. Over the last 15 years, using methods broadly known as "exchange algorithms," JMP has been able to create designs that are far more flexible than conventional designs. This means, for example, that you can create a design with just the interactions that are of interest; you can selectively drop those interactions that are not of interest and avoid collecting their associated combinations.
Classical designs are rigid. For example, a Box-Behnken or other response surface design can have only continuous factors. What if you want to investigate these continuous factors along with categorical factors, such as different materials or furnace designs, and look at the interactions among all factors? This common scenario cannot be handled with conventional designs but is easily accommodated with JMP's Custom DOE platform.
The whole point of DOE is to be able to look at multiple effects comprehensively while determining each one's influence in near or complete isolation. Because it produces unique designs, the custom design platform provides the means to evaluate just how isolated the effects are. This can be done before collecting data, because this important property of a DOE is a function of the design, not the data. By evaluating these graphical reports of the quality of the design, the analyst can make adjustments, adding or reducing runs, to optimize cost, effort, and expected learnings.
Over the last several releases of JMP, which now appear about every 18 months, they have moved beyond dialog boxes to direct, drag-and-drop analyses for building graphs and tables as well as statistical process control charts. Interactivity such as this allows analysts to "be in the moment." As with all aspects of JMP, they can think about their subject matter without the cumbersomeness of having to think about statistical methods. It's rather like a CEO thinking about growing the business without having to think about every nuance and intricacy of accounting. The statistical thinking is baked into the design of JMP.
Without data, analysis is not possible, and getting data into a state where it can be analyzed can be a major hassle. JMP can pull data from a variety of sources, including Excel spreadsheets, CSV files, direct data feeds, and databases via ODBC. Once the data is in JMP, it has all the expected data manipulation capabilities to shape it for analysis.
Back in 2000, JMP added a scripting language (JMP Scripting Language, or JSL for short). With JSL you can automate routine analyses (so end users need not code), add specific analyses that JMP does not do out of the box, and create entire analytical systems and workflows. We have done all three. For example, one consumer products company we are working with now needs a variant of a popular non-parametric analysis that they have employed for years. This method will be found in one of the menus and will appear as if it were part of JMP to begin with. As for large systems, we have written some that are tens of thousands of lines, taking the form of virtual labs and process control systems, among others.
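As a small, hedged illustration of the kind of JSL automation described above (not the consulting code itself), a script can open a data table and send it an analysis message; `Big Class.jmp` is a sample table that ships with JMP:

```
// Minimal JSL sketch: open a sample table and launch an analysis platform.
// "Big Class.jmp" ships with JMP under the $SAMPLE_DATA path.
dt = Open( "$SAMPLE_DATA/Big Class.jmp" );

// The << operator sends a message to the table; here it launches
// a Bivariate fit of weight against height.
biv = dt << Bivariate( Y( :weight ), X( :height ) );
```

Scripts like this can be saved to a menu or bundled into an add-in, which is how a custom method can appear "as if it were part of JMP."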
JSL applications can be bundled and distributed as JMP add-ins, which makes it really easy for users to extend their JMP installation. All they need to do is double-click the add-in file and it's installed. Pharmaceutical companies and others who are regulated, or who simply want to control the JMP environment, can lock down JMP's installation and prevent users from adding or changing functionality. Here, add-ins can be distributed to users worldwide from a central location that is authorized and protected.
JMP's technical support is second to none. They take questions by phone and email. I usually send email knowing that I'll get an informed response within 24 hours, and if they cannot resolve a problem immediately, they proactively keep you informed about what is being done to resolve the issue or answer your question.
Tableau Online is completely cloud-based, which is why reports and dashboards are accessible even on the go. One doesn't always need the office laptop to access the reports.
The visualizations are interactive, and one can quickly change the level at which they want to view the information. For example, one person might be more interested in country-level performance than client-level. This is intuitive, and one doesn't need to create multiple reports for the same purpose.
The feature to ask questions in plain English is great and helpful. For quick ad hoc fact checks, one can simply type what they are looking for, and the natural language processing algorithms under the hood parse the query, interpret it, and then fetch the results in visual form.
An excellent tool for data visualization, it presents information in an appealing visual format—an exceptional platform for storing and analyzing data in any size organization.
Through interactive parameters, it enables real-time interaction with the user and is easy to learn and get support from the community.
In general, JMP is a much better fit for a general "data mining" type of application. If you want a specific statistics-based toolbox (meaning you just want to run some predetermined test, like testing for a difference in proportions), then JMP works but is not the best. JMP is much more suited to taking a data set, starting from square one, and exploring it through a range of analytics.
The Cpk (process capability) module output is shockingly poor in JMP. This sticks out because, while as a rule everything in JMP is very visual and presentable, the Cpk graph is a single-line-on-grey-background drawing. It is not intuitive and really doesn't tell the story. (This is in contrast with a capability graph in Minitab, which is intuitive and tells a story right off.) This is also the case with the gauge study output, used for multi-vari analysis in a Six Sigma project. It is not intuitive, and you need to do a lot of tweaking to make the graph tell you the story right off. I have given this feedback to JMP, and it is possible that it will be addressed in future versions.
I've never heard of JMP allowing floating licenses in a company. This will ALWAYS be a huge sticking point for small to mid-size companies that don't have teams of people dedicated to analytics all day. If every person who does problem solving needs his or her own seat, the cost can be prohibitive. (It gets cheaper per seat as you add licenses, but for a small company that might have no more than 5 users, it is still a hard sell.)
JMP has been good at releasing updates and adding new features and their support is good. Analytics is quick and you don't need scripting/programming experience. It has been used organization wide, and works well in that respect. Open source means that there are concerns regarding timely support. Cheap licensing and easy to maintain.
Our use of Tableau Desktop is still fairly low, but will grow over time. The only real concern is the cost of the licenses; I have mentioned this to Tableau and fully expect the development of more sensible models for our industry, which will remove any impediment to expanding our use.
The GUI interface makes it easier to generate plots and find statistics without having to write code. The JSL scripting is a bit of a steep learning curve but does give you more ability to customize your analysis. Overall, I would recommend JMP as a good product for overall usability.
Based on comments from our clients, I awarded it this grade. Non-technical customers frequently compliment us on the ease with which they can utilize Tableau Online. Usability is rarely a source of contention amongst our customers. Few complaints have come from me as a user of our internal products.
Tableau Desktop has proven to be a lifesaver in many situations. Once we've completed the initial setup, it's simple to use. It has all of the features we need to quickly and efficiently synthesize our data. Tableau Desktop has advanced capabilities to improve our company's data structure and enable self-service for our employees.
When used as a stand-alone tool, Tableau Desktop has unlimited uptime, which is always nice. When used in conjunction with Tableau Server, this tool has as much uptime as your server admins are willing to give it. All in all, I've never had an issue with Tableau's availability.
Tableau Desktop's performance is solid. You can really dig into a large dataset in the form of a spreadsheet, and it exhibits similarly good performance when accessing a moderately sized Oracle database. I noticed that with Tableau Desktop 9.3, performance with a spreadsheet started to slow around 75K rows by about 60 columns. This was easily remedied by creating an extract and pushing it to Tableau Server, where performance became lightning fast.
Support is great, offering ease of contact, rapid response, and willingness to stick to the task until resolution or acknowledgement that the problem would have to be resolved in a future build. Basically, one gets the very real sense that another human being is sensitive to your problems, great or small.
I have not had any issues that require customer support from Tableau at this time, which speaks well to Tableau. I have taken an online course with Tableau and it was very professional and well done, so based on that I would assume a similar level of quality for their customer service.
Tableau support has been extremely responsive and willing to help with all of our requests. They have assisted with creating advanced analysis and many different types of custom icons, data formatting, formulas, and actions embedded into graphs. Tableau offers a weekly presentation of features and assists with internal company projects.
It is admittedly hard to train a group of people with disparate levels of ability coming in, but the software is so easy to use that this is not a huge problem; anyone who can follow simple instructions can catch up pretty quickly.
I think the training was good overall, but it was perhaps stating the obvious things that a tech-savvy young engineer would be able to pick up on their own. However, the example workbooks were good, and the Tableau web community has helped me with many problems.
Again, training is the key and the company provides a lot of example videos that will help users discover use cases that will greatly assist their creation of original visualizations. As with any new software tool, productivity will decline for a period. In the case of Tableau, the decline period is short and the later gains are well worth it.
MS Excel with the Analysis ToolPak provides a home-grown solution, but requires a high degree of upkeep and is difficult to hand off. Minitab is the closest competitor, but JMP is better suited to the production environment, roughly equivalent in price, and has superior support.
In determining whether to go with Tableau Online versus Alteryx, two important factors stood out. First, while Alteryx is an impressive tool for data cleansing, it did not stack up in terms of data visualization capabilities. Tableau, on the other hand, provided us everything we needed for visualizing our data and analytics. The second factor is cost. While neither solution would be considered cheap, Tableau was the more cost-effective solution for our needs.
I have used Power BI as well; its pricing is better, and training and certification costs are not as high. Power BI also has Python integration, so I can use data cleaning and visualization libraries as well as some machine learning models; I can import my Python scripts and create visualizations on the processed data.
Tableau Desktop's scalability is really limited by the scale of your back-end data systems. If you want to pull down an extract and work quickly in memory, in my application it scaled to a few tens of millions of rows using the in-memory engine. Beyond that, it's really only limited by your back-end data store, if you have or are willing to invest in an optimized SQL store or a purpose-built query engine like Vertica or Netezza.
ROI: Even if the cost can be high, the insights you get out of the tool are definitely worth much more than the actual cost of the software. In my case, most of the results of our analysis were shown to the client, who was blown away, making the money well spent for us.
Potential negative: If you are not sure your team will use it, there's a chance you will just waste money. Sometimes the IT department tries to deploy a better tool for the entire organization, but people keep using the old tool they are used to (most likely MS Excel).
Tableau was acquired years ago, and has provided good value with the content created.
Ongoing maintenance costs for the platform, for both desktop and server licensing, have made the continuing value questionable compared with other offerings in the marketplace.
Users have largely been satisfied with the content, but not with the overall performance. This is due to a combination of factors including the performance of the Tableau engines as well as development deficiencies.