JMP® is statistical analysis software whose capabilities span from data access to advanced statistical techniques, with one-click sharing. The software is interactive and visual, and statistically deep, allowing users to see and explore their data.
$1,320
per year per user
Tableau Public
Score 9.8 out of 10
N/A
Tableau Public is a free edition of the Desktop product. With this edition, data can only be published to the Tableau Public website; work cannot be saved or exported locally.
$0
per month
Tableau Server
Score 7.6 out of 10
N/A
Tableau Server allows Tableau Desktop users to publish dashboards to a central server to be shared across their organizations. The product is designed to facilitate collaboration across the organization. It can be deployed on a server in the data center, or it can be deployed on a public cloud.
Tableau Public provides a variety of visualizations and point-and-click functionality with little or no scripting, which gives Tableau the advantage. Also, being lightweight, Tableau Public has proven easy to use for our PSU bank clients who run low-end hardware and devices. Tableau …
We evaluated about 15 products when we selected Tableau 7 years ago, and periodically review products from other vendors (e.g. Microsoft, QlikView, Tibco Spotfire, Birst, Pentaho, etc.). To date, Tableau offers the widest variety of options and functionality at a reasonable …
It is perfectly suited for statistical analyses, but I would not recommend JMP for users who do not have a statistical background. As previously stated, the learning curve is exceptionally steep, and I think that it would prove too steep for those without statistical background/knowledge.
Tableau Public is the best platform to build dashboards for your personal profile and share them with recruiters. It's always good to keep ourselves updated on the latest features, create sample dashboards, and save them to a personal profile. Tableau Public is free and doesn't need any subscription; anyone can create an account and start building reports.
Whole-funnel and specific channel performance, from upper- to lower-funnel metrics. The ability to view full channel performance over a period of time, such as weekly, monthly, or quarterly, has truly been monumental in how my team optimizes specific channels and campaigns. Daily performance tracking is a bit overwhelming, with load times and having to refresh specific live views over time. It can be challenging at times, as extensive dashboards take much longer to load.
JMP is designed from the ground up to be a tool for analysts who do not have PhDs in statistics, without in any way "dumbing down" the level of statistical analysis applied. In fact, JMP operationalizes the most advanced statistical methods. JMP's design is centered on the JMP data table and dialog boxes. It is data-focused, not jargon-focused. So, unlike other software where you must choose the correct statistical method (e.g., contingency, ANOVA, linear regression, etc.), with JMP you simply assign the columns in a dialog to roles in the analysis, and it chooses the correct statistical method. It's a small thing, but it reflects the thinking of the developers: analysts know their data and should only have to think about their data. Analyses should flow from there.
JMP makes most things interactive and visual. This makes analyses dynamic and engaging and obviates the complete dependence on understanding p-values and other statistical concepts (though they are all there) that are often found to be foreign or intimidating.
One of the best examples of this is JMP's profiler. Rather than looking at static figures in a spreadsheet, or a series of formulas, JMP presents the formulas interactively. You can monitor the effect of changing factors (Xs) and see how they interact with other factors and the responses. You can also specify desirability (maximize, minimize, match target) and the relative importance of each to find factor settings that are optimal. I have spent many lengthy meetings working with the profiler to review design and process options, with never a dull moment.
The design of experiments (DOE) platform is simply outstanding and, in fact, its principal developers have won several awards. Over the last 15 years, using methods broadly known as "exchange algorithms," JMP has been able to create designs that are far more flexible than conventional designs. This means, for example, that you can create a design with just the interactions that are of interest; interactions that are not of interest can be dropped, along with the runs needed to estimate them.
Classical designs are rigid. For example, a Box-Behnken or other response surface design can have only continuous factors. What if you want to investigate these continuous factors along with categorical factors, such as different materials or different furnace designs, and look at the interactions among all factors? This common scenario cannot be handled with conventional designs but is easily accommodated with JMP's Custom DOE platform.
The whole point of DOE is to be able to look at multiple effects comprehensively but determine each one's influence in near or complete isolation. The custom design platform, because it produces unique designs, provides the means to evaluate just how isolated the effects are. This can be done before collecting data, because this important property of the DOE is a function of the design, not the data. By evaluating these graphical reports of the quality of the design, the analyst can make adjustments, adding or reducing runs, to optimize cost, effort, and expected learning.
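The idea that effect isolation is a property of the design itself, checkable before any data are collected, can be illustrated with a toy example (this is a generic sketch, not JMP's API or the exchange algorithm): in a coded two-level full factorial, the main-effect and two-factor-interaction columns of the model matrix are mutually orthogonal, so each effect can be estimated in complete isolation.

```python
from itertools import product

# Toy 2^3 full factorial in coded units (-1/+1); the "isolation"
# of effects is checked from the design alone, with no response data.
runs = list(product([-1, 1], repeat=3))  # 8 runs, factors A, B, C

def col(i):
    """Main-effect column for factor i."""
    return [r[i] for r in runs]

def interaction(i, j):
    """Two-factor interaction column: elementwise product."""
    return [r[i] * r[j] for r in runs]

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

a, b, ab = col(0), col(1), interaction(0, 1)
# Orthogonality: every pairwise dot product of effect columns is zero,
# so these effects do not alias one another.
print(dot(a, b), dot(a, ab), dot(b, ab))  # 0 0 0
```

A custom (non-orthogonal) design would show nonzero correlations between some columns, which is exactly what the graphical design-diagnostic reports described above summarize.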
Over the last several releases of JMP, which now appear about every 18 months, they have moved beyond dialog boxes to direct, drag-and-drop analyses for building graphs and tables as well as statistical process control charts. Interactivity such as this allows analysts to "be in the moment." As with all aspects of JMP, analysts can think about their subject matter without the cumbersomeness of having to think about statistical methods. It's rather like a CEO thinking about growing the business without having to think about every nuance and intricacy of accounting. The statistical thinking is baked into the design of JMP.
Without data, analysis is not possible. Getting data into a state where it can be analyzed can be a major hassle. JMP can pull data from a variety of sources, including Excel spreadsheets, CSV files, direct data feeds, and databases via ODBC. Once the data is in JMP, it has all the expected data manipulation capabilities to shape it for analysis.
Back in 2000, JMP added a scripting language (JMP Scripting Language, or JSL for short). With JSL you can automate routine analyses so that end users need not write any code, add specific analyses that JMP does not do out of the box, and create entire analytical systems and workflows. We have done all three. For example, one consumer products company we are working with now needs a variant of a popular non-parametric analysis that they have employed for years. This method will be found in one of the menus and appear as if it were part of JMP to begin with. As for large systems, we have written some that are tens of thousands of lines, taking the form of virtual labs and process control systems, among others.
JSL applications can be bundled and distributed as JMP add-ins, which make it really easy for users to extend their JMP installation. All they need to do is double-click on the add-in file and it's installed. Pharmaceutical companies and others who are regulated, or who simply want to control the JMP environment, can lock down JMP's installation and prevent users from adding or changing functionality. Here, add-ins can be distributed to users worldwide from a central location that is authorized and protected.
JMP's technical support is second to none. They take questions by phone and email. I usually send email knowing that I'll get an informed response within 24 hours, and if they cannot resolve a problem immediately, they proactively keep you informed about what is being done to resolve the issue or answer your question.
Data visualization: lots of different options, including bar, scatter, pie, waterfall charts to explore relationships between variables, and to present findings/trends to different teams
Integrates readily with a limited but varied set of data sources: TXT, CSV, TDE, Access
Exports reports for review of different dashboards: client-ready/team-ready, with a clean and tidy presentation in PDF format (or hardcopy)
It's good at doing what it is designed for: accessing visualizations without having to download and open a workbook in Tableau Desktop. The latter would be a very inefficient method for sharing our metrics, so I am glad that we have Tableau Server to serve this function.
Publishing to Tableau Server is quick and easy. Just a few clicks from Tableau Desktop and a few seconds of publishing through an average speed network, and the new visualizations are live!
Seeing details on who has viewed the visualization and when. This is particularly useful to me for trying to drive adoption of some new pages, so I really appreciate the granularity provided in Tableau Server.
In general, JMP is a much better fit for a general "data mining" type application. If you want a specific statistics-based toolbox (meaning you just want to run some predetermined test, like testing for a different proportion), then JMP works, but is not the best. JMP is much more suited to taking a data set, starting from "square one," and exploring it through a range of analytics.
The CPK (process capability) module output is shockingly poor in JMP. This sticks out because, while as a rule everything in JMP is very visual and presentable, the CPK graph is a single-line-on-grey-background drawing. It is not intuitive, and really doesn't tell the story. (This is in contrast with a capability graph in Minitab, which is intuitive and tells a story right off.) This is also the case with the gauge study output, used for multi-vari analysis in a Six Sigma project. It is not intuitive, and you need to do a lot of tweaking to make the graph tell you the story right off. I have given this feedback to JMP, and it is possible that it will be addressed in future versions.
I've never heard of JMP allowing floating licenses in a company. This will ALWAYS be a huge sticking point for small to mid-size companies that don't have teams of people dedicated to analytics all day. If every person who would do problem solving needs his/her own seat, the cost can be prohibitive. (It gets cheaper by the seat as you add licenses, but for a small company that might get no more than 5 users, it is still a hard sell.)
Tableau Public (both Desktop and Server), like their "for a fee" counterparts, offers very easy-to-learn and easy-to-use tools to transform data into pictures and gain insights into your data. Most organizations report a 10x reduction in development time vs. other similar tools, due to the intuitive user interface. That said, with Tableau Public, published workbooks are "disconnected" from the underlying data sources and require periodic updates when the data changes. Users are also limited to 1 GB of storage space per user ID and password.
I would like to see better options for public sharing of visualizations and data from within the "for a fee" products as more and more organizations are moving in the direction of data sharing with partners and their communities.
Tableau Server has had some issues handling some of our larger data sets. Our extract refreshes fail intermittently with no obvious error that we can fix.
Tableau Server was hard to work with before they launched their new REST API, which is itself a little tricky to work with.
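For readers curious what working with the REST API involves, here is a minimal sketch of the XML payload the sign-in call (POST to /api/&lt;version&gt;/auth/signin) expects. The user name, password, and site content URL below are placeholders, and the request is only constructed, not sent to any server.

```python
import xml.etree.ElementTree as ET

# Build the tsRequest body for Tableau Server's REST API sign-in.
# Credentials here are illustrative placeholders.
def signin_payload(username, password, site_content_url=""):
    req = ET.Element("tsRequest")
    creds = ET.SubElement(req, "credentials",
                          name=username, password=password)
    # An empty contentUrl targets the default site.
    ET.SubElement(creds, "site", contentUrl=site_content_url)
    return ET.tostring(req, encoding="unicode")

payload = signin_payload("analyst", "secret")
print(payload)
```

The response to a real sign-in call carries an authentication token that subsequent REST calls pass in a request header; that round trip is omitted here since it requires a live server.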
JMP has been good at releasing updates and adding new features, and their support is good. Analysis is quick, and you don't need scripting/programming experience. It has been used organization-wide and works well in that respect. Open source alternatives raise concerns regarding timely support. Cheap licensing and easy to maintain.
It's free, right? I'll keep using the free version. So the real question to ask is this: will I pay $999 for the Personal version or $1,999 for the Professional? Yikes! That is a big stretch. I'm not sure about that. The product comparison chart is at: http://www.tableausoftware.com/public/comparison
It simply is used all the time by more and more people. Migrating to something else would involve lots of work and lots of training. The renewal fee being fair, it simply isn't worth migrating to a different tool for now.
The GUI makes it easy to generate plots and find statistics without having to write code. JSL scripting has a bit of a steep learning curve but gives you more ability to customize your analysis. Overall, I would recommend JMP as a good product for overall usability.
Tableau Public is a great training tool to understand the basics of Tableau before buying it. A great tool to extend Excel's visualization and to publish data for others. Not useful for anything you need secure. No ability to access databases. Static information only.
Tableau Server takes training and experience in order to unlock the application's full potential. This is best handled by a qualified data scientist or data analytics manager. Tableau user interface layout, nomenclature, and command structure take time and training to become proficient with. Integration and connectivity require proper IT developer support.
Our instance of Tableau Server was hosted on premises (I believe all instances are), so if there were any outages it was normally due to scheduled maintenance on our end. If the Tableau Server ever went down, a quick restart solved most issues.
While there are definitely cases where a user can do things that will make a particular worksheet or dashboard run slowly, overall the performance is extremely fast. The user experience of exploratory analysis particularly shines, there's nothing out there with the polish of Tableau.
Support is great, with ease of contact, rapid response, and willingness to 'stick to the task' until resolution or acknowledgement that the problem will have to be resolved in a future build. Basically, one gets the very real sense that another human being is sensitive to your problems, great or small.
We have consistently had highly satisfactory results every time we've reached out for help. Our contractor, used for Tableau Server maintenance and dashboard development, is very technically skilled. When he hits a roadblock on how to do something with Tableau, the support staff have provided timely and useful guidance. He frequently compares it to Cognos and says that while Cognos has capabilities Tableau doesn't, the bottom-line value for us is a no-brainer.
In our case, they hired a private third-party consultant to train our department. It was extremely boring and felt like it dragged on. Everything I learned was self-taught, so I was not really paying attention. But I do think that you could easily spend a week on the tool and go over every nook and cranny. We only had the consultant in for a day or two.
The Tableau website is full of videos that you can follow at your own pace. As a very small company with a Tableau install, access to these free resources was incredibly useful in allowing me to implement Tableau to its potential in a reasonable and proportionate manner.
Start at the end and work backward. Identify the business case / issue and questions the end users have, then identify the data needed, and where to get it.
Implementation was over the phone with the vendor, and did not go particularly well. Again, I think this was our fault, as our integration and IT oversight were poor, and we made errors. Would they have happened had a vendor been onsite? Not sure, probably not, but we probably wouldn't have paid for that either.
MS Excel with the Analysis ToolPak provides a home-grown solution, but requires a high degree of upkeep and is difficult to hand off. Minitab is the closest competitor, but JMP is better suited to the production environment, roughly equivalent in price, and has superior support.
Google Charts/Drive is sufficient for simpler data sets, but it does not integrate with other web platforms and the visualization does not look as professional. I'm not aware of any other competitors that offer the same package as Microsoft.
Today, if my shop is largely Microsoft-centric, I would be hard pressed to choose a product other than Power BI. Tableau was the visualization leader for years, but Microsoft has caught up with them in many areas, and surpassed them in some. Its ability to source, transform, and model data is superior to Tableau. Tableau still has the lead in some visualizations, but Power BI's rise is evidenced by its ever-increasing position in the leadership section of the Gartner Magic Quadrant.
ROI: Even if the cost can be high, the insights you get out of the tool are definitely much more valuable than the actual cost of the software. In my case, most of the results of our analysis were shown to the client, who was blown away, making the money spent well worth it for us.
Potential negative: If you are not sure your team will use it, there's a chance you will just waste money. Sometimes the IT department tries to deploy a better tool for the entire organization, but people keep using the old tool they are used to (most likely MS Excel).
Tableau does take dedicated FTE to create and analyze the data. It's too complex (and powerful) a product not to have someone dedicated to developing with it.
There is some significant setup for the server product.
Once server setup is complete, it's largely "fire and forget" until an update is necessary. The server update process is cumbersome.