Likelihood to Recommend
Alteryx Designer Cloud is ideal for ETL and data transformations where your data is already in a cloud environment. It allows the continued democratisation of data and analytics throughout a business, particularly one that is already in the cloud or has cloud services in place and is comfortable with procuring and consuming these systems over and above desktop software. The ETL processing for data validation and wrangling uses machine learning to enable highly efficient data manipulation, which saves a lot of time and is a distinct advantage over the desktop software. The expansion and integration of other Alteryx cloud services such as ML, Metrics Store and App Builder will only strengthen this position.
Read full review
It is perfectly suited for statistical analyses, but I would not recommend JMP for users who do not have a statistical background. As previously stated, the learning curve is exceptionally steep, and I think that it would prove to be too steep for those without statistical background/knowledge.
Read full review
Pros
Combining large data sets - these can be compressed into database files to make for faster runs.
Formatting data - formatting that takes 45 minutes in an Excel sheet can run in 7 seconds.
Sending large batch emails - some workflows we have send out 50+ emails to customers in just minutes.
Read full review
JMP is designed from the ground up to be a tool for analysts who do not have PhDs in statistics, without in any way "dumbing down" the level of statistical analysis applied. In fact, JMP operationalizes the most advanced statistical methods. JMP's design is centred on the JMP data table and dialog boxes. It is data-focused, not jargon-focused. So, unlike other software where you must choose the correct statistical method (e.g., contingency, ANOVA, linear regression, etc.), with JMP you simply assign the columns in a dialog into roles in the analysis and it chooses the correct statistical method. It's a small thing, but it reflects the thinking of the developers: analysts know their data and should only have to think about their data. Analyses should flow from there.
JMP makes most things interactive and visual. This makes analyses dynamic and engaging and obviates the complete dependence on understanding p-values and other statistical concepts (though they are all there) that are often found to be foreign or intimidating. One of the best examples of this is JMP's profiler. Rather than looking at static figures in a spreadsheet, or a series of formulas, JMP profiles the formulas interactively. You can monitor the effect of changing factors (Xs) and see how they interact with other factors and the responses. You can also specify desirabilities (maximize, minimize, match target) and their relative importances to find factor settings that are optimal. I have spent many lengthy meetings working with the profiler to review design and process options, with never a dull moment.
The design of experiments (DOE) platform is simply outstanding and, in fact, its principal developers have won several awards. Over the last 15 years, using methods broadly known as "exchange algorithms," JMP can create designs that are far more flexible than conventional designs. This means, for example, that you can create a design with just the interactions that are of interest; you can selectively drop those interactions that are not of interest and skip collecting their associated combinations. Classical designs are rigid. For example, a Box-Behnken or other response surface design can have only continuous factors. What if you want to investigate these continuous factors along with categorical factors such as materials or different furnace designs, and look at the interactions among all factors? This common scenario cannot be handled with conventional designs but is easily accommodated with JMP's Custom DOE platform. The whole point of DOE is to be able to look at multiple effects comprehensively but determine each one's influence in near or complete isolation. The custom design platform, because it produces unique designs, provides the means to evaluate just how isolated the effects are. This can be done before collecting data, because this important property of the DOE is a function of the design, not the data. By evaluating these graphical reports of the quality of the design, the analyst can make adjustments, adding or reducing runs, to optimize cost, effort and expected learnings.
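To make the "exchange algorithm" idea in the passage above a little more concrete, here is a minimal Python sketch. It is not JMP's Custom DOE implementation; the two-factor model, candidate grid, and run count are invented for illustration. It only shows the underlying principle: score candidate run sets by the D-optimality criterion det(X'X) and greedily exchange runs to improve it.

```python
import numpy as np
from itertools import product

# Toy model: intercept, two main effects, and the one interaction of interest.
def model_matrix(runs):
    x1, x2 = runs[:, 0], runs[:, 1]
    return np.column_stack([np.ones(len(runs)), x1, x2, x1 * x2])

# D-optimality criterion: larger det(X'X) means better-isolated effect estimates.
def d_criterion(runs):
    X = model_matrix(runs)
    return np.linalg.det(X.T @ X)

# Candidate points: a 3-level grid over two coded continuous factors.
candidates = np.array(list(product([-1.0, 0.0, 1.0], repeat=2)))

# Greedy exchange: start from a random 6-run design, then repeatedly swap
# single runs for candidate points whenever the swap improves det(X'X).
rng = np.random.default_rng(0)
design = candidates[rng.choice(len(candidates), size=6, replace=False)]
for _ in range(20):
    improved = False
    for i in range(len(design)):
        for c in candidates:
            trial = design.copy()
            trial[i] = c
            if d_criterion(trial) > d_criterion(design) + 1e-9:
                design, improved = trial, True
    if not improved:
        break

print(design)
print("det(X'X):", round(d_criterion(design), 2))
```

Note that the criterion is evaluated before any data are collected, which is the point the reviewer makes about judging a design's quality up front.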
Over the last several releases of JMP, which now appear about every 18 months, they have moved past dialog boxes to direct, drag-and-drop analyses for building graphs and tables as well as statistical process control charts. Interactivity such as this allows analysts to "be in the moment." As with all aspects of JMP, they can think about their subject matter without the cumbersomeness of having to think about statistical methods. It's rather like a CEO thinking about growing the business without having to think about every nuance and intricacy of accounting. The statistical thinking is burned into the design of JMP.
Without data, analysis is not possible, and getting data into a state where it can be analyzed can be a major hassle. JMP can pull data from a variety of sources including Excel spreadsheets, CSV, direct data feeds and databases via ODBC. Once the data is in JMP, it has all the expected data manipulation capabilities to shape it for analysis.
Back in 2000, JMP added a scripting language (JMP Scripting Language, or JSL for short). With JSL you can automate routine analyses so that end users need not code anything themselves, you can add specific analyses that JMP does not do out of the box, and you can create entire analytical systems and workflows. We have done all three. For example, one consumer products company we are working with now needs a variant of a popular non-parametric analysis that they have employed for years. This method will be found in one of the menus and appear as if it were part of JMP to begin with. As for large systems, we have written some that are tens of thousands of lines and take the form of virtual labs and process control systems, among others. JSL applications can be bundled and distributed as JMP Add-ins, which make it really easy for users to extend their JMP installation: all they need to do is double-click on the add-in file and it's installed. Pharmaceutical companies and others who are regulated, or who simply want to control the JMP environment, can lock down JMP's installation and prevent users from adding or changing functionality. Here, add-ins can be distributed to users world-wide from a central location that is authorized and protected.
JMP's technical support is second to none. They take questions by phone and email. I usually send email knowing that I'll get an informed response within 24 hours, and if they cannot resolve a problem they proactively keep you informed about what is being done to resolve the issue or answer your question.
Read full review
Cons
New tools to match the ones existing in Designer on-premise.
Environments management.
Environment variables handling.
Read full review
In general, JMP is a much better fit for a general "data mining" type of application. If you want a specific statistics-based toolbox (meaning you just want to run some predetermined test, like testing for a difference in proportions), then JMP works, but it is not the best. JMP is much more suited to taking a data set, starting from "square 1," and exploring it through a range of analytics.
The CPK (process capability) module output is shockingly poor in JMP. This sticks out because, while as a rule everything in JMP is very visual and presentable, the CPK graph is a single-line-on-grey-background drawing. It is not intuitive, and really doesn't tell the story. (This is in contrast with a capability graph in Minitab, which is intuitive and tells a story right off.) This is also the case with the "gauge study" output, used for multi-vari analysis in a Six Sigma project. It is not intuitive, and you need to do a lot of tweaking to make the graph tell you the story right off. I have given this feedback to JMP, and it is possible that it will be addressed in future versions.
I've never heard of JMP allowing floating licenses in a company. This will ALWAYS be a huge sticking point for small to mid-size companies that don't have teams of people dedicated to analytics all day. If every person who does problem solving needs his or her own seat, the cost can be prohibitive. (It gets cheaper by the seat as you add licenses, but for a small company that might get no more than 5 users, it is still a hard sell.)
Read full review
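For context on the capability complaint in the review above: Cpk is the index that module reports. The snippet below is a minimal sketch of the textbook Cpk formula, assuming roughly normal measurements; the data and spec limits are invented, and this says nothing about how JMP's capability platform draws its graph.

```python
import numpy as np

def cpk(samples, lsl, usl):
    # Textbook Cpk: distance from the mean to the nearer spec limit,
    # expressed in units of three sample standard deviations.
    mu, sigma = np.mean(samples), np.std(samples, ddof=1)
    return min((usl - mu) / (3 * sigma), (mu - lsl) / (3 * sigma))

# Invented example: 200 simulated measurements against made-up spec limits.
measurements = np.random.default_rng(1).normal(loc=10.02, scale=0.05, size=200)
print(round(cpk(measurements, lsl=9.85, usl=10.15), 2))  # smaller one-sided ratio
```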
Likelihood to Renew
JMP has been good at releasing updates and adding new features, and their support is good. Analytics is quick and you don't need scripting/programming experience. It has been used organization-wide, and works well in that respect. Open source means that there are concerns regarding timely support. Cheap licensing and easy to maintain.
Read full review
Usability
I feel like the usability works for every level of computer user. There are very simple tools that can accomplish a huge amount of time savings, and there are very complex tools that can give you even more. We mainly use just the simple tools in our company and have gained an incredible amount of time savings.
Read full review
The overall usability of JMP is extremely good. What I really love about it is its ability to be usable by novices who have no coding experience, which is not the case with most other similar programs. It can output a fast and easy analysis without too much prior coding or statistical knowledge.
Read full review
Support Rating
Support is great, with ease of contact, rapid response, and a willingness to "stick to the task" until resolution or an acknowledgement that the problem will have to be resolved in a future build. Basically, one gets the very real sense that another human being is sensitive to your problems - great or small.
Read full review
Online Training
I have not used your online training. I use JMP manuals and SAS direct help.
Read full review
Alternatives Considered
Our IT group presented Trifacta to me. They picked it out from among the crowd for me. This was way beyond ETL tools, SQL, or any programming language. It's just much faster, and because it sits on Hadoop you bypass some of the slowdowns typically faced with an RDBMS.
Read full review
It is great because it has UI menus, but it costs money whereas the other programs are free. That makes it ideal for beginners, but I think RStudio and Python are going to make someone a lot more marketable for future opportunities, since most companies won't pay for the software when there is a great free option.
Read full review
Return on Investment
It makes it easier to hire staff for data cleaning.
Read full review
ROI: Even if the cost can be high, the insights you get out of the tool will definitely be much more valuable than the actual cost of the software. In my case, most of the results of our analysis were shown to the client, who was blown away, making the money spent well worth it for us.
Potential negative: If you are not sure your team will use it, there's a chance you will just waste money. Sometimes the IT department (usually it's them) tries to deploy a better tool for the entire organization, but people keep using the old tool they are used to (most likely MS Excel).
Read full review
ScreenShots
JMP Statistical Discovery Software from SAS screenshots.