RapidMiner is a data science and data mining platform, owned by Altair since its acquisition in late 2022. RapidMiner offers full automation for non-coding domain experts, an integrated JupyterLab environment for seasoned data scientists, and a visual drag-and-drop designer. RapidMiner's project-based framework helps ensure that others can build on existing work, whether through visual workflows or automated data science.
$7,500
Per User Per Month
Tableau Desktop
Score 8.4 out of 10
N/A
Tableau Desktop is a data visualization product from Tableau. It connects to a variety of data sources, allowing users to combine disparate data without coding. It provides tools for discovering patterns and insights, performing data calculations, generating forecasts and statistical summaries, and building visual stories.
$1,380
per year (purchased via a Creator license)
Pricing

Editions & Modules

RapidMiner
  Professional: $7,500.00 per user per month
  Enterprise: $15,000.00 per user per month
  AI Hub: $54,000.00 per user per month

Tableau Desktop
  Tableau Creator License: $115 per month (billed annually) per user
Pricing Offerings (RapidMiner / Tableau Desktop)
  Free Trial: No / No
  Free/Freemium Version: No / Yes
  Premium Consulting/Integration Services: No / Yes
  Entry-level Setup Fee: No setup fee / No setup fee
Additional Details
  RapidMiner: —
  Tableau Desktop: All pricing plans are billed annually. A Creator license includes Tableau Desktop, Tableau Prep Builder, and Tableau Pulse. Volume discounts are sometimes available.
The other product like RapidMiner Studio that I have used is WEKA. I decided to use RapidMiner because almost all modelling methods and feature selection methods from the Weka machine learning library are available within RapidMiner. Furthermore, RapidMiner Studio is a visual …
SPSS and SAS are too expensive. Their interfaces are excellent, but the price point is quite high, making them inappropriate for higher education. KNIME is my second choice tool in this space, but it doesn't have the same long-established English-speaking user community as …
The best part about RapidMiner is that it focuses mainly on machine learning algorithms, whereas other tools focus mainly on the extract, transform, load (ETL) process. It can serve all the stages of the knowledge discovery in databases (KDD) process, e.g. data cleaning, transformation, modeling and …
RapidMiner is really fantastic for performing fast ETL processes and working with your data as you want, no matter the source. You will really save a lot of time once you learn how to use it. You can create mining analyses with several algorithms, and thanks to add-ons, you can apply a lot of techniques. It will not replace a business intelligence dashboard, but it allows you to create great datamarts for your BI tools. One negative thing is that it's not easy to share your outputs.
The best scenario is definitely to collect data from several sources and create dedicated dashboards for specific recipients. However, I miss the possibility of explaining these reports in more detail. Sometimes, we order a report, and after half a year, we don't remember the meaning of some data (I know it's our fault as an organization, but the tool could force better practices).
I am very impressed at how easily you can work within RapidMiner without much data analytics training. Plus with the help of the crowd, you can see what steps others have taken with their data analytics projects.
Text mining was simple and clean. We used this for our call transcription problem where we didn't have the resources to listen to each call. We needed to qualify each call based on some key phrases.
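The call-qualification approach this reviewer describes can be illustrated with a minimal sketch. The phrase list, threshold, and sample transcripts below are assumptions for illustration only, not details from the review, and this stands in for what RapidMiner's text-mining operators would do visually:

```python
# Minimal keyword-based call qualification, in the spirit of the workflow
# described above. KEY_PHRASES and the threshold are illustrative
# assumptions, not values taken from the review.
KEY_PHRASES = ["cancel my account", "billing error", "not satisfied"]

def qualify_call(transcript: str, phrases=KEY_PHRASES, threshold=1) -> bool:
    """Return True if the transcript contains at least `threshold` key phrases."""
    text = transcript.lower()
    hits = sum(1 for phrase in phrases if phrase in text)
    return hits >= threshold

calls = [
    "Hi, I think there is a billing error on my last invoice.",
    "Just calling to say thanks, great service!",
]
flagged = [call for call in calls if qualify_call(call)]
```

In a real deployment the phrase matching would typically be replaced by tokenization and scoring operators, but the qualify/ignore decision per call works the same way.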
Our direct mail program was large and not very targeted. Using RapidMiner, we were able to isolate a predictive level we felt comfortable with and decided not to send to anyone below that level. We saved quite a bit of money.
An excellent tool for data visualization, it presents information in an appealing visual format—an exceptional platform for storing and analyzing data in any size organization.
Through interactive parameters, it enables real-time interaction with the user and is easy to learn and get support from the community.
I hope RapidMiner would be the first data science platform that allows data scientists to change the behaviour of a machine learning algorithm that already exists in the repository. For example, I want to be able to change the way a genetic algorithm mutates.
Automatic programming: One day, I hope RapidMiner can automatically generate code in any 4th generation programming language based on the developed model.
More tutorials/samples needed: Why doesn't RapidMiner become the next 'UC Irvine Machine Learning Repository'? Provide real examples and real cases for users to study and understand best practices in modelling. RapidMiner already has some datasets for a tutorial. Beyond the existing samples, I hope RapidMiner can provide more sample data and examples.
Our use of Tableau Desktop is still fairly low, and will continue to grow over time. The only real concern is the cost of the licenses; I have mentioned this to Tableau and fully expect the development of more sensible models for our industry. This will remove any impediment to expanding our use.
Tableau Desktop has proven to be a lifesaver in many situations. Once we've completed the initial setup, it's simple to use. It has all of the features we need to quickly and efficiently synthesize our data. Tableau Desktop has advanced capabilities to improve our company's data structure and enable self-service for our employees.
When used as a stand-alone tool, Tableau Desktop has unlimited uptime, which is always nice. When used in conjunction with Tableau Server, this tool has as much uptime as your server admins are willing to give it. All in all, I've never had an issue with Tableau's availability.
Tableau Desktop's performance is solid. You can really dig into a large dataset in the form of a spreadsheet, and it exhibits similarly good performance when accessing a moderately sized Oracle database. I noticed that with Tableau Desktop 9.3, performance with a spreadsheet started to slow around 75K rows by about 60 columns. This was easily remedied by creating an extract and pushing it to Tableau Server, where performance became lightning fast.
Tableau support has been extremely responsive and willing to help with all of our requests. They have assisted with creating advanced analysis and many different types of custom icons, data formatting, formulas, and actions embedded into graphs. Tableau offers a weekly presentation of features and assists with internal company projects.
It is admittedly hard to train a group of people with disparate levels of ability coming in, but the software is so easy to use that this is not a huge problem; anyone who can follow simple instructions can catch up pretty quickly.
I think the training was good overall, but it was perhaps stating the obvious things that a tech-savvy young engineer would be able to pick up on their own. However, the example workbooks were good, and the Tableau web community has helped me with many problems.
Again, training is the key and the company provides a lot of example videos that will help users discover use cases that will greatly assist their creation of original visualizations. As with any new software tool, productivity will decline for a period. In the case of Tableau, the decline period is short and the later gains are well worth it.
We tried different data tools and figured we would give RapidMiner Studio a shot, as one of our employees had experience with it; compared to some of the other tools we used, it was the best fit among our test group. Overall it was a little more fluid and user-friendly.
I have used Power BI as well; its pricing is better, and training and certification costs are not as high. Power BI offers Python integration, so I can use data-cleaning and visualization libraries as well as some machine learning models. I can import my Python scripts and create visualizations on the processed data.
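The Power BI workflow this reviewer mentions can be sketched briefly. In Power BI's Python script editor the input table is exposed as a pandas DataFrame named `dataset`; here it is stubbed out so the example runs stand-alone, and the column names and cleaning steps are assumptions for illustration:

```python
# Sketch of a Python data-cleaning step of the kind the reviewer describes
# running inside Power BI. Power BI exposes the input table as a pandas
# DataFrame named `dataset`; we stub one out so this runs stand-alone.
# The column names and sample values are illustrative assumptions.
import pandas as pd

dataset = pd.DataFrame({
    "region": [" North", "south ", None, "East"],
    "sales":  [100.0, None, 250.0, 80.0],
})

# Typical cleaning: trim and normalize text, drop rows with no region,
# and fill missing numeric values with zero.
cleaned = (
    dataset
    .assign(region=dataset["region"].str.strip().str.title())
    .dropna(subset=["region"])
    .fillna({"sales": 0.0})
)
```

Inside Power BI, `cleaned` would simply be returned as the script output and picked up as a new query step; the same pandas code works unchanged in a notebook.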
Tableau Desktop's scalability is really limited by the scale of your back-end data systems. If you want to pull down an extract and work quickly in-memory, in my application it scaled to a few tens of millions of rows using the in-memory engine. Beyond that, it's really only limited by your back-end data store, if you have or are willing to invest in an optimized SQL store or a purpose-built query engine like Vertica or Netezza.
Thanks to the patterns that RapidMiner has detected, we have been able to follow clues in the right direction, both for the Protein Interaction Network Analysis and for the Epilepsy Research.
Students and participants of the machine learning workshops have learned about this technology and about the tool.
Tableau was acquired years ago, and has provided good value with the content created.
Ongoing maintenance costs for the platform, covering both desktop and server licensing, have made its continuing value questionable when compared to other offerings in the marketplace.
Users have largely been satisfied with the content, but not with the overall performance. This is due to a combination of factors including the performance of the Tableau engines as well as development deficiencies.