Apache Airflow
Apache Airflow is an open source tool for programmatically authoring, scheduling, and monitoring data pipelines using Python and SQL. Created at Airbnb as an open-source project in 2014, Airflow joined the Apache Software Foundation's Incubator Program in 2016 and was announced as a Top-Level Apache Project in 2019. It is used as a data orchestration solution, with over 140 integrations and community support.
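As a rough illustration of what "programmatically author, schedule, and monitor" looks like in practice, here is a minimal sketch of an Airflow DAG, assuming Airflow 2.x; the DAG id, task, and command are hypothetical placeholders, not details from this comparison.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

# A minimal, hypothetical daily pipeline with a single task.
with DAG(
    dag_id="example_daily_pipeline",    # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",         # cron expressions also work here
    catchup=False,                      # skip backfilling missed runs on first deploy
    default_args={"retries": 1, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = BashOperator(
        task_id="extract",
        bash_command="echo 'extracting data'",  # placeholder command
    )
```

Once this file is placed in the DAGs folder, the scheduler picks it up, runs it daily, and every run becomes visible and retryable from the web UI, which is the "monitor" half of the claim.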
Pentaho
Score 5.1 out of 10
Pentaho is a suite of open source business intelligence and analytics products, now offered and supported by Hitachi Data Systems since the June 2015 acquisition.
Pricing

Editions & Modules: no answers on this topic for either product.

                                          Apache Airflow   Pentaho
Free Trial                                No               No
Free/Freemium Version                     Yes              No
Premium Consulting/Integration Services   No               No
Entry-level Setup Fee                     No setup fee     No setup fee
Additional Details                        —                —
Features

Workload Automation
Apache Airflow: 8.8 average across 12 Ratings (5% above category average). Pentaho: not rated.

                               Apache Airflow      Pentaho
Multi-platform scheduling      9.3 (12 Ratings)    not rated
Central monitoring             9.0 (12 Ratings)    not rated
Logging                        8.6 (12 Ratings)    not rated
Alerts and notifications       9.3 (12 Ratings)    not rated
Analysis and visualization     6.9 (12 Ratings)    not rated
Application integration        9.3 (12 Ratings)    not rated
BI Standard Reporting
Pentaho: 9.0 average across 20 Ratings (10% above category average). Apache Airflow: not rated.

                               Apache Airflow   Pentaho
Pixel Perfect reports          not rated        8.6 (18 Ratings)
Customizable dashboards        not rated        9.9 (18 Ratings)
Report Formatting Templates    not rated        8.7 (18 Ratings)
Ad-hoc Reporting
Pentaho: 8.7 average across 19 Ratings (8% above category average). Apache Airflow: not rated.

                                                    Apache Airflow   Pentaho
Drill-down analysis                                 not rated        7.6 (18 Ratings)
Formatting capabilities                             not rated        8.3 (19 Ratings)
Integration with R or other statistical packages    not rated        9.3 (12 Ratings)
Report sharing and collaboration                    not rated        9.7 (17 Ratings)
Report Output and Scheduling
Pentaho: 9.7 average across 20 Ratings (17% above category average). Apache Airflow: not rated.

                               Apache Airflow   Pentaho
Publish to Web                 not rated        9.6 (18 Ratings)
Publish to PDF                 not rated        9.8 (19 Ratings)
Report Versioning               not rated        9.7 (13 Ratings)
Report Delivery Scheduling     not rated        9.9 (17 Ratings)
Delivery to Remote Servers     not rated        9.3 (10 Ratings)
Airflow is well-suited for data engineering pipelines, creating scheduled workflows, and working with various data sources. You can implement almost any kind of DAG for any use case using the built-in operators, or implement your own logic with the Python operator with ease (see the sketch below). Airflow's MLOps capabilities could be enhanced to match MLflow-like features, which would make Airflow the go-to solution for all workloads, from data science to data engineering.
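To make the reviewer's point concrete, here is a minimal sketch of wrapping custom logic in Airflow's PythonOperator, assuming Airflow 2.x; the DAG id, function, and placeholder logic are hypothetical.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical transformation step: any Python callable can become a task.
def clean_rows(**context):
    # Placeholder for real business logic (filtering, enrichment, etc.).
    # Airflow 2.x passes the run context as keyword arguments.
    print(f"Cleaning rows for logical date {context['ds']}")

with DAG(
    dag_id="example_custom_logic",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    clean = PythonOperator(
        task_id="clean_rows",
        python_callable=clean_rows,
    )
```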
Pentaho is very well suited to extracting and mining data from various cloud storage services and transforming that data using the available data models. However, the software struggles to visualize the extracted data in an appealing manner, and end-users can find it difficult to understand the data tables created with those models.
Apache Airflow is one of the best orchestration platforms and a go-to scheduler for teams building a data platform or pipelines.
Apache Airflow supports multiple operators, such as the Databricks, Spark, and Python operators; all of these give us the functionality to implement any business logic (see the sketch after this list).
Apache Airflow is highly scalable, and we can run a large number of DAGs with ease. It provides HA and replication for workers. Maintaining Airflow deployments is easy, even for smaller teams, and we also get lots of metrics for observability.
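As a sketch of the multi-operator point above, the following combines a Spark job and a Python task in one DAG. It assumes Airflow 2.x with the apache-airflow-providers-apache-spark package installed and a spark_default connection configured; the DAG id, application path, and tasks are hypothetical, not details from the review.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="example_mixed_operators",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Submits a (hypothetical) PySpark application via the spark_default connection.
    spark_job = SparkSubmitOperator(
        task_id="spark_transform",
        application="/opt/jobs/transform.py",  # hypothetical path
        conn_id="spark_default",
    )

    # Downstream Python task, e.g. to log or publish a completion signal.
    notify = PythonOperator(
        task_id="notify",
        python_callable=lambda: print("Spark job finished"),
    )

    # Dependency: notify runs only after spark_transform succeeds.
    spark_job >> notify
```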
The UI/dashboard could be made customisable, with job summaries grouped by errors/failures/successes instead of listed per job, so that an error summary can serve as a starting point for review.
Navigation feels a bit dated and could use more modern web navigation UX, e.g. sidebar navigation instead of relying on the browser's back/forward buttons.
Likewise, the core functions could use a UX reorganization: navigation can be improved for core functions as well, not just discovery.
I think the relative obscurity of the tool is a downside; there are not as many developers, consultants, or peers you can tap into.
The lack of a solid user community held us back. Looking at Power BI and Qlik, they have huge user communities that help each other out; we would have liked that here.
A smaller company means a smaller sales force, and the lack of a local presence made it hard to interact with the account rep only online. Other companies have someone local who often stops by with pre-sales developers to pitch in free of charge when they have time.
I will use Pentaho until I find a better tool with an easier-to-use report designer client. For now, Pentaho has been the most powerful reporting tool for our clients because of its ability to connect to Odoo, integrate into Odoo (reports are accessible in Odoo), and its flexibility in report design and parameter integration.
We chose it for its capability to connect with multi-cloud environments. Access control management is something we don't get in all schedulers and orchestrators. Although it provides great flexibility and many options thanks to Python, some knowledge of Python is needed to build workflows.
The Pentaho tools are designed so you can start playing around on your own. Of course, you will need guidance at some point, but the training teams are good at guiding new users, and the online documentation is usually pretty up-to-date.
Some of the tools, such as the Pentaho Data Integration tool and the Pentaho Server, are pretty self-explanatory. The other tools are maybe not as quick and obvious to use, but again, with some documentation and some customer support, you can find your way around them.
They were responsive to our questions when we raised issues. They gave us workarounds when required. They were quite knowledgeable when it came to issue analysis and providing fixes. They were forthright in informing us if a fix for a bug was not due for release soon.
Course Taken: DI1000 Pentaho Data Integration Fundamentals

Setup: A week before the class started, the instructor began sending out class material and lab setup instructions. This is helpful so that you understand how the environment is laid out and can start reviewing the content. Ultimately it saved about a half day of trying to set up with 10 other people online, which was great!

The Course: The 3-day course was laid out like many other technical classes, with 15-30 minutes of instruction and 15-60 minutes of lab exercises. The instructor was very knowledgeable about the functionality from version to version and answered questions as we went along. I was amazed at some of the functionality that was available that I was not using at the time, and quickly implemented changes to many existing transformations and jobs. The novice users seemed to catch on quickly, and more experienced users explained how some of the functionality was used in their home environments. Towards the end there was enough time that we were able to ask very directed questions about our own environments.

Overall, I really found the class to be informative and to deliver enough information to be dangerous. My skills improved, and I was able to design better and more efficient transformations for the HIE.

Course Description: https://training.pentaho.com/instructor-led-training/pentaho-data-integration-fundamentals-di1000
Get the right people in before starting implementation. A start-small-and-build-as-you-go approach is time consuming and involves a lot of rework. Evangelize the capabilities and limitations equally within the organization so that correct delivery expectations are set. Set expectations with the customer that the tool cannot replace proprietary software in terms of stability/usability and that timelines could change given the newness of the product.
Multiple DAGs can be orchestrated simultaneously at varying times, and runs can be reproduced or replicated with relative ease. Overall, Apache Airflow is easier to use than other solutions now on the market. Integration is simple, workflows can be monitored, and scheduling can be set up quickly. We advocate using this tool for automating data pipelines and processes.
Since the Pentaho platform offers broad functionality across data preparation and advanced analytics, it can also be easily integrated to support many data sources and machine-learning frameworks. Based on that, we selected Pentaho for use in our internal department. It also supports many of our BI use cases as required by company management and business users. Last but not least, the Pentaho license is cheaper than its competitors'.
The impact depends on the number of workflows. With a lot of workflows, the implementation is easier to justify, since it needs dedicated resources (VMs and a database) that carry a cost.