TrustRadius
Apache Airflow

Overview

What is Apache Airflow?

Apache Airflow is an open source tool that can be used to programmatically author, schedule, and monitor data pipelines using Python and SQL. Created at Airbnb as an open-source project in 2014, Airflow was brought into the Apache Software Foundation's Incubator Program in 2016 and announced as a Top-Level Apache Project in 2019.



Popular Features

  • Multi-platform scheduling (9): 8.8
  • Central monitoring (9): 8.4
  • Logging (9): 8.1
  • Alerts and notifications (9): 7.9


Pricing

Unavailable


Entry-level setup fee?

  • No setup fee

Offerings

  • Free Trial
  • Free/Freemium Version
  • Premium Consulting/Integration Services


Alternatives Pricing

What is Control-M?

Control-M from BMC is a platform for integrating, automating, and orchestrating application and data workflows in production across complex hybrid technology ecosystems. It provides deep operational capabilities, delivering speed, scale, security, and governance.

What is Superblocks?

Superblocks is an IDE for internal tooling – a programmable set of building blocks for developers to create mission-critical internal operational software. The Superblocks Application Builder is used to assemble flexible components and connect to databases and APIs. Users can create REST, GraphQL, and gRPC…


Product Demos

  • Getting Started with Apache Airflow (YouTube)
  • Apache Airflow | Build your custom operator for Twitter API (YouTube)

Features

Workload Automation

Workload automation tools manage event-based scheduling and resource management across a wide variety of applications, databases, and architectures.

8.2 (category average: 8.2)

Product Details

Apache Airflow Video: What's coming in Airflow 2.0?

Apache Airflow Technical Details

Operating Systems: Unspecified
Mobile Application: No

Frequently Asked Questions

Apache Airflow is an open source tool that can be used to programmatically author, schedule, and monitor data pipelines using Python and SQL. Created at Airbnb as an open-source project in 2014, Airflow was brought into the Apache Software Foundation's Incubator Program in 2016 and announced as a Top-Level Apache Project in 2019. It is used as a data orchestration solution, with over 140 integrations and community support.

Reviewers rate Multi-platform scheduling highest, with a score of 8.8.

The most common users of Apache Airflow are from Enterprises (1,001+ employees).


Reviews and Ratings (35)

Community Insights

TrustRadius Insights are summaries of user sentiment data from TrustRadius reviews and, when necessary, 3rd-party data sources.

Apache Airflow has proven to be a versatile solution for managing and orchestrating various data tasks. Users have utilized this product as a core component for scheduling and monitoring scheduled jobs, inspecting job successes and failures, and troubleshooting errors or failures. It has also been extensively employed in GCP as part of Cloud Composer for running ETL jobs, streamlining data pipelines, and creating workflows for analytics and reporting.

Reviewers have found Apache Airflow easy to configure and set up, making it ideal for orchestrating data flows and building enterprise data pipelines. Its ability to integrate with third-party solutions via APIs allows for seamless data access and integration. Users have also appreciated the product's capability to manage ETL pipelines and programmatically monitor data pipelines.

Another valuable use case of Apache Airflow is its role in creating workflows, orchestrating data pipelines, and automating tasks. Its flexibility has been particularly beneficial when dealing with complex data pipelines from diverse sources. Furthermore, the product has been effective at performing data integration with AWS S3, connecting to relational databases, executing data extracts, and compiling them into multiple flat-file segments.

Apache Airflow brings standardization and modularity to data pipelines, enabling the implementation of complex pipelines and facilitating the sharing of data with partners as well as scoring machine learning models. Overall, users have found this product to be a valuable tool for managing data tasks efficiently and effectively.

Based on user reviews, here are the most common recommendations for Apache Airflow:

  1. Read the documentation and take an introduction course to fully understand Airflow's behavior and close any knowledge gaps.

  2. Consider Airflow as a first choice for ETL tasks that require programming. However, keep in mind that the coding aspect may not be suitable for all ETL engineers.

  3. Replace cron jobs with Airflow for better results, utilizing its scheduling and dependency management features.

Overall, these recommendations emphasize the importance of familiarizing oneself with the documentation, leveraging Airflow's capabilities for programming-centric ETL tasks, and using it to replace traditional cron jobs.
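The third recommendation, replacing cron with Airflow, hinges on dependency management: cron fires each job at a fixed clock time, while an orchestrator runs a task only after its upstream tasks finish. A standard-library-only sketch of that ordering idea (task names are hypothetical, and this is not the Airflow API, just an illustration of the concept):

```python
from graphlib import TopologicalSorter  # Python 3.9+ standard library

# Hypothetical nightly pipeline. Under cron, each step would be a separate
# crontab entry with a guessed start time; here each task declares what it
# depends on, and the sorter yields a valid execution order.
deps = {
    "extract": set(),           # no upstream dependencies
    "transform": {"extract"},   # must wait for extract
    "load": {"transform"},
    "report": {"load"},
}

order = list(TopologicalSorter(deps).static_order())
print(order)  # ['extract', 'transform', 'load', 'report']
```

In Airflow the same chain would be declared with `extract >> transform >> load >> report`, and the scheduler, rather than crontab timing, decides when each task is runnable.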

Reviews (1–9 of 9)

Companies can't remove reviews or game the system.
Alok Pabalkar | TrustRadius Reviewer
Score 7 out of 10
Vetted Review
Verified User
Incentivized
Very well suited for building ETL and automated report generation, as the workflow steps can be well defined and debugging is minimal. It can also be used for sending bulk email/SMS/push notifications.

But when more complex workflows have to be implemented where the response of a task can create multiple branches and there are multiple feedback loops, the tool can become tedious.
Score 9 out of 10
Vetted Review
Verified User
Incentivized
I handle our pipeline scheduling and monitoring. I had minimal problems with Apache Airflow. It's well-suited for data engineers who are responsible for the creation of the data workflows. It is also best suited for the scheduling of the workflow; it allows us to execute Python scripts as well. Finally, Apache Airflow is best suited for the circumstances in which we need a scalable solution.
Score 7 out of 10
Vetted Review
Verified User
Incentivized
For quick scanning of job status and deep-diving into job issues, details, and flows, Airflow does a good job. No fuss, no muss. The learning curve is low, as the UI is very straightforward, and navigating it becomes familiar after spending some time using it. Our requirements are pretty simple: job scheduling, workflows, and monitoring. We run more than 100 jobs, which is still a lot to review and troubleshoot when jobs don't run. So when managing large numbers of jobs, Airflow's dated UI can be a bit of a drawback.
Score 9 out of 10
Vetted Review
Verified User
Incentivized
Using Apache Airflow has been extremely helpful, as it means we can get to our endgame faster. This product has enabled us to translate our ideas into projects at a much faster speed than before we had this software. We manage data ingestion and modeling for multiple products and customers within each product. Each has its own pipeline with its own code.
Nick Waters | TrustRadius Reviewer
Score 9 out of 10
Vetted Review
Verified User
Incentivized
Well suited for anyone that wants to orchestrate data pipelines and workflows. Good for developing, scheduling, and monitoring data workflows and is capable of managing complex enterprise workloads and pipelines. The visual aspect of understanding how your workflows are inter-connected is especially useful.
Score 9 out of 10
Vetted Review
Verified User
Incentivized
Ease of use: you only need a little Python knowledge to get started. Open-source community: Airflow is free and has a large community of active users. Apache Airflow is used for the scheduling and orchestration of data pipelines or workflows. Orchestration of data pipelines refers to the sequencing, coordination, scheduling, and managing of complex data pipelines from diverse sources.
Score 9 out of 10
Vetted Review
Verified User
Incentivized
Apache Airflow is best suited for data engineers creating data workflows. It is well suited for scheduling workflows, and we can also run Python code using Apache Airflow. It fits situations where we need a scalable solution, and monitoring can be done easily.