TrustRadius

Apache Airflow

Overview

What is Apache Airflow?

Apache Airflow is an open source tool that can be used to programmatically author, schedule, and monitor data pipelines using Python and SQL. Created at Airbnb as an open-source project in 2014, Airflow was brought into the Apache Software Foundation’s Incubator Program in 2016 and announced as a Top-Level Apache Project in 2019. It is used as a data orchestration solution, with over 140 integrations and community support.
Recent Reviews

TrustRadius Insights

Apache Airflow has proven to be a versatile solution for managing and orchestrating various data tasks. Users have utilized this product …


Popular Features

  • Multi-platform scheduling (9): 8.8 (88%)
  • Central monitoring (9): 8.4 (84%)
  • Logging (9): 8.0 (80%)
  • Alerts and notifications (9): 7.9 (79%)


Pricing

Unavailable


Entry-level setup fee?

  • No setup fee

Offerings

  • Free Trial
  • Free/Freemium Version
  • Premium Consulting/Integration Services


Alternatives Pricing

What is Superblocks?

Superblocks is an IDE for internal tooling – a programmable set of building blocks for developers to create mission-critical internal operational software. The Superblocks Application Builder is used to assemble flexible components and connect to databases and APIs. Users can create REST, GraphQL, and gRPC…

What is Lightning Tools Actions?

Lightning Tools Actions is a complement to Lightning Forms, but can also be used independently of Lightning Forms. Rather than opening individual items to update fields such as Approval Status, resetting field values such as holiday entitlement, or sending emails to notify creators of items that a…


Product Demos

  • Getting Started with Apache Airflow (YouTube)
  • Apache Airflow | Build your custom operator for the Twitter API (YouTube)

Features

Workload Automation

Workload automation tools manage event-based scheduling and resource management across a wide variety of applications, databases, and architectures.

Score: 8.2 (category average: 8.2)

Product Details

Apache Airflow Video

What's coming in Airflow 2.0?

Apache Airflow Technical Details

Operating Systems: Unspecified
Mobile Application: No

Frequently Asked Questions

Apache Airflow is an open source tool that can be used to programmatically author, schedule, and monitor data pipelines using Python and SQL. Created at Airbnb as an open-source project in 2014, Airflow was brought into the Apache Software Foundation’s Incubator Program in 2016 and announced as a Top-Level Apache Project in 2019. It is used as a data orchestration solution, with over 140 integrations and community support.
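Pipelines are authored as Python code. As a rough illustration only, a minimal Airflow 2.x DAG file might look like the following sketch (it assumes Airflow is installed; the DAG id, task names, and schedule are hypothetical):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Hypothetical extract step: pull rows from a source system
    print("extracting rows")


def load():
    # Hypothetical load step: write rows to a warehouse
    print("loading rows")


# A DAG ties tasks into a scheduled, monitored pipeline
with DAG(
    dag_id="example_etl",            # hypothetical pipeline name
    start_date=datetime(2022, 1, 1),
    schedule_interval="@daily",      # cron expressions also work
    catchup=False,                   # don't backfill past intervals on first run
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task        # load runs only after extract succeeds
```

Dropped into the configured DAGs folder, a file like this is picked up by the scheduler and the pipeline graph appears in the web UI.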

Reviewers rate Multi-platform scheduling highest, with a score of 8.8.

The most common users of Apache Airflow are from Enterprises (1,001+ employees).


Reviews and Ratings

(33)

Reviews

(1-9 of 9)
Companies can't remove reviews or game the system.
Alok Pabalkar | TrustRadius Reviewer
Score 6 out of 10
Vetted Review
Verified User
Incentivized
Used Airflow for Analytics & Reporting
  • Reports
  • Sending Bulk Email/Notification
  • Processing from different data sources
  • Improve the GUI Control Panel
  • Provide more example and documentation
  • Improvement in debugging
Very well suited for building ETL and automated report generation, as the workflow steps can be well defined and debugging is minimal. It can also be used for sending bulk email/SMS/push notifications.

But when more complex workflows have to be implemented, where the response of a task can create multiple branches and there are multiple feedback loops, the tool can become tedious.
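For the branching case this reviewer describes, Airflow does provide a BranchPythonOperator that picks the downstream path at runtime, though wiring up many branches and feedback loops does get verbose. A minimal sketch, assuming a recent Airflow 2.x (2.3+ for EmptyOperator); the DAG id, task ids, and branching condition are hypothetical:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.empty import EmptyOperator
from airflow.operators.python import BranchPythonOperator


def choose_path(**context):
    # Hypothetical condition: branch on the day of the scheduled run
    if context["logical_date"].weekday() < 5:
        return "weekday_report"      # task_id of the branch to follow
    return "weekend_report"


with DAG(
    dag_id="branching_example",      # hypothetical name
    start_date=datetime(2022, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    branch = BranchPythonOperator(task_id="branch", python_callable=choose_path)
    weekday = EmptyOperator(task_id="weekday_report")
    weekend = EmptyOperator(task_id="weekend_report")

    # Only the task whose id choose_path returns will run;
    # the other branch is skipped for that run.
    branch >> [weekday, weekend]
```

With more than a couple of branches, or loops built from sensors and triggers, these definitions grow quickly, which matches the reviewer's "tedious" caveat.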
Score 9 out of 10
Vetted Review
Verified User
Incentivized
We use Apache Airflow to streamline the data pipelines, create workflows according to the needs of the project and overall monitoring of the functionality itself. In addition, we are using Apache Airflow to solve the problem of retrieving data from Hive before creating the workflow in its entirety. It's also utilized for automation.
  • In charge of the ETL processes.
  • Task scheduling can be handled as code, and as there is no incoming or outgoing data, the need for extra monitoring is avoided.
  • There is no way to assess the processes because the metadata is not kept.
  • Python is currently the only language supported for creating programmed pipelines.
  • They need to implement both event-based and time-based scheduling.
I handle our pipeline scheduling and monitoring. I had minimal problems with Apache Airflow. It's well-suited for data engineers who are responsible for the creation of the data workflows. It is also best suited for the scheduling of the workflow; it allows us to execute Python scripts as well. Finally, Apache Airflow is best suited for the circumstances in which we need a scalable solution.
Score 7 out of 10
Vetted Review
Verified User
Incentivized
We use Apache Airflow as part of our DAG scheduler and health monitoring tooling. It serves as a core component in ensuring our scheduled jobs are run, allows us to inspect job successes and failures, and acts as a troubleshooting tool in the event of job errors or failures. It has been a core tool and we are happy with what it does.
  • Job scheduling - Pretty straightforward in terms of UI.
  • Job monitoring - Dashboard is as straightforward as it gets.
  • Troubleshooting jobs - ability to dive into detailed errors and navigate the job workflow.
  • The UI/dashboard could be made customisable, with jobs summarised in groups of errors/failures/successes instead of listed individually, so that a summary of errors can be used as a starting point for reviewing them.
  • Navigation is a bit dated and could do with more modern web UX, e.g. sidebar navigation instead of browser back/forward.
  • Core functions could also use a UX reorganisation; navigation can be improved for core functions as well, not just discovery.
For quick scanning of job status and deep-diving into job issues, details, and flows, Airflow does a good job. No fuss, no muss. The learning curve is low, as the UI is very straightforward, and navigating it becomes familiar after spending some time with it. Our requirements are pretty simple: job scheduling, workflows, and monitoring. We run more than 100 jobs, which is a lot to review and troubleshoot when jobs don't run, so when managing a large number of jobs, Airflow's dated UI can be a bit of a drawback.
Score 9 out of 10
Vetted Review
Verified User
Incentivized
We use Apache Airflow to perform data integration with AWS S3. With this we are able to connect to a relational database, easily execute data extracts, and compile them all into multiple flat file segments. Airflow brings a lot of standardization as well as modularity. We also use it to send data to partners and score ML models. It allows us to implement complex data pipelines easily.
  • Multiple helpful features
  • Very intuitive flow charts
  • Reruns and backfills are very easy
  • SLA and DAGs are easy to set up
  • Potentially a steep learning curve
  • The browser UI could do with a few enhancements
Using Apache Airflow has been extremely helpful, as it means we can get to our endgame faster. This product has enabled us to translate our ideas into projects at a much faster speed than before we had this software. We manage data ingestion and modeling for multiple products and customers within each product. Each has its own pipeline with its own code.
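This reviewer notes that SLAs, reruns, and backfills are easy to set up. In Airflow 2.x an SLA is just a timedelta passed to a task, and retries are built into every operator. A hedged sketch (the DAG id, task name, and thresholds are hypothetical, not this reviewer's actual configuration):

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def score_models():
    # Hypothetical step: score ML models against fresh data
    print("scoring models")


with DAG(
    dag_id="partner_feed",           # hypothetical name
    start_date=datetime(2022, 1, 1),
    schedule_interval="@hourly",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="score_models",
        python_callable=score_models,
        sla=timedelta(minutes=30),   # flag the run if not done 30 min after the interval starts
        retries=2,                   # automatic reruns on failure
        retry_delay=timedelta(minutes=5),
    )
```

Backfills are similarly a one-liner from the CLI, e.g. `airflow dags backfill -s 2022-01-01 -e 2022-01-07 partner_feed` to re-run a past date range.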
Nick Waters | TrustRadius Reviewer
Score 9 out of 10
Vetted Review
Verified User
Incentivized
Apache Airflow is a great way to orchestrate workflows and build enterprise data pipelines. It is very easy to configure and set up and would be my go-to solution for orchestrating data flows. We use Airflow to integrate our solution via APIs and allow third-party solutions to access our solution and the data held within it.
  • Orchestrate workflows
  • Visualise workflows easily using DAGs
  • Integrate 3rd party data sources
  • Visualisation UI could be improved in my opinion.
  • Enterprise features
  • Performance improvements in bigger deployments.
Well suited for anyone that wants to orchestrate data pipelines and workflows. Good for developing, scheduling, and monitoring data workflows and is capable of managing complex enterprise workloads and pipelines. The visual aspect of understanding how your workflows are inter-connected is especially useful.
Score 9 out of 10
Vetted Review
Verified User
Incentivized
Apache Airflow is used for the scheduling and orchestration of data pipelines or workflows. Orchestration of data pipelines refers to the sequencing, coordination, scheduling, and managing of complex data pipelines from diverse sources. It is also helpful when your data pipelines change slowly (days or weeks – not hours or minutes), are related to a specific time interval, or are pre-scheduled.
  • Scheduling of data pipelines or workflows.
  • Orchestration of data pipelines or workflows.
  • Not intuitive for new users.
  • Setting up Airflow architecture for production is NOT easy.
Ease of use: you only need a little Python knowledge to get started. Open-source community: Airflow is free and has a large community of active users.
Score 7 out of 10
Vetted Review
Verified User
Incentivized
We use Apache Airflow in GCP as part of Cloud Composer to run all our ETL jobs.
  • schedule jobs
  • graphing job flow and dependencies and retries
  • Nice UI for visualization
  • Instead of using a Storage bucket as a source, it would be nice if the DAGs could be pulled from a private Git repo directly
  • Upgrade process could be smoother
If you are using GCP, you can use Apache Airflow very easily by using Cloud Composer which is the managed service for Airflow. If you need to deploy it yourself, installation and setup could be tricky.
Score 9 out of 10
Vetted Review
Verified User
Incentivized
We are using Apache Airflow to streamline data pipelines, create workflows, schedule them as needed, and monitor them. We are solving the problem of fetching data from Hive, for which we created the complete workflow, and we are using it for automation as well.
  • Smart Automation
  • Highly Scalable
  • Handles complex workflows
  • Easy integration with other systems
  • Documentation could be improved
  • GUI can be improved
  • Reliability issues
Apache Airflow is best suited for data engineers creating data workflows, and it is well suited for scheduling workflows; we can also run Python code with it. It fits situations where we need a scalable solution, and monitoring can be done easily.
April 04, 2022

Apache Airflow

PRABHAT MISHRA | TrustRadius Reviewer
Score 8 out of 10
Vetted Review
Verified User
Incentivized
We are using Apache Airflow for managing the ETL pipelines, and we use it programmatically to monitor the data pipelines. I have been helping the data team create pipelines using Apache Airflow.
  • We are using it as a workflow management system
  • Managing the ETL pipelines
  • We can manage task scheduling as code and need not monitor it, as there is no data in and out
  • They should bring in time-based scheduling too, not only event-based
  • They do not store the metadata, due to which we are not able to analyze the workflows
  • They only support Python as of now for scripted pipeline writing
We were using it for managing the workflows for the ETL pipelines as code, so Airflow proved very helpful.