Apache Airflow is an open source tool that can be used to programmatically author, schedule and monitor data pipelines using Python and SQL.
Apache Airflow: Score N/A; Pricing N/A

Docker: Score 8.7 out of 10
Docker Enterprise was sold to Mirantis in 2019; that product is now sold as Mirantis Kubernetes Engine. Docker now offers a two-product suite that includes Docker Desktop, which it presents as a fast way to containerize applications on a desktop, and Docker Hub, a service for finding and sharing container images with a team and the Docker community, a repository of container images with an array of…
Pricing: $5 per month
Pricing
Editions & Modules
Apache Airflow: No answers on this topic
Docker:
Free - $0 (unlimited public repositories)
Pro - $5.00 per month per user
Team - $7.00 per month per user
Business - $21 per month per user
Pricing Offerings
Free Trial: Apache Airflow - No; Docker - No
Free/Freemium Version: Apache Airflow - Yes; Docker - Yes
Premium Consulting/Integration Services: Apache Airflow - No; Docker - No
Entry-level Setup Fee: Apache Airflow - No setup fee; Docker - No setup fee
Additional Details: none listed for either product
Community Pulse
Apache Airflow
Docker
Considered Both Products
Apache Airflow
Verified User (Consultant), chose Apache Airflow
Step Functions are only available on AWS, but Apache Airflow provides cross-cloud access. Apache Airflow also provides the flexibility to pause, start, and re-trigger DAGs. It provides executors where we can run in-house calculations if needed, which requires no integration with …
Airflow is well-suited for data engineering pipelines, creating scheduled workflows, and working with various data sources. You can implement almost any kind of DAG for any use case using the different operators, or build your own logic with ease using the Python operator, as the sketch below illustrates. The MLOps side of Airflow can be enhanced to match MLflow-like features, making Airflow the go-to solution for all workloads, from data science to data engineering.
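As a rough sketch of what that looks like (the dag_id, schedule, and task callables here are hypothetical placeholders, assuming Airflow 2.x, not taken from the reviewer's setup):

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_orders():
    # Placeholder for real extraction logic (e.g., pulling from an API or a database).
    print("extracting orders")

def load_orders():
    # Placeholder for real load logic (e.g., writing to a warehouse).
    print("loading orders")

with DAG(
    dag_id="example_orders_pipeline",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_orders)
    load = PythonOperator(task_id="load", python_callable=load_orders)
    extract >> load                     # run extract before load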
You are going to be able to find the most resources and examples using Docker whenever you are working with a container orchestration software like Kubernetes. There will always some entropy when you run in a container, a containerized application will never be as purely performant as an app running directly on the OS. However, in most scenarios this loss will be negligible to the time saved in deployment, monitoring, etc.
Apache Airflow is one of the best Orchestration platforms and a go-to scheduler for teams building a data platform or pipelines.
Apache Airflow supports multiple operators, such as the Databricks, Spark, and Python operators. All of these provide us with functionality to implement any business logic.
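As a loose illustration of a provider operator (a sketch assuming the Apache Spark provider package is installed; the dag_id, connection id, and application path are hypothetical):

from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="spark_example",             # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    run_spark_job = SparkSubmitOperator(
        task_id="run_spark_job",
        application="/opt/jobs/aggregate_sales.py",  # hypothetical PySpark script
        conn_id="spark_default",                     # Airflow connection to the Spark cluster
    )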
Apache Airflow is highly scalable, and we can run a large number of DAGs with ease. It provides HA and replication for workers. Maintaining Airflow deployments is very easy, even for smaller teams, and we also get lots of metrics for observability.
The UI/dashboard could be made more customisable, with jobs summarised in groups of errors/failures/successes instead of listed individually, so that the error summary can be used as a starting point for reviewing them.
Navigation - it's a bit dated and could do with more modern web navigation UX, e.g. sidebar navigation instead of relying on the browser's back/forward buttons.
Again, a reorganisation of core functions in terms of UX; navigation can be improved for core functions as well, not just for discovery.
For its capability to connect with multi-cloud environments. Access control management is something we don't get in all schedulers and orchestrators. But although it provides so much flexibility and so many options due to Python, some level of Python knowledge is needed to be able to build workflows.
I have been using Docker for more than 3 years, and it really simplifies modern application development and deployment. I like Docker's ability to improve efficiency, portability, and scalability for developers and operations teams. Another reason for giving this rating is that Docker integrates very well with CI/CD pipelines.
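To give a flavour of that build-and-run loop (a minimal sketch assuming the docker Python SDK is installed and a local Docker daemon is running; the image tag and port are hypothetical):

import docker

# Connect to the local Docker daemon.
client = docker.from_env()

# Build an image from a Dockerfile in the current directory (hypothetical tag).
image, build_logs = client.images.build(path=".", tag="myapp:latest")

# Run the freshly built image in the background, publishing a hypothetical port.
container = client.containers.run("myapp:latest", detach=True, ports={"8000/tcp": 8000})
print(container.short_id)

The same build and run steps are typically what gets scripted inside a CI/CD pipeline.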
Multiple DAGs can be orchestrated simultaneously at varying times, and runs can be reproduced or replicated with relative ease. Overall, Apache Airflow is easier to use than other solutions currently on the market. Integration is simple, and workflows can be monitored and scheduled quickly. We advocate using this tool for automating data pipelines and processes.
The reason why we are still using Docker right now is that it is the best among its peers and suits our needs best. However, the trend we foresee might indicate that AWS Lambda could potentially fit our need to code without managing environments in the near future.
It is the only tool in our toolset that has not had any issues so far. That is really a mark of reliability and a testimony to how well the product is made; a tool that does its job well is a tool well worth having. It is the base tool that I would say any organisation must have if they do scalable deployments.
Impact depends on the number of workflows. If there are a lot of workflows, there is a stronger use case and the implementation is easier to justify, since it needs resources (dedicated VMs, a database) that have a cost.