Airflow is well-suited for data engineering pipelines, scheduled workflows, and working with various data sources. You can implement almost any kind of DAG for any use case using the different operators, or wrap your own logic in the PythonOperator with ease. Airflow's MLOps capabilities could be enhanced to match MLflow-like features, which would make Airflow the go-to solution for all workloads, from data science to data engineering.
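As a minimal sketch of that PythonOperator point, assuming Airflow 2.x (the DAG id, schedule, and transform function below are illustrative placeholders, not anything from the original workload):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def transform_orders(**context):
    # Placeholder business logic; swap in your own processing here.
    print("transforming orders for", context["ds"])


with DAG(
    dag_id="custom_business_logic",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="transform_orders",
        python_callable=transform_orders,
    )
```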
Previously, our team used Jenkins. However, since it's a shared deployment resource, we don't have admin access. We tried GoCD because it's open source, and we really like it. We set up our deployment pipeline to run whenever code is merged to master, run the unit tests, and roll back if they fail. Once a build is deployed to the staging environment, a single click deploys the appropriate version to production. We use this to deploy to an on-prem server and also to AWS. Some deployment pipelines use custom PowerShell scripts for .NET applications; others use Bash scripts to run docker push and a CloudFormation template to build Elastic Beanstalk environments.
Apache Airflow is one of the best orchestration platforms and a go-to scheduler for teams building a data platform or data pipelines.
Apache Airflow supports multiple operators, such as the Databricks, Spark, and Python operators. All of these provide us with functionality to implement any business logic.
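As a rough sketch of mixing such operators in one DAG, assuming the Apache Spark provider package is installed (the application path, connection id, and notification step are assumptions for illustration):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="mixed_operators",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Submit a Spark job through the spark_default connection (placeholder path).
    run_spark_job = SparkSubmitOperator(
        task_id="run_spark_job",
        application="/opt/jobs/aggregate.py",
        conn_id="spark_default",
    )

    # Follow up with arbitrary Python logic, e.g. a notification.
    notify = PythonOperator(
        task_id="notify",
        python_callable=lambda: print("Spark job finished"),
    )

    run_spark_job >> notify
```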
Apache Airflow is highly scalable, and we can run a large number of DAGs with ease. It provides HA and replication for workers. Maintaining Airflow deployments is very easy, even for smaller teams, and we also get lots of metrics for observability.
Pipeline-as-Code works really well. All our pipelines are defined in YAML files, which are checked into SCM.
The ability to link multiple pipelines together is really cool. Later pipelines can declare a dependency to pick up the build artifacts of earlier ones.
Agent definition is really great. We can define multiple kinds of environments to best suit our diverse build systems.
The UI/dashboard could be made customisable, with jobs summarised in groups of errors/failures/successes instead of listed individually, so that the error summary can be used as a starting point for reviewing them.
Navigation - it's a bit dated and could do with more modern web navigation UX, e.g., sidebar navigation instead of relying on the browser's back/forward buttons.
The core functions could also use a UX reorganisation: navigation to core functions can be improved as well, rather than relying on discovery.
We chose it for its capability to connect with multicloud environments. Access-control management is something we don't get in all schedulers and orchestrators. Although it provides a lot of flexibility and options thanks to Python, some knowledge of Python is needed to build workflows.
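For instance, a task can reach a cloud account through a named Airflow connection; this is only a hedged sketch assuming the Amazon provider package is installed, and the connection id, bucket, and prefix are placeholders:

```python
from datetime import datetime

from airflow import DAG
from airflow.decorators import task
from airflow.providers.amazon.aws.hooks.s3 import S3Hook


@task
def list_new_files():
    # Credentials and access controls live in the "aws_default" connection,
    # managed in the Airflow UI or a secrets backend rather than in code.
    hook = S3Hook(aws_conn_id="aws_default")
    return hook.list_keys(bucket_name="example-bucket", prefix="landing/")


with DAG(
    dag_id="multicloud_example",
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,
) as dag:
    list_new_files()
```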
Multiple DAGs can be orchestrated simultaneously at varying times, and runs can be reproduced or replicated with relative ease. Overall, Apache Airflow is easier to use than other solutions now on the market. It is simple to integrate with, workflows can be monitored, and scheduling can be set up quickly. We advocate using this tool for automating data pipelines and processes.
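On the reproducibility point, a brief sketch of a date-partitioned schedule (the DAG id and partition logic are assumptions): with catchup enabled, each logical date gets its own run, and historical intervals can be replayed later.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_partition(ds, **_):
    # Each run touches only the partition for its own logical date, so
    # re-running it (or backfilling a date range) reproduces the same output.
    print(f"loading partition for {ds}")


with DAG(
    dag_id="reproducible_daily_load",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=True,  # missed intervals are scheduled and can be re-run later
) as dag:
    # Historical intervals can be replayed with, for example:
    #   airflow dags backfill -s 2023-01-01 -e 2023-01-31 reproducible_daily_load
    PythonOperator(task_id="load_partition", python_callable=load_partition)
```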
GoCD is easier to set up, but harder to customize at runtime. There's no way to trigger a pipeline with custom parameters.
Jenkins is more flexible at runtime. You can define multiple user-provided parameters, so when a user needs to trigger a build, there's a form for them to input the parameters.
The impact depends on the number of workflows. If there are a lot of workflows, the implementation is justified and the use case is stronger, since it needs resources that carry a cost: dedicated VMs and a database.
Settings.xml needs to be backed up periodically; it contains all the settings for your pipelines! We accidentally deleted it before and had to restore it and re-create several missing pipelines.
A more straightforward API that allows filtering, e.g., pulling all pipelines triggered after a given date.