Apache Airflow is an open-source tool for programmatically authoring, scheduling, and monitoring data pipelines using Python and SQL. Created at Airbnb as an open-source project in 2014, Airflow was brought into the Apache Software Foundation's Incubator Program in 2016 and announced as a Top-Level Apache Project in 2019. It is used as a data orchestration solution, with over 140 integrations and community support.
VMware Cloud Director
Score 8.7 out of 10
VMware Cloud Director (formerly vCloud Director) is a cloud service-delivery platform used by cloud providers to operate and manage cloud-service businesses. The vendor states that by using VMware Cloud Director, cloud providers deliver secure, efficient, and elastic cloud resources to thousands of enterprises and IT teams across the world.
Pricing

                                          Apache Airflow              VMware Cloud Director
Editions & Modules                        No answers on this topic    No answers on this topic

Pricing Offerings
Free Trial                                No                          No
Free/Freemium Version                     Yes                         No
Premium Consulting/Integration Services   No                          No
Entry-level Setup Fee                     No setup fee                No setup fee
Additional Details                        —                           —
Community Pulse
Features
Workload Automation: comparison of Workload Automation features of Apache Airflow and VMware Cloud Director.
Airflow is well suited for data engineering pipelines, creating scheduled workflows, and working with various data sources. You can implement almost any kind of DAG for any use case using the different operators, or build your own logic with the Python operator with ease. The MLOps side of Airflow could be enhanced to match MLflow-like features, making Airflow the go-to solution for all workloads, from data science to data engineering.
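To make the operator point concrete, here is a minimal sketch of a scheduled DAG that wraps custom logic in a PythonOperator, assuming Airflow 2.x; the DAG id, schedule, and task function are illustrative placeholders, not taken from the review.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract_orders():
        # Placeholder business logic for the illustration.
        print("pulling orders from the source system")


    with DAG(
        dag_id="orders_pipeline",          # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",                 # use schedule_interval on Airflow < 2.4
        catchup=False,
    ) as dag:
        PythonOperator(
            task_id="extract_orders",
            python_callable=extract_orders,
        )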
If you have a lot of customers who all need a separate place to work in, without the possibility of getting in each other's way, and you want to save yourself a lot of work, then I strongly recommend Cloud Director. Of course, only if you have a VMware environment as your working environment. If you just have a small group of customers and you can easily handle the work that comes from it, then adding Cloud Director to your environment is overkill. At a later stage, you can always introduce Cloud Director (so it's never too late if you still want to use it).
Apache Airflow is one of the best orchestration platforms and a go-to scheduler for teams building a data platform or pipelines.
Apache Airflow supports multiple operators, such as the Databricks, Spark, and Python operators. All of these provide the functionality to implement any business logic (a short sketch follows this list).
Apache Airflow is highly scalable, and we can run a large number of DAGs with ease. It provides HA and replication for workers. Maintaining Airflow deployments is easy even for smaller teams, and we also get plenty of metrics for observability.
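As a hedged sketch of how different operators can be combined in one DAG, the example below chains a SparkSubmitOperator with a PythonOperator, assuming Airflow 2.x with the apache-airflow-providers-apache-spark package installed; the DAG id, application path, and connection id are placeholders.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator
    from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator


    def validate_output():
        # Placeholder validation step.
        print("checking row counts")


    with DAG(
        dag_id="mixed_operator_pipeline",      # hypothetical
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        transform = SparkSubmitOperator(
            task_id="spark_transform",
            application="/opt/jobs/transform.py",  # placeholder path
            conn_id="spark_default",
        )
        validate = PythonOperator(
            task_id="validate_output",
            python_callable=validate_output,
        )
        transform >> validate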
The UI/dashboard could be made customisable, with a jobs summary grouped by errors/failures/successes instead of listing each job, so that the error summary can be used as a starting point for reviewing them.
Navigation is a bit dated and could do with more modern web navigation UX, e.g. sidebar navigation instead of relying on the browser's back/forward buttons.
The UX around core functions could also be reorganised; navigation for core functions should be improved as well, rather than relying on discovery.
The add-on/extension required in the web browser is sometimes difficult to get working at first. We've experienced instances where the latest version of the add-on/extension would not work and we had to downgrade to an older version.
The server console lacks features and tools. For example, it would be useful to have a copy-and-paste tool or a file upload tool.
The vCloud Director management site uses Adobe Flash, which makes it impossible to use on a mobile device.
We chose it for its capability to connect with multi-cloud environments. Access control management is something we don't get in all schedulers and orchestrators. Although it provides a lot of flexibility and options thanks to Python, some level of Python knowledge is needed to build workflows.
Multiple DAGs can be orchestrated simultaneously at varying times, and runs can be reproduced or replicated with relative ease. Overall, Apache Airflow is easier to use than other solutions currently on the market. It is simple to integrate with, workflows can be monitored, and scheduling can be set up quickly. We advocate using this tool for automating data pipelines and processes.
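To illustrate orchestrating multiple DAGs at varying times and reproducing past runs, the sketch below defines two DAGs with different schedules and notes Airflow's backfill CLI, assuming Airflow 2.3+ (for EmptyOperator); all names, schedules, and dates are placeholders.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.empty import EmptyOperator

    # Two independent DAGs defined side by side with different schedules
    # (names and cron expressions are placeholders).
    with DAG(dag_id="hourly_ingest", start_date=datetime(2024, 1, 1),
             schedule="@hourly", catchup=False):
        EmptyOperator(task_id="ingest")

    with DAG(dag_id="nightly_report", start_date=datetime(2024, 1, 1),
             schedule="0 2 * * *", catchup=False):
        EmptyOperator(task_id="build_report")

    # A past window can be reproduced with the backfill CLI, for example:
    #   airflow dags backfill -s 2024-01-01 -e 2024-01-07 nightly_report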
vCloud Director is definitely my favorite among cloud managers. The only thing that compares is Cisco UCS Director, but it has slightly different functionality and purpose. I understand why a lot of clients still go with vCloud Director even though VMware intends to sunset it.
The impact depends on the number of workflows. If there are a lot of workflows, there is a better use case and the implementation is justified, since it requires resources that have a cost: dedicated VMs and a database.