Apache Airflow
Apache Airflow is an open-source tool used to programmatically author, schedule, and monitor data pipelines using Python and SQL. Created at Airbnb as an open-source project in 2014, Airflow was brought into the Apache Software Foundation's Incubator Program in 2016 and announced as a Top-Level Apache Project in 2019. It is used as a data orchestration solution, with over 140 integrations and community support.
ignio AIOps
Score 8.1 out of 10
ignio AIOps, from Digitate in Santa Clara, is a solution designed to improve business agility by creating a unified view of the IT estate, connecting business functions to applications and infrastructure. This blueprint is combined with a continuously learned behavior profile of the systems and applications. ignio aims to improve the transparency of complex enterprise IT landscapes.
Pricing

Editions & Modules
    Apache Airflow:  No answers on this topic
    ignio AIOps:     No answers on this topic

Pricing Offerings
                                               Apache Airflow    ignio AIOps
    Free Trial                                 No                No
    Free/Freemium Version                      Yes               No
    Premium Consulting/Integration Services    No                No
    Entry-level Setup Fee                      No setup fee      No setup fee
    Additional Details                         —                 —
Features
Workload Automation
Comparison of Workload Automation features of Apache Airflow and ignio AIOps
Airflow is well-suited for data engineering pipelines, creating scheduled workflows, and working with various data sources. You can implement almost any kind of DAG for any use case using the built-in operators, or implement your own logic with the PythonOperator with ease. Airflow's MLOps capabilities could be enhanced to match MLflow-like features, which would make Airflow the go-to solution for all workloads, from data science to data engineering.
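For illustration, a minimal sketch of wrapping custom logic in a PythonOperator might look like this (the dag_id, schedule, and extract() callable are placeholders, not taken from the review):

    # Minimal sketch of a DAG running custom logic via PythonOperator (Airflow 2.x).
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        # Hypothetical business logic; replace with your own callable.
        print("pulling rows from the source system")

    with DAG(
        dag_id="example_custom_logic",      # placeholder name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",                  # "schedule" needs Airflow 2.4+; earlier versions use schedule_interval
        catchup=False,
    ):
        PythonOperator(task_id="extract", python_callable=extract)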
It's good for issue resolution, user access request automation, standard report generation, health checks, and executing self-healing as configured in the attributes. It is currently not good at real-time monitoring to trigger an action; health checks have to run on a scheduled basis.
Apache Airflow is one of the best orchestration platforms and a go-to scheduler for teams building a data platform or pipelines.
Apache Airflow supports multiple operators, such as the Databricks, Spark, and Python operators. All of these provide us with functionality to implement any business logic.
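As a sketch of what chaining such provider operators can look like, assuming the apache-spark and databricks provider packages are installed (all connection IDs, paths, and job parameters below are placeholders):

    # Sketch of a DAG chaining provider operators (Airflow 2.x).
    # Requires apache-airflow-providers-apache-spark and apache-airflow-providers-databricks.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator
    from airflow.providers.databricks.operators.databricks import DatabricksSubmitRunOperator

    with DAG(
        dag_id="example_provider_operators",  # placeholder name
        start_date=datetime(2024, 1, 1),
        schedule=None,                        # trigger manually
    ):
        spark_job = SparkSubmitOperator(
            task_id="spark_transform",
            application="/opt/jobs/transform.py",  # placeholder path
            conn_id="spark_default",
        )
        databricks_job = DatabricksSubmitRunOperator(
            task_id="databricks_notebook",
            databricks_conn_id="databricks_default",
            json={                                   # placeholder job spec
                "existing_cluster_id": "1234-567890-abcde123",
                "notebook_task": {"notebook_path": "/Shared/etl"},
            },
        )
        spark_job >> databricks_job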
Apache Airflow is highly scalable, and we can run a large number of DAGs with ease. It provides HA and replication for workers. Maintaining Airflow deployments is very easy, even for smaller teams, and we also get lots of metrics for observability.
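For context, this kind of scaled-out deployment is typically configured in airflow.cfg; a hedged sketch, where the values are illustrative rather than recommendations:

    # Illustrative airflow.cfg excerpt for a horizontally scaled deployment.
    [core]
    executor = CeleryExecutor
    parallelism = 128            # max tasks running at once across the whole instance

    [celery]
    worker_concurrency = 16      # tasks per Celery worker; add workers for throughput and HA

Airflow 2.x also supports running multiple active schedulers for high availability.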
The UI/dashboard could be made customisable, with a jobs summary grouped by errors/failures/successes instead of listing each job, so that the error summary can serve as a starting point for review.
Navigation - It's a bit dated and could do with more modern web-navigation UX, e.g. sidebar navigation instead of relying on the browser's back/forward buttons.
The core functions would also benefit from a UX reorganization; navigation could be improved for core functions as well, not just discovery.
There is a lot more the desktop tool could do. For example, we needed to apply an upgrade to get the tool to talk to our infrastructure while employees were working from home. The tool was initially installed with the assumption that the desktops would be in UserLand; instead, after COVID-19, the desktops/laptops have been used for over a year on people's home networks. As of right now, we have to sync when the devices are connected to the VPN. Moving forward with the upgrade, we will be getting this data over TLS when they are connected to untrusted networks.
The concept of ignio AIOps requires OCM (organizational change management) efforts within most operational teams. This isn't necessarily the fault of the tool itself, but when implementing ignio, or any AIOps tool, the team will get a lot of pushback as an outside team centralizes the operational improvements. The tool should have a centralized intake process that allows the collection, ranking, and management of automation opportunities. ignio AIOps should then simulate the proposed efficiencies of implementing items from the backlog. Right now a lot of local teams are having a hard time getting on the same page as the enterprise teams, and a common methodology for prioritizing (even if overly simplistic) would go a long way toward enterprise planning.
These tools are very new, and things get added to them all the time. There should be a way for the product's stakeholders and process owners to understand the additional value ignio AIOps is delivering over time.
Airflow stands out for its capability to connect with multicloud environments, and access control management is something we don't get in all schedulers and orchestrators. Although it provides so much flexibility and so many options thanks to Python, some level of Python knowledge is needed to be able to build workflows.
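For reference, Airflow's built-in role-based access control can be managed from its CLI; a minimal sketch for Airflow 2.x, where the user details below are placeholders:

    # List the built-in roles (Admin, User, Op, Viewer, Public), then create a user.
    airflow roles list
    airflow users create \
        --username jdoe --firstname Jane --lastname Doe \
        --email jdoe@example.com --role Viewer --password changeme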
ignio AIOps version upgrades were a heavy lift. Having to learn a new language, versus an industry-standard language, took time. More consideration needs to be given to overall internal long-term support.
We built a healthy relationship with the vendor support team throughout the implementation phase; all incidents raised were resolved within the SLA without fail.
I am happy with the way the team has implemented and shared the product with our organization. However, I would like to see it extended to the other lines of business too.
Multiple DAGs can be orchestrated simultaneously at varying times, and runs can be reproduced or replicated with relative ease. Overall, Apache Airflow is easier to use than other solutions now on the market. Integration is simple, workflows can be monitored, and scheduling can be set up quickly. We advocate using this tool for automating data pipelines and processes.
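As an illustration of reproducing runs, a date window can be re-executed with a single Airflow 2.x CLI call (the dag_id and dates are placeholders):

    # Sketch: re-run a week of scheduled runs for one DAG.
    airflow dags backfill my_pipeline \
        --start-date 2024-01-01 --end-date 2024-01-07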
The impact depends on the number of workflows. If there are a lot of workflows, the implementation is easier to justify, since it needs resources that have a cost: dedicated VMs and a database.