Apache Airflow is an open source tool that can be used to programmatically author, schedule and monitor data pipelines using Python and SQL.
Informatica Intelligent Data Management Cloud
Score 7.0 out of 10
The Informatica® Intelligent Data Management Cloud™ (IDMC) is designed to help businesses efficiently handle the complex challenges of dispersed and fragmented data, so they can innovate with their data on virtually any platform: any cloud, multi-cloud, or multi-hybrid environment.
Pricing

Editions & Modules: no answers on this topic for either product.

Pricing Offerings                          Apache Airflow    Informatica Intelligent Data Management Cloud
Free Trial                                 No                No
Free/Freemium Version                      Yes               No
Premium Consulting/Integration Services    No                No
Entry-level Setup Fee                      No setup fee      No setup fee
Additional Details                         —                 —
Features

Workload Automation: comparison of Workload Automation features of Apache Airflow and Informatica Intelligent Data Management Cloud.
Airflow is well suited for data engineering pipelines, creating scheduled workflows, and working with various data sources. You can implement almost any kind of DAG for any use case using the available operators, or write your own logic with the PythonOperator with ease. If Airflow's MLOps capabilities were enhanced to match MLflow-like features, it would be the go-to solution for all workloads, from data science to data engineering.
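To illustrate the pattern this reviewer describes, here is a minimal sketch of a scheduled DAG that runs custom logic through the PythonOperator, assuming Airflow 2.4 or later; the dag_id, schedule, and task names are hypothetical.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def transform_orders():
    # Placeholder for custom business logic that would normally read from a
    # source system, transform the records, and load them downstream.
    print("transforming orders")


# A daily DAG whose only task runs arbitrary Python code via PythonOperator.
with DAG(
    dag_id="daily_orders_pipeline",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    transform = PythonOperator(
        task_id="transform_orders",
        python_callable=transform_orders,
    )
```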
Informatica Cloud is a great tool when data must be formatted consistently. Once configured, it is very robust and reliable. It is also well suited to an organization without a robust IT staff to maintain a full server infrastructure, and it offers a cost-effective approach to high-quality data integration for even the largest organizations. Organizations without staff experienced in data analytics may find it challenging to take advantage of the tool's more complex results.
Apache Airflow is one of the best orchestration platforms and a go-to scheduler for teams building a data platform or pipelines.
Apache Airflow supports many operators, such as the Databricks, Spark, and Python operators, all of which give us the functionality to implement any business logic.
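A rough sketch of how such provider operators can combine in a single workflow, again assuming Airflow 2.4+, with the apache-airflow-providers-apache-spark package installed and a spark_default connection configured; the application path and task names are hypothetical.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator


def publish_metrics():
    # Hypothetical follow-up step implemented in plain Python.
    print("publishing job metrics")


with DAG(
    dag_id="spark_then_python",       # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",
    catchup=False,
) as dag:
    # Submit a Spark application through the provider operator.
    spark_job = SparkSubmitOperator(
        task_id="run_spark_job",
        application="/opt/jobs/aggregate_events.py",  # hypothetical path
        conn_id="spark_default",
    )

    publish = PythonOperator(
        task_id="publish_metrics",
        python_callable=publish_metrics,
    )

    spark_job >> publish  # run the Python step after the Spark job succeeds
```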
Apache Airflow is highly scalable, and we can run a large number of DAGs with ease. It provides high availability and replication for workers. Maintaining Airflow deployments is very easy, even for smaller teams, and we also get plenty of metrics for observability.
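One common way teams end up running a large number of DAGs is to generate them programmatically from a list of sources; a minimal sketch under that assumption follows (the source names, naming scheme, and schedule are hypothetical, not taken from the reviews above).

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def make_ingest_dag(source: str) -> DAG:
    """Build one ingestion DAG for a single source system."""
    with DAG(
        dag_id=f"ingest_{source}",        # hypothetical naming scheme
        start_date=datetime(2024, 1, 1),
        schedule="@hourly",
        catchup=False,
    ) as dag:
        PythonOperator(
            task_id="ingest",
            python_callable=lambda src=source: print(f"ingesting {src}"),
        )
    return dag


# Registering each DAG in the module namespace lets the scheduler discover it;
# the scheduler then distributes the resulting task runs across available workers.
for source in ("orders", "payments", "clickstream"):
    globals()[f"ingest_{source}"] = make_ingest_dag(source)
```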
Once the secure connection is established, it's quite easy to operate and create new jobs. The controls are simple, and we appreciate that there isn't a lot of complex fine-tuning required. Navigation is also easy, and we enjoy being able to open multiple browser tabs to work on multiple projects.
The monitoring functionality works well for tracking the progress of jobs, again without too much complication. In a fast-moving development environment, speed is essential, and being able to quickly see the status and progress of jobs, as well as any errors when jobs fail, helps us maintain that speed.
The web interface is a lot easier to interact with than the client/on-prem version. Putting much of the heavy lifting of interacting with the tool onto the shoulders of the browser makes it easier to keep multiple sessions open and get in/out quickly without having to VPN into the office.
The UI/dashboard could be made customisable, with jobs summarised in groups by errors/failures/successes instead of listed individually, so that an error summary can serve as a starting point for reviewing them.
Navigation - it's a bit dated and could do with a more modern web navigation UX, e.g. sidebar navigation instead of relying on the browser's back/forward buttons.
More broadly, the UX around core functions could be reorganised; navigation could be improved for core functions as well, not just for discovery.
We chose it for its capability to connect with multi-cloud environments, and its access control management is something we don't get in all schedulers and orchestrators. However, although it provides a lot of flexibility and options thanks to Python, some level of Python knowledge is needed to build workflows.
I've never had trouble getting in contact with Informatica's support for technical help. I give it a nine because it does pretty well for mid- to enterprise-scale workflows.
Multiple DAGs can be orchestrated simultaneously on varying schedules, and runs can be reproduced or replicated with relative ease. Overall, Apache Airflow is easier to use than other solutions now on the market: integration is simple, workflows can be monitored, and scheduling can be set up quickly. We advocate using this tool for automating data pipelines and processes.
First, the wizard is easy to use, which makes the learning curve for simple ETL tasks gentle. Second, since Informatica is mature, a good variety of connectors is available. Finally, we have driven some fairly complex ETL solutions using only the cloud.
The impact depends on the number of workflows. With a lot of workflows the use case is stronger, because the implementation is easier to justify: it needs resources such as dedicated VMs and a database, all of which carry a cost.