Apache Airflow is an open-source tool that can be used to programmatically author, schedule, and monitor data pipelines using Python and SQL.
N/A
SAP Data Intelligence
Score 8.7 out of 10
SAP Data Intelligence is presented by the vendor as a single solution for data-driven innovation, available in the cloud, on premise, and through BYOL deployments. The vendor describes it as the evolution of the company's data orchestration and management solution running on Kubernetes, released by SAP in 2017 to handle big data and complex data orchestration across distributed landscapes and processing engines.
Pricing
Apache Airflow
SAP Data Intelligence
Editions & Modules
No answers on this topic
No answers on this topic
Offerings
Pricing Offerings
Apache Airflow
SAP Data Intelligence
Free Trial
No
Yes
Free/Freemium Version
Yes
No
Premium Consulting/Integration Services
No
Yes
Entry-level Setup Fee
No setup fee
Optional
Additional Details
—
—
More Pricing Information
Community Pulse
Apache Airflow
SAP Data Intelligence
Features
Apache Airflow
SAP Data Intelligence
Workload Automation
Comparison of Workload Automation features of Product A and Product B
Airflow is well-suited for data engineering pipelines, creating scheduled workflows, and working with various data sources. You can implement almost any kind of DAG for any use case using the different operators, or implement your own logic with ease using the PythonOperator. Airflow's MLOps features could be enhanced to match MLflow-like capabilities, which would make Airflow the go-to solution for all workloads, from data science to data engineering.
If you have an SAP product ecosystem in your IT landscape, it becomes a no-brainer to go with SAP Data Intelligence for your data orchestration, data management, and advanced data analytics needs, such as data preparation for your AI/ML processes. It provides seamless integration with other SAP products.
Apache Airflow is one of the best orchestration platforms and a go-to scheduler for teams building a data platform or pipelines.
Apache Airflow supports multiple operators, such as the Databricks, Spark, and Python operators, all of which provide the functionality to implement any business logic.
Apache Airflow is highly scalable, and we can run a large number of DAGs with ease. It provides HA and replication for workers. Maintaining Airflow deployments is very easy, even for smaller teams, and we also get lots of metrics for observability.
The UI/dashboard could be made customisable, with the jobs summary grouped by errors/failures/successes instead of listing each job, so that the summary of errors can be used as a starting point for reviewing them.
Navigation - It's a bit dated. It could do with more modern web navigation UX, e.g. sidebar navigation instead of relying on browser back/forward.
Again, a UX reorganisation of the core functions: navigation could be improved for core functions as well, not just discovery.
Data transfer speed tends to be slow over a poor internet connection, since SAP Data Intelligence doesn't synchronize data while offline. However, this is not the vendor's fault, which is why we have implemented a robust wireless internet connection in our organization.
Allows collaboration among various personas, with insights such as ratings and comments on the datasets
Reuses knowledge captured on the datasets for new users
Next-gen data management and artificial intelligence
For its capability to connect with multi-cloud environments. Access control management is something that we don't get in all schedulers and orchestrators. However, although it provides so much flexibility and so many options thanks to Python, some level of Python knowledge is needed to be able to build workflows.
I think the troubleshooting process might be streamlined with improved error recording and tracing. A lot of information about issues and how to fix them is hidden away in the Kubernetes pods themselves. I'm not sure whether SAP Data Intelligence can fix this problem; it may be connected to Kubernetes's design, in which case fixing it could require modifications inside Kubernetes itself.
Initially we struggled to get help from SAP, but then a dedicated dev angel was assigned to us, which simplified the overall support scenario. There is still room for improvement in the documentation around SAP Data Intelligence. We initially struggled a lot to understand the features and needed help around the performance improvement area.
Multiple DAGs can be orchestrated simultaneously at varying times, and runs can be reproduced or replicated with relative ease. Overall, Apache Airflow is easier to use than other solutions now on the market. Integration with Apache Airflow is simple, workflows can be monitored, and scheduling can be set up quickly. We advocate using this tool for automating data pipelines and processes.
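The reproducibility this reviewer mentions is exposed through Airflow's CLI; assuming a DAG id of `example_etl` with a task `extract` (illustrative names, not from the source), past runs can be triggered or replayed with commands like:

```shell
# Trigger a one-off run of the DAG right now
airflow dags trigger example_etl

# Replay (backfill) a past window of scheduled runs
airflow dags backfill --start-date 2024-01-01 --end-date 2024-01-07 example_etl

# Run a single task for one logical date, without recording state in the metadata DB
airflow tasks test example_etl extract 2024-01-01
```

`airflow tasks test` in particular is handy while developing, since it executes a task in isolation without scheduler involvement.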
One of the reasons to pick SAP Data Intelligence is the speed and security it provides, in addition to its excellent support. It is also compatible with all popular databases, which is another reason to choose it.
The impact depends on the number of workflows. If there are a lot of workflows, there is a stronger use case and the implementation is easier to justify, since it needs resources that carry a cost: dedicated VMs and a database.