Apache Airflow is an open source tool that can be used to programmatically author, schedule and monitor data pipelines using Python and SQL.
IBM StreamSets
Score 8.0 out of 10
IBM® StreamSets enables users to create and manage smart streaming data pipelines through a graphical interface, facilitating data integration across hybrid and multicloud environments. IBM StreamSets can support millions of data pipelines for analytics, applications and hybrid integration.
Pricing

|                                         | Apache Airflow           | IBM StreamSets           |
|-----------------------------------------|--------------------------|--------------------------|
| Editions & Modules                      | No answers on this topic | No answers on this topic |
| Free Trial                              | No                       | No                       |
| Free/Freemium Version                   | Yes                      | No                       |
| Premium Consulting/Integration Services | No                       | No                       |
| Entry-level Setup Fee                   | No setup fee             | No setup fee             |
| Additional Details                      | —                        | —                        |
Community Pulse
Apache Airflow
IBM StreamSets
Considered Both Products
Apache Airflow
No answer on this topic
IBM StreamSets
Verified User
Director
Chose IBM StreamSets
IBM StreamSets compares well against other tools in the same category. It is easy to set up, and development can be fast-paced thanks to the built-in, out-of-the-box connectors that ship with the product.
Airflow is well suited for data engineering pipelines, creating scheduled workflows, and working with various data sources. You can implement almost any kind of DAG for any use case using the different operators, or implement your own logic easily with the PythonOperator. Airflow's MLOps capabilities could be enhanced to match MLflow-like features, making Airflow the go-to solution for all workloads, from data science to data engineering.
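As a rough illustration of the "implement your own logic with the PythonOperator" point, here is a minimal DAG sketch. The DAG id, function name, and sample data are hypothetical, not from the review; the Airflow imports are guarded so the plain-Python callable remains usable even where Airflow is not installed.

```python
# Sketch: wrapping custom business logic in a PythonOperator (Airflow 2.x style).
from datetime import datetime


def clean_records(records):
    """Hypothetical transform step: drop empty rows and normalize keys."""
    return [{k.strip().lower(): v for k, v in r.items()} for r in records if r]


try:
    # Requires apache-airflow; guarded so the callable stays importable without it.
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    with DAG(
        dag_id="example_custom_logic",   # illustrative name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",               # Airflow 2.4+; older versions use schedule_interval
        catchup=False,
    ):
        PythonOperator(
            task_id="clean_records",
            python_callable=clean_records,
            op_kwargs={"records": [{" City ": "Berlin"}, {}]},
        )
except ImportError:
    pass  # Airflow not installed; clean_records() is still testable on its own.
```

The same callable can be unit-tested outside Airflow, which is one reason the PythonOperator pattern is convenient for custom logic.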
IBM StreamSets excels in real-time logistics data ingestion and transformation across hybrid systems. It’s less ideal for lightweight ETL tasks or static datasets where simpler tools can achieve similar results with less overhead and complexity.
Apache Airflow is one of the best orchestration platforms and a go-to scheduler for teams building a data platform or pipelines.
Apache Airflow supports multiple operators, such as the Databricks, Spark, and Python operators, which give us the flexibility to implement any business logic.
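To show how multiple operator types combine in one pipeline, here is a hedged sketch chaining a BashOperator into a PythonOperator. The DAG id, task ids, file path, and `summarize` helper are illustrative assumptions; the Airflow imports are again guarded so the helper works standalone.

```python
# Sketch: mixing operator types in one DAG and chaining them with >>.
from datetime import datetime


def summarize(path):
    """Hypothetical downstream step: count lines in a file produced upstream."""
    with open(path) as f:
        return sum(1 for _ in f)


try:
    from airflow import DAG
    from airflow.operators.bash import BashOperator
    from airflow.operators.python import PythonOperator

    with DAG(
        dag_id="mixed_operators",        # illustrative name
        start_date=datetime(2024, 1, 1),
        schedule=None,
        catchup=False,
    ):
        extract = BashOperator(
            task_id="extract",
            bash_command=r"printf 'a\nb\n' > /tmp/rows.txt",  # illustrative path
        )
        report = PythonOperator(
            task_id="report",
            python_callable=summarize,
            op_kwargs={"path": "/tmp/rows.txt"},
        )
        extract >> report  # report runs only after extract succeeds
except ImportError:
    pass  # Airflow not installed; summarize() is still testable on its own.
```

The `>>` operator expresses the dependency between tasks, which is how Airflow lets arbitrary operator types participate in the same business workflow.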
Apache Airflow is highly scalable, and we can run a large number of DAGs with ease. It provides HA and replication for workers. Maintaining Airflow deployments is easy, even for smaller teams, and we also get plenty of metrics for observability.
The UI/dashboard could be made customisable, with a jobs summary grouped by errors/failures/successes instead of listing each job individually, so that the error summary can serve as a starting point for review.
Navigation - it's a bit dated and could use more modern web-navigation UX, e.g. sidebar navigation instead of relying on the browser's back/forward buttons.
The UX for core functions also needs reorganising: navigation should be improved for core functions as well, not just for discovery.
IBM StreamSets has been a wonderful addition to our technology stack. It has helped with some of our initiatives, such as data engineering and data integration, not only for external customers but also for internal purposes. The tool has also supported our streaming-data use cases. Moving to another tool would require a significant amount of work and time.
We chose it for its capability to connect with multicloud environments. Access-control management is something we don't get in all schedulers and orchestrators. However, although Python gives it great flexibility and many options, some knowledge of Python is needed to build workflows.
The StreamSets platform is very easy to use and the interface is extremely intuitive. The drag-and-drop, low-code design makes it accessible for teams with varying technical skills, allowing us to quickly connect sources, define transformations, and deploy pipelines without heavy coding. StreamSets allows us to get started quickly and not have to worry about our pipelines breaking once they're built.
StreamSets support has improved a lot in the last couple of years. We had some challenges with support in the beginning, but the quality of the support and the responsiveness to tickets are better now. We have contacted support multiple times in scenarios where the system was slow or the output was not what we expected.
Multiple DAGs can be orchestrated simultaneously at varying times, and runs can be reproduced or replicated with relative ease. Overall, Apache Airflow is easier to use than other solutions now on the market. Integration is simple, workflows can be monitored, and scheduling can be set up quickly. We advocate using this tool for automating data pipelines and processes.
The first advantage is that this software is relatively new and keeps updating according to users' needs. Another advantage is that it organises data and produces conclusions without leaving out any relevant information. Other software falls short in data summarisation and in the readability of the charts and sheets it produces.
The impact depends on the number of workflows. If there are a lot of workflows, there is a stronger use case, since the implementation is justified despite requiring resources: dedicated VMs and a database, which carry a cost.