Apache Airflow
Apache Airflow is an open-source tool that can be used to programmatically author, schedule, and monitor data pipelines using Python and SQL.
Pricing: N/A

Fivetran
Score 8.4 out of 10
Fivetran replicates applications, databases, events, and files into a high-performance data warehouse after a five-minute setup. The vendor says their standardized cloud pipelines are fully managed and zero-maintenance. The vendor says Fivetran began with a realization: for modern companies using cloud-based software and storage, traditional ETL tools badly underperformed, and the complicated configurations they required often led to project failures. To streamline and accelerate…
Pricing: from $0.01

Striim
Score 8.3 out of 10
Striim is an enterprise-grade platform that offers continuous real-time data ingestion, high-speed in-flight stream processing, and sub-second delivery of data to cloud and on-premises endpoints.
Pricing: from $4,400 per month per 100 million Striim events
Pricing

Editions & Modules

Apache Airflow: no answers on this topic.

Fivetran:
- Starter: $0.01 per credit
- Standard: $0.01 per credit
- Enterprise: $0.01 per credit

Striim:
- Striim Cloud Enterprise Platform: $4,400 per month per 100 million Striim events
- Striim Platform: $20,000 per year per VCPU
Pricing Offerings (Apache Airflow | Fivetran | Striim)

Free Trial: No | Yes | Yes
Free/Freemium Version: Yes | No | No
Premium Consulting/Integration Services: No | No | No
Entry-level Setup Fee: No setup fee | Optional | No setup fee
Additional Details: — | — | —
Features

Workload Automation — Apache Airflow: 8.7 (12 ratings, 5% above category average); Fivetran: not rated; Striim: not rated

Per-feature ratings (Apache Airflow | Fivetran | Striim):
Multi-platform scheduling: 9.3 (12 ratings) | 0 ratings | 0 ratings
Central monitoring: 8.9 (12 ratings) | 0 ratings | 0 ratings
Logging: 8.5 (12 ratings) | 0 ratings | 0 ratings
Alerts and notifications: 9.3 (12 ratings) | 0 ratings | 0 ratings
Analysis and visualization: 6.7 (12 ratings) | 0 ratings | 0 ratings
Application integration: 9.4 (12 ratings) | 0 ratings | 0 ratings
Data Source Connection — Apache Airflow: not rated; Fivetran: 10.0 (8 ratings, 19% above category average); Striim: 10.0 (2 ratings, 19% above category average)

Per-feature ratings (Apache Airflow | Fivetran | Striim):
Connect to traditional data sources: 0 ratings | 10.0 (8 ratings) | 10.0 (2 ratings)
Connect to Big Data and NoSQL: 0 ratings | 10.0 (6 ratings) | 10.0 (1 rating)
Data Transformations — Apache Airflow: not rated; Fivetran: 7.3 (7 ratings, 10% below category average); Striim: 8.9 (2 ratings, 10% above category average)

Per-feature ratings (Apache Airflow | Fivetran | Striim):
Simple transformations: 0 ratings | 7.4 (7 ratings) | 10.0 (2 ratings)
Complex transformations: 0 ratings | 7.1 (5 ratings) | 7.8 (2 ratings)
Data Modeling — Apache Airflow: not rated; Fivetran: 6.2 (8 ratings, 23% below category average); Striim: 8.5 (2 ratings, 9% above category average)

Per-feature ratings (Apache Airflow | Fivetran | Striim):
Data model creation: 0 ratings | 2.0 (6 ratings) | 8.4 (2 ratings)
Metadata management: 0 ratings | 4.0 (4 ratings) | 8.4 (2 ratings)
Business rules and workflow: 0 ratings | 8.0 (6 ratings) | 9.0 (1 rating)
Collaboration: 0 ratings | 7.8 (5 ratings) | 7.8 (2 ratings)
Testing and debugging: 0 ratings | 9.0 (4 ratings) | 7.2 (2 ratings)
Data Governance
Airflow is well suited for data engineering pipelines, creating scheduled workflows, and working with various data sources. You can implement almost any kind of DAG for any use case using the different operators, or implement your own logic easily with the Python operator. Airflow's MLOps features could be enhanced to match MLflow-like functionality, making Airflow the go-to solution for all workloads, from data science to data engineering.
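The DAG idea this review describes can be illustrated with a short, self-contained sketch. This is plain Python, not Airflow itself, and the task names are invented for illustration: tasks declare their upstream dependencies, and a tiny executor runs them in topological order, which is essentially what Airflow's scheduler does for a pipeline.

```python
# Minimal, self-contained illustration of the DAG concept behind Airflow.
# NOT Airflow code -- task names and structure are hypothetical.
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

def extract():   return "raw rows"
def transform(): return "clean rows"
def load():      return "loaded"

# Each task maps to the set of tasks that must finish before it, mirroring
# the `extract >> transform >> load` dependency style used in Airflow DAGs.
dag = {
    "extract":   set(),
    "transform": {"extract"},
    "load":      {"transform"},
}

tasks = {"extract": extract, "transform": transform, "load": load}

def run(dag, tasks):
    """Run every task exactly once, respecting upstream dependencies."""
    order = list(TopologicalSorter(dag).static_order())
    return [(name, tasks[name]()) for name in order]

for name, result in run(dag, tasks):
    print(name, "->", result)
```

In real Airflow, the same shape would be expressed with operators (e.g. `PythonOperator`) inside a `DAG` object, and the scheduler, rather than a loop, decides when each task instance runs.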
Fivetran's business model makes sense for the use case where you need a lot of data from a single source, but if the requirement is not on the heavier side, Fivetran becomes a costly operation compared to its peers. Otherwise, I'd recommend Fivetran for its stability, regular updates, and seamless service.
The following are well-suited use cases:
- The change data capture feature works seamlessly on popular RDBMSs. You can enrich data from several sources within the same Striim application.
- You can install standalone agents and start streaming log files to Striim servers. This is mainly useful for security operations or audit-trail use cases.
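Change data capture, at its core, turns differences in database state into a stream of insert/update/delete events. The following is a rough, self-contained Python sketch of that idea only, not Striim's implementation (real CDC tools like Striim read the database's transaction log rather than comparing snapshots), and the table contents are invented:

```python
# Toy change-data-capture: diff two snapshots of a keyed table into events.
# Conceptual sketch only -- production CDC tails the DB's transaction log.

def capture_changes(before, after):
    """Return insert/update/delete events between two {key: row} snapshots."""
    events = []
    for key, row in after.items():
        if key not in before:
            events.append(("insert", key, row))
        elif before[key] != row:
            events.append(("update", key, row))
    for key, row in before.items():
        if key not in after:
            events.append(("delete", key, row))
    return events

# Hypothetical 'users' table, before and after a transaction.
before = {1: {"name": "Ada"}, 2: {"name": "Grace"}}
after  = {1: {"name": "Ada Lovelace"}, 3: {"name": "Edsger"}}

for event in capture_changes(before, after):
    print(event)
```

Each emitted tuple corresponds to one change event a CDC pipeline would forward downstream, which is the kind of stream an enrichment step can then join against other sources.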
Apache Airflow is one of the best Orchestration platforms and a go-to scheduler for teams building a data platform or pipelines.
Apache Airflow supports multiple operators, such as the Databricks, Spark, and Python operators. All of these provide us with functionality to implement any business logic.
Apache Airflow is highly scalable, and we can run a large number of DAGs with ease. It provides HA and replication for workers. Maintaining Airflow deployments is very easy, even for smaller teams, and we also get lots of metrics for observability.
The UI/dashboard could be made customisable, with jobs summarized in groups by error/failure/success instead of listed individually, so that a summary of errors can be used as a starting point for reviewing them.
Navigation is a bit dated. It could use more modern web navigation UX, e.g. sidebar navigation instead of relying on browser back/forward.
The UX around core functions could also be reorganized: navigation should be improved for core functions as well, not just for discovery.
For its capability to connect with multi-cloud environments. Access control management is something we don't get in all schedulers and orchestrators. But although it provides a lot of flexibility and options thanks to Python, some level of Python knowledge is needed to be able to build workflows.
Very easy and intuitive to set up and maintain, as there usually are not that many options. Very well documented (e.g. how to set up each connector, what the schema looks like, any specific features of the connector, etc.). Operation is also intuitive: each connector has status pages, log pages, configuration pages, and so on.
It runs pretty well and gets our data from point A to our cluster quickly enough. Honestly, it's not something I think about unless it breaks, and that's pretty rare.
Multiple DAGs can be orchestrated simultaneously at varying times, and runs can be reproduced or replicated with relative ease. Overall, Apache Airflow is easier to use than other solutions now on the market. It is simple to integrate with, workflows can be monitored, and scheduling can be set up quickly. We advocate using this tool for automating data pipelines and processes.
We never seriously considered using anything else. Our data engineers had used Fivetran extensively in previous roles so when it came time to make a decision, there wasn't much of a process. They gladly signed the contract with Fivetran pretty quickly.
Impact depends on the number of workflows. If there are a lot of workflows, the implementation is better justified, since it needs resources that carry a cost: dedicated VMs and a database.