Apache Airflow is an open-source tool for programmatically authoring, scheduling, and monitoring data pipelines using Python and SQL.
Tidal by Redwood
Score 6.7 out of 10
Tidal Automation, part of Redwood Software since its acquisition in early 2023, is an enterprise workload automation platform for automating and orchestrating cross-application, cross-platform workloads – on-prem, cloud, or hybrid – from one central point of control. Tidal is used to optimize mission-critical business processes, manage…
Pricing

                                           Apache Airflow             Tidal by Redwood
Editions & Modules                         No answers on this topic   No answers on this topic
Free Trial                                 No                         No
Free/Freemium Version                      Yes                        No
Premium Consulting/Integration Services    No                         No
Entry-level Setup Fee                      No setup fee               No setup fee
Additional Details                         —                          —
Features

Workload Automation: comparison of Workload Automation features of Apache Airflow and Tidal by Redwood
Airflow is well-suited for data engineering pipelines, creating scheduled workflows, and working with various data sources. You can implement almost any kind of DAG for any use case using the different operators, or implement your own logic with the PythonOperator with ease. Airflow's MLOps capabilities could be enhanced to offer MLflow-like features, which would make Airflow the go-to solution for all workloads, from data science to data engineering.
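To make this concrete, here is a minimal sketch of such a DAG built around the PythonOperator, assuming Airflow 2.4+ (where `schedule` replaces the older `schedule_interval`); the DAG id, callables, and schedule are illustrative, not taken from the review.

```python
# Minimal sketch of an Airflow DAG using PythonOperator (assumes Airflow 2.4+).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder for pulling data from a source system.
    print("extracting data")


def transform():
    # Placeholder for a transformation step.
    print("transforming data")


with DAG(
    dag_id="example_etl",            # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    extract_task >> transform_task  # run transform only after extract succeeds
```

Dropping this file into the DAGs folder is enough for the scheduler to pick it up and run it once per day.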
Where error handling and effortless recovery are crucial, Tidal performs exceptionally well. It can identify and fix automation issues, reducing downtime and interruptions. A smaller automation solution may be more cost-effective if a business primarily uses a single platform or a small number of applications that do not require complex integration.
Apache Airflow is one of the best orchestration platforms and a go-to scheduler for teams building a data platform or pipelines.
Apache Airflow supports multiple operators, such as the Databricks, Spark, and Python operators, all of which give us the functionality to implement any business logic (see the sketch after this list).
Apache Airflow is highly scalable, and we can run a large number of DAGs with ease. It provides HA and replication for workers. Maintaining Airflow deployments is easy even for smaller teams, and we also get plenty of metrics for observability.
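As a hedged illustration of how such operators plug into a DAG, the sketch below wires a SparkSubmitOperator next to a PythonOperator. It assumes the apache-airflow-providers-apache-spark package is installed (the Databricks operators ship in a separate provider package); the script path and DAG id are hypothetical.

```python
# Sketch only: assumes apache-airflow-providers-apache-spark is installed.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator


def notify():
    # Placeholder downstream step implemented in plain Python.
    print("spark job finished")


with DAG(
    dag_id="example_spark_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule=None,  # triggered manually
) as dag:
    run_spark = SparkSubmitOperator(
        task_id="run_spark",
        application="/opt/jobs/aggregate.py",  # hypothetical script path
        conn_id="spark_default",               # Spark connection configured in Airflow
    )
    done = PythonOperator(task_id="notify", python_callable=notify)

    run_spark >> done
```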
Tidal Automation allows us to automate and schedule various tasks in an easy and effective manner. It is highly interactive and effective, allowing nearly 4,000-5,000 jobs to run per day.
Tidal is designed to be easily shareable and collaborative, allowing multiple users to work at the same time and making the work more effective.
Tidal allows us to ensure that every event is triggered exactly when it's supposed to be, regardless of other activities running at the same time.
It is user-friendly for generating the reports required for a particular object, and it allows us to test code thoroughly before executing it in the live environment.
The UI/dashboard could be made customizable, with the jobs summary grouped by errors/failures/successes instead of listing each job, so that the error summary can be used as a starting point for review.
Navigation: it's a bit dated and could do with more modern web navigation UX, i.e., sidebar navigation instead of browser back/forward.
Again, a core functional reorganization in terms of UX: navigation could be improved for core functions as well, not just discovery.
It is still a bit slow when navigating. If you close a job, you have to wait a few seconds to open another one, even when you made no changes.
When viewing a job and making no changes, the "OK" button still updates the last-modified date as if you made a change. No big deal, but it wastes time when troubleshooting a problem and checking which jobs were changed last.
You can see the parameters column in "Job Activity" but not in "Job Definitions".
You can't search the parameters field in the filter.
Changing a variable name does not change it on the job. The job still works because Tidal Automation uses the ID number internally, but it causes confusion when you see a variable on a job and can't find it under "Variables". On top of that, Tidal Automation does not show the ID column under "Variables", making it even more difficult to find the variable.
We are on the fence. The increased pricing for renewals is staggering, and with new automation options like Microsoft's Power Automate and Event-Driven Ansible on the field, there are other options available now.
For its capability to connect with multi-cloud environments. Access control management is something we don't get in all schedulers and orchestrators (a sketch of per-DAG access control follows below). Although Airflow provides so much flexibility and so many options thanks to Python, some knowledge of Python is needed to build workflows.
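As a minimal sketch of the access control the reviewer mentions, Airflow 2.x lets a DAG declare per-role permissions via the `access_control` argument; the role name below is hypothetical and must already exist in Airflow's RBAC setup (managed via the UI or CLI).

```python
# Sketch of Airflow's per-DAG access control (Airflow 2.x).
from datetime import datetime

from airflow import DAG
from airflow.operators.empty import EmptyOperator

with DAG(
    dag_id="restricted_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    # Grant the "data_engineers" role read and edit rights on this DAG only.
    access_control={"data_engineers": {"can_read", "can_edit"}},
) as dag:
    EmptyOperator(task_id="placeholder")
```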
Having provided consulting services on Tidal by Redwood for years, I recommend going with a solutions partner or consultant to deploy it. I believe there are sizing and tuning guidelines that should be followed for environments of scale; they are not critical when first lighting up the product, but if you are not aware of them you will encounter performance degradation after a few thousand job objects are added.
Multiple DAGs can be orchestrated simultaneously at varying times, and runs can be reproduced or replicated with relative ease (see the sketch below). Overall, Apache Airflow is easier to use than other solutions now on the market: integration is simple, workflows can be monitored, and scheduling can be set up quickly. We advocate using this tool for automating data pipelines and processes.
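Here is a hedged sketch of how reproducible runs work: with `catchup=True`, Airflow creates one run per schedule interval since `start_date`, and each task receives its run's logical date, so re-running a date reprocesses the same slice of data. The DAG id and callable are illustrative.

```python
# Sketch of reproducible scheduled runs via catchup and the logical date.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_partition(ds=None):
    # `ds` is the logical date (YYYY-MM-DD) injected from the task context;
    # rerunning the same date is deterministic when inputs depend only on it.
    print(f"loading partition for {ds}")


with DAG(
    dag_id="reproducible_daily_load",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=True,  # backfill missed intervals automatically
) as dag:
    PythonOperator(task_id="load", python_callable=load_partition)
```

Past intervals can also be re-run on demand with the `airflow dags backfill` CLI command.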
1. Tidal is good at processing large volumes of data and is cost-effective.
2. Tidal can automate the scheduling of production objects and ensure that materials are delivered on time.
3. Tidal processes large volumes of data that cannot be handled every day by running code/scripts manually, and it does so with ease when required.
The impact depends on the number of workflows. If there are a lot of workflows, the use case is stronger and the implementation is justified, since it needs resources such as dedicated VMs and a database, which carry a cost.