Apache Airflow is an open source tool that can be used to programmatically author, schedule and monitor data pipelines using Python and SQL.
Snow Atlas
Score 8.0 out of 10
Snow Atlas is a cloud-native platform built from the ground up to provide Technology Intelligence for today’s hybrid enterprises. Based on a microservices architecture and standardized APIs, Snow Atlas provides a unified foundation for Snow’s IT asset management, SaaS management and FinOps solutions. It can be used to display all of the technology in an enterprise's IT stack, or to find opportunities to enhance, optimize and efficiently manage technology assets and share data with…
Pricing

Editions & Modules
  Apache Airflow: No answers on this topic
  Snow Atlas: No answers on this topic

Pricing Offerings                          Apache Airflow    Snow Atlas
Free Trial                                 No                No
Free/Freemium Version                      Yes               No
Premium Consulting/Integration Services    No                No
Entry-level Setup Fee                      No setup fee      No setup fee
Additional Details                         -                 -
Features

Workload Automation
Comparison of Workload Automation features of Product A and Product B

Overall: Apache Airflow 8.7 (12 Ratings, 5% above category average); Snow Atlas - (0 Ratings)

Feature                        Apache Airflow      Snow Atlas
Multi-platform scheduling      9.3 (12 Ratings)    - (0 Ratings)
Central monitoring             8.9 (12 Ratings)    - (0 Ratings)
Logging                        8.6 (12 Ratings)    - (0 Ratings)
Alerts and notifications       9.3 (12 Ratings)    - (0 Ratings)
Analysis and visualization     6.7 (12 Ratings)    - (0 Ratings)
Application integration        9.4 (12 Ratings)    - (0 Ratings)
IT Asset Management
Comparison of IT Asset Management features of Product A and Product B
Airflow is well-suited for data engineering pipelines, creating scheduled workflows, and working with various data sources. You can implement almost any kind of DAG for any use case using the different operators, or build your own logic using the PythonOperator with ease. The MLOps capabilities of Airflow could be enhanced to match MLflow-like features, making Airflow the go-to solution for all workloads, from data science to data engineering.
Apache Airflow is one of the best orchestration platforms and a go-to scheduler for teams building a data platform or pipelines.
Apache Airflow supports multiple operators, such as the Databricks, Spark, and Python operators. All of these give us the functionality to implement any business logic.
Apache Airflow is highly scalable, and we can run a large number of DAGs with ease. It provides HA and replication for workers. Maintaining Airflow deployments is very easy, even for smaller teams, and we also get lots of metrics for observability.
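The "operators" the reviewers mention are how business logic is attached to a DAG. A minimal sketch of a daily pipeline using the PythonOperator, assuming apache-airflow 2.x is installed (the DAG id, task names, and callables are illustrative, not from the reviews):

```python
# Minimal daily DAG sketch (illustrative names; assumes apache-airflow 2.x).
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder for real extraction logic (e.g. pulling from a source system).
    return "raw data"


def load():
    # Placeholder for real load logic (e.g. writing to a warehouse).
    pass


with DAG(
    dag_id="example_daily_pipeline",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # `schedule_interval` on Airflow < 2.4
    catchup=False,                     # skip backfilling missed intervals on first deploy
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Declare the dependency: extract must succeed before load runs.
    extract_task >> load_task
```

Swapping PythonOperator for a Databricks or Spark operator follows the same pattern; only the operator class and its connection parameters change. As a DAG definition file, this is pipeline configuration picked up by the scheduler rather than a script run directly.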
The UI/dashboard could be made customisable, with a jobs summary grouped by errors/failures/successes instead of listed per job, so that the error summary can serve as a starting point for review.
Navigation - It's a bit dated. Could do with more modern web navigation UX, e.g. sidebar navigation instead of browser back/forward.
The core functions could also use a UX reorganisation: navigation can be improved for core functions as well, not just discovery.
SaaS connectors are not always kept up to date, usually when publishers make changes to their portal APIs. There appears to be little active monitoring on Flexera/Snow Atlas' side unless a customer reports an issue with the data being returned. Fixes are normally implemented as quickly as possible, depending on whether the issue is considered a bug fix or a feature enhancement.
Users - SAM module - No ability to add or bulk import users manually. Completely reliant on AD Discovery or Entra ID Discovery.
Users - SaaS module - No ability to bulk update users for things like 'Online only' or 'Qualified' user accounts. This is an issue in larger companies where thousands of SaaS users are reported through connectors like Microsoft E365.
The SaaS module dashboard does not allow filtering of insights to a specific publisher.
Not all back-end SMACC functionality from Snow License Manager has been exposed to the front end, as Snow Atlas does not allow customer administrators access to the back end or SQL databases.
If you are migrating from on-prem Snow License Manager to Atlas, migration tools have not been created by Snow, and a Project will be required to handle your migration. Without migration tools, we had to use a Managed Service Partner who manually created many of their own scripts to retrieve data that cannot be downloaded via reports and import it into Atlas. Any attachments or documentation on Agreement or License records had to be manually re-attached/uploaded to the relevant Agreement/License records in Atlas after the migration was performed.
For its capability to connect with multicloud environments. Access control management is something that we don't get in all schedulers and orchestrators. However, although it provides so much flexibility and so many options due to Python, some knowledge of Python is needed to be able to build workflows.
Front-line support staff don't always understand the issue you are explaining, or the need to escalate to back-end/higher-up areas for resolution, and it can often require use of the Escalate function or emailing your Account/Customer Success Manager. That said, once an issue is properly understood, it is handled well.
We should have spun up a Project to manage the implementation. Snow indicated to us the ease with which Snow Atlas could be implemented; however, this did not factor in that we were migrating from their on-prem product, Snow License Manager, hosted through a Managed Service Partner. For a clean installation, your implementation can be quick and will likely not require a Project. If you are migrating from another product, or are a company with lots of stakeholders, fingers in the pie, and hurdles/business processes that need to be adhered to, definitely use a Project to perform your implementation.
Multiple DAGs can be orchestrated simultaneously at varying times, and runs can be reproduced or replicated with relative ease. Overall, Apache Airflow is easier to use than other solutions now on the market. Integration is simple, workflows can be monitored, and scheduling can be set up quickly. We advocate using this tool for automating data pipelines or processes.
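Reproducing or replicating past runs, as described above, is typically done from the Airflow CLI against a running deployment; a sketch assuming Airflow 2.x and a hypothetical DAG id:

```shell
# Trigger an ad-hoc run of a DAG (dag_id here is hypothetical)
airflow dags trigger example_daily_pipeline

# Reproduce historical runs for a date range (backfill)
airflow dags backfill example_daily_pipeline \
    --start-date 2024-01-01 --end-date 2024-01-07
```

These commands talk to the scheduler/metadata database of an existing Airflow installation, so they are not runnable standalone.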
Impact depends on the number of workflows. If there are a lot of workflows, the implementation is easier to justify, since Airflow needs resources that carry a cost: dedicated VMs and a database.