Apache Airflow vs. Apache Spark

Overview
Apache Airflow
Rating: Score 8.6 out of 10
Most Used By: N/A
Starting Price: N/A
Product Summary: Apache Airflow is an open-source tool that can be used to programmatically author, schedule, and monitor data pipelines using Python and SQL. Created at Airbnb as an open-source project in 2014, Airflow was brought into the Apache Software Foundation’s Incubator Program in 2016 and announced as a Top-Level Apache Project in 2019. It is used as a data orchestration solution, with over 140 integrations and community support.

Apache Spark
Rating: Score 8.9 out of 10
Most Used By: N/A
Starting Price: N/A
Product Summary: N/A
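
As the summary above notes, Airflow pipelines are authored as Python code. A minimal sketch of a DAG with a time-based schedule and a task dependency, assuming Airflow 2.x; the pipeline name and task logic are hypothetical:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract():
        # Placeholder extract step; a real pipeline would pull from a source system.
        print("extracting data")


    def load():
        # Placeholder load step.
        print("loading data")


    with DAG(
        dag_id="example_pipeline",           # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",          # cron-style, time-based schedule
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)

        extract_task >> load_task            # load runs only after extract succeeds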
Pricing
  • Editions & Modules: No answers on this topic (both products)
  • Free Trial: No (both products)
  • Free/Freemium Version: Apache Airflow: Yes; Apache Spark: No
  • Premium Consulting/Integration Services: No (both products)
  • Entry-level Setup Fee: No setup fee (both products)
  • Additional Details: none provided (both products)
Community Pulse
Considered Both Products
Apache Airflow
Chose Apache Airflow
Overall, Apache Airflow is easier to use than other tools available in the market. Integrations are easy in Apache Airflow, workflows can be monitored, and scheduling can be done easily. I recommend this tool for automating data …
Apache Spark
Chose Apache Spark
Apache Spark is a fast, in-memory processing framework. It is 10 times faster than Apache Hadoop. Earlier we were using Apache Hadoop for processing data on disk, but we have now shifted to Apache Spark because of its in-memory computation capability. Also in SAP …
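
The in-memory speedup this reviewer describes comes from caching datasets in executor memory instead of re-reading them from disk between operations. A minimal PySpark sketch; the file path and column name are hypothetical:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("cache-example").getOrCreate()

    # Hypothetical input file; inferSchema makes an extra pass over the data.
    df = spark.read.csv("events.csv", header=True, inferSchema=True)

    # cache() keeps the DataFrame in executor memory, so repeated actions
    # avoid re-reading from disk -- the main source of Spark's speedup over
    # disk-based MapReduce for iterative workloads.
    df.cache()

    total = df.count()                       # first action materializes the cache
    df.groupBy("user_id").count().show()     # reuses the in-memory data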
Features
Workload Automation
Comparison of Workload Automation features of Apache Airflow and Apache Spark. Apache Airflow scores 9.7 overall (10 ratings), 16% above the category average; Apache Spark has no ratings in this category. Apache Airflow's per-feature scores (10 ratings each):
  • Multi-platform scheduling: 9.9
  • Central monitoring: 9.9
  • Logging: 9.8
  • Alerts and notifications: 9.8
  • Analysis and visualization: 9.8
  • Application integration: 8.9
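
Several of the highly rated features above (scheduling, alerts, retries) are configured directly in DAG code. A hedged sketch, assuming Airflow 2.x with an SMTP connection configured for email alerts; the address, DAG name, and task are hypothetical:

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    default_args = {
        "email": ["oncall@example.com"],     # hypothetical address
        "email_on_failure": True,            # send mail when a task fails
        "retries": 2,                        # retry before alerting
        "retry_delay": timedelta(minutes=5),
    }

    with DAG(
        dag_id="alerting_example",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@hourly",
        default_args=default_args,
        catchup=False,
    ) as dag:
        BashOperator(task_id="flaky_job", bash_command="exit 0")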
Best Alternatives
Small Businesses
  • Apache Airflow: No answers on this topic
  • Apache Spark: No answers on this topic
Medium-sized Companies
  • Apache Airflow: ActiveBatch Workload Automation (Score 8.1 out of 10)
  • Apache Spark: Cloudera Manager (Score 9.9 out of 10)
Enterprises
  • Apache Airflow: Control-M (Score 9.3 out of 10)
  • Apache Spark: IBM Analytics Engine (Score 7.8 out of 10)
User Ratings
                          Apache Airflow      Apache Spark
Likelihood to Recommend   9.0 (10 ratings)    9.4 (24 ratings)
Likelihood to Renew       -   (0 ratings)     10.0 (1 rating)
Usability                 10.0 (1 rating)     8.7 (4 ratings)
Support Rating            -   (0 ratings)     8.7 (4 ratings)
User Testimonials
Likelihood to Recommend
Apache Airflow
For quickly scanning job statuses and deep-diving into job issues, details, and flows, Airflow does a good job. No fuss, no muss. The learning curve is low, as the UI is very straightforward, and navigating it becomes familiar after spending some time with it. Our requirements are pretty simple: a job scheduler, workflows, and monitoring. We run more than 100 jobs, which is still a lot to review and troubleshoot when jobs don't run, so when managing large numbers of jobs, Airflow's dated UI can be a bit of a drawback.
Apache Spark
Well suited: for most local runs of datasets and non-prod systems, scalability is not a problem at all. Including data from multiple types of data sources is an added advantage. MLlib is a decent built-in library that can be used for most ML tasks. Less appropriate: we had to work on a RecSys where the music dataset we used was around 300+ GB in size. We faced memory issues, and a few times we also got memory errors. The MLlib library also lacks support for advanced analytics and deep-learning frameworks. Understanding the internals of how Apache Spark works is very difficult for beginners.
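
For the kind of RecSys workload this reviewer mentions, MLlib ships a collaborative-filtering implementation. A minimal sketch using ALS on a toy in-memory ratings DataFrame; the column names and parameter values are hypothetical:

    from pyspark.sql import SparkSession
    from pyspark.ml.recommendation import ALS

    spark = SparkSession.builder.appName("recsys-example").getOrCreate()

    # Toy ratings data standing in for a real (user, track, rating) dataset.
    ratings = spark.createDataFrame(
        [(0, 10, 4.0), (0, 11, 1.0), (1, 10, 5.0), (1, 12, 2.0)],
        ["user_id", "track_id", "rating"],
    )

    als = ALS(userCol="user_id", itemCol="track_id", ratingCol="rating",
              rank=8, maxIter=5, coldStartStrategy="drop")
    model = als.fit(ratings)

    # Top-3 track recommendations per user, computed in memory.
    model.recommendForAllUsers(3).show(truncate=False)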
Pros
Apache Airflow
  • Handles our ETL processes.
  • As there is no incoming or outgoing data, we can handle the scheduling of tasks as code and avoid the need for monitoring.
Apache Spark
  • Rich APIs for data transformation, making it very easy to transform and prepare data in a distributed environment without worrying about memory issues
  • Faster execution times compared to Hadoop and Pig Latin
  • An easy SQL interface to the same data set for people who are comfortable exploring data in a declarative manner
  • Interoperability between SQL and the Scala/Python style of munging data (see the sketch after this list)
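
A minimal sketch of the SQL-and-Python interoperability named in the last two points, with hypothetical table and column names:

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("sql-interop").getOrCreate()

    orders = spark.createDataFrame(
        [("a", 10.0), ("b", 25.0), ("a", 5.0)], ["customer", "amount"]
    )

    # Declarative SQL over the data set...
    orders.createOrReplaceTempView("orders")
    spark.sql(
        "SELECT customer, SUM(amount) AS total FROM orders GROUP BY customer"
    ).show()

    # ...and the equivalent DataFrame expression in Python.
    orders.groupBy("customer").agg(F.sum("amount").alias("total")).show()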
Cons
Apache Airflow
  • It should offer time-based scheduling too, not only event-based
  • It does not store metadata, so we are not able to analyze the workflows
  • It only supports Python for scripted pipeline writing as of now
Apache Spark
  • Memory management is very weak.
  • PySpark is not as robust as Scala with Spark.
  • Spark master high availability (HA) is needed; it is not as highly available as it should be.
  • Data locality should not be a necessity, though it does help performance; we would prefer not to depend on locality.
Likelihood to Renew
Apache Airflow
No answers on this topic
Apache Spark
Its capacity for computing data in a cluster, and its speed.
Usability
Apache Airflow
Easy to learn, easy to use, a robust workflow orchestration framework, and good at dependent job management.
Apache Spark
If the team looking to use Apache Spark is not used to debugging and tweaking settings for jobs to ensure maximum optimization, it can be frustrating. However, the documentation and the support of the community on the internet can help resolve most issues. Moreover, it is highly configurable and integrates with different tools (e.g., it can be used by dbt core), which increases the scenarios where it can be used.
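
The job-level tweaking this reviewer describes is typically done through Spark configuration properties set when building the session. A hedged sketch; the values shown are illustrative, not recommendations:

    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("tuned-job")
        .config("spark.executor.memory", "4g")           # per-executor heap
        .config("spark.sql.shuffle.partitions", "64")    # shuffle parallelism
        .config("spark.serializer",
                "org.apache.spark.serializer.KryoSerializer")
        .getOrCreate()
    )

    # Settings can also be inspected at runtime:
    print(spark.conf.get("spark.sql.shuffle.partitions"))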
Support Rating
Apache Airflow
No answers on this topic
Apache Spark
1. It integrates very well with Scala or Python.
2. Its SQL interoperability is very easy to understand.
3. Apache Spark is way faster than the other competitive technologies.
4. The support from the Apache community is very large for Spark.
5. Execution times are faster compared to others.
6. There are a large number of forums available for Apache Spark.
7. Apache Spark's code is simple and easy to gain access to.
8. Many organizations use Apache Spark, so many solutions are available for existing applications.
Alternatives Considered
Apache Airflow
There are a number of reasons to choose Apache Airflow over similar platforms. Integrations: ready-to-use operators allow you to integrate Airflow with cloud platforms (Google, AWS, Azure, etc.). Apache Airflow helps with backups and other DevOps tasks, such as submitting a Spark job and storing the resulting data on a Hadoop cluster (see the sketch below). It also supports machine learning model training, such as triggering a SageMaker job.
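
A hedged sketch of the Spark-job submission just mentioned, assuming the apache-airflow-providers-apache-spark package is installed and a spark_default connection is configured; the application path is hypothetical:

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

    with DAG(
        dag_id="spark_submit_example",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        SparkSubmitOperator(
            task_id="run_spark_job",
            application="/jobs/etl_job.py",   # hypothetical PySpark script
            conn_id="spark_default",
        )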
Apache Spark
Compared to similar technologies, Spark ends up being a one-stop shop. You can achieve so much with this one framework instead of having to stitch and weave multiple technologies from the Hadoop stack, all while getting incredible performance, minimal boilerplate, and the ability to write your application in the language of your choosing.
Return on Investment
Apache Airflow
  • A lot of helpful features out of the box, such as the DAG visualizations and task trees
  • Allowed us to implement complex data pipelines easily and at a relatively low cost
Apache Spark
  • Business leaders are able to make data-driven decisions
  • Business users are able to access data in near real time now; before using Spark, they had to wait at least 24 hours for data to be available
  • The business is able to come up with new product ideas