What users are saying about Apache Sqoop vs. Databricks Lakehouse Platform

Apache Sqoop
4 Ratings
Score 8.8 out of 100 (trScore)
Based on 4 reviews and ratings

Databricks Lakehouse Platform
24 Ratings
Score 8.5 out of 100 (trScore)
Based on 24 reviews and ratings
Likelihood to Recommend
Apache Sqoop
Sqoop is great for moving data between a JDBC-compliant database and a Hadoop environment. It is built for those who need a few simple CLI options to import a selection of database tables into Hadoop, run large-scale analysis that could not easily be done in that database system due to resource constraints, and then export the results back into that database (or another one). Sqoop falls short when some extra, customized processing is needed between the database extract and the Hadoop load, in which case Apache Spark's JDBC utilities might be preferred.
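For the case the reviewer mentions, where custom processing is needed between the database extract and the Hadoop load, a minimal PySpark sketch of the Spark JDBC approach might look like the following. The connection URL, credentials, table name, and output path are placeholders, not details from the review.

```python
# Hypothetical PySpark job: JDBC extract -> custom transform -> Parquet load into Hadoop.
# All connection details, table names, and paths below are made-up placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("jdbc-extract-transform-load").getOrCreate()

# Extract: read a relational table over JDBC (roughly what a basic `sqoop import` does).
orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:postgresql://db-host:5432/sales")  # placeholder JDBC URL
    .option("dbtable", "public.orders")
    .option("user", "etl_user")
    .option("password", "********")
    .load()
)

# Custom processing between extract and load -- the step Sqoop alone does not offer.
cleaned = (
    orders
    .filter(F.col("status") == "COMPLETE")
    .withColumn("order_month", F.date_trunc("month", F.col("order_ts")))
)

# Load: land the result in the Hadoop environment as Parquet.
cleaned.write.mode("overwrite").parquet("hdfs:///data/warehouse/orders_clean")
```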
Consultant
Avalon Consulting, LLC (Information Technology and Services, 51-200 employees)
Databricks Lakehouse Platform
Databricks has helped my teams write PySpark and Spark SQL jobs and test them out before formally integrating them into Spark jobs. Through Databricks we can create Parquet and JSON output files. Data modelers and scientists who are less comfortable with coding can get good insight into the data using notebooks developed by the engineers.
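As an illustration of the notebook workflow described above, a small PySpark / Spark SQL sketch that produces Parquet and JSON outputs might look like this. The table and path names are assumptions, not details from the review.

```python
# Hypothetical Databricks notebook cell: aggregate a table and write Parquet and JSON outputs.
# Source table and output paths are placeholders.
from pyspark.sql import functions as F

# On Databricks, a SparkSession named `spark` is already provided in the notebook.
events = spark.table("raw.events")  # assumed source table

daily_summary = (
    events.groupBy(F.to_date("event_ts").alias("event_date"), "event_type")
          .agg(F.count("*").alias("event_count"))
)

# The same aggregation expressed in Spark SQL, handy for data modelers who prefer SQL.
daily_summary_sql = spark.sql("""
    SELECT to_date(event_ts) AS event_date, event_type, COUNT(*) AS event_count
    FROM raw.events
    GROUP BY to_date(event_ts), event_type
""")

# Parquet for downstream jobs, JSON for consumers that want a self-describing format.
daily_summary.write.mode("overwrite").parquet("/mnt/output/daily_summary_parquet")
daily_summary.write.mode("overwrite").json("/mnt/output/daily_summary_json")
```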

Verified User
Team Lead in Engineering
Financial Services Company, 10,001+ employees

Feature Rating Comparison
| Feature | Apache Sqoop | Databricks Lakehouse Platform |
|---|---|---|
| Platform Connectivity (category) | — | 8.3 |
| Connect to Multiple Data Sources | — | 9.0 |
| Extend Existing Data Sources | — | 9.0 |
| Automatic Data Format Detection | — | 7.0 |
| Data Exploration (category) | — | 6.0 |
| Visualization | — | 6.0 |
| Interactive Data Analysis | — | 6.0 |
| Data Preparation (category) | — | 8.0 |
| Interactive Data Cleaning and Enrichment | — | 8.0 |
| Data Transformations | — | 9.0 |
| Data Encryption | — | 7.0 |
| Built-in Processors | — | 8.0 |
| Platform Data Modeling (category) | — | 8.3 |
| Multiple Model Development Languages and Tools | — | 9.0 |
| Automated Machine Learning | — | 8.0 |
| Single platform for multiple model development | — | 9.0 |
| Self-Service Model Delivery | — | 7.0 |
| Model Deployment (category) | — | 7.5 |
| Flexible Model Publishing Options | — | 7.0 |
| Security, Governance, and Cost Controls | — | 8.0 |

(— = no ratings for that feature)
Pros
Apache Sqoop
- Provides generalized JDBC extensions to migrate data between most database systems
- Generates Java classes upon reading database records for use in other code utilizing Hadoop's client libraries
- Allows for both import and export features (see the invocation sketch after this list)
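A minimal sketch of the import/export round trip these bullets describe, driven from Python via subprocess. The JDBC URL, credentials, tables, and HDFS paths are placeholders, and the sqoop CLI is assumed to be installed and on the PATH.

```python
# Hypothetical wrapper that invokes the Sqoop CLI for an import and a matching export.
# All connection details, tables, and HDFS paths below are placeholders.
import subprocess

JDBC_URL = "jdbc:mysql://db-host:3306/sales"  # placeholder connection string

def run(cmd):
    print(" ".join(cmd))
    subprocess.run(cmd, check=True)

# Import a relational table into HDFS (Sqoop also generates a Java class for the records).
run([
    "sqoop", "import",
    "--connect", JDBC_URL,
    "--username", "etl_user",
    "--password-file", "/user/etl/.db_password",
    "--table", "orders",
    "--target-dir", "/data/raw/orders",
    "--num-mappers", "4",
])

# Export analysis results from HDFS back into a database table.
run([
    "sqoop", "export",
    "--connect", JDBC_URL,
    "--username", "etl_user",
    "--password-file", "/user/etl/.db_password",
    "--table", "order_summary",
    "--export-dir", "/data/derived/order_summary",
])
```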
Consultant
Avalon Consulting, LLC (Information Technology and Services, 51-200 employees)
Databricks Lakehouse Platform
- Extremely flexible across different data scenarios
- Fantastic performance
- Databricks is always updating the platform, so we always have the latest features.

Verified User
Director in Information Technology
Financial Services Company, 201-500 employees

Cons
Apache Sqoop
- Sqoop2 development seems to have stalled. I have set it up outside of a Cloudera CDH installation, and I actually prefer its "Sqoop Server" model to the CLI-only client that is Sqoop1. This works especially well in a microservices environment, where there is only one place to maintain the JDBC drivers used by Sqoop.
Consultant
Avalon Consulting, LLC (Information Technology and Services, 51-200 employees)
Databricks Lakehouse Platform
- The navigation for creating a workspace is a bit confusing at first. It takes a couple of minutes to figure out how to create a folder and upload files, since it is not the same as traditional file systems such as box.com.
- Also, when you create a table, if you forget to copy the link where the table is stored, it is hard to relocate it. Most of the time I would have to delete the table and re-create it.
Freelance Translator
ZOO Digital Group plc (Entertainment, 501-1000 employees)
Usability
Apache Sqoop
No score; no answers on this topic
Databricks Lakehouse Platform
Score 9.0
Based on 1 answer
This has been very useful in my organization for shared notebooks, integrated data pipeline automation, and data source integrations. Integration with AWS is seamless. Non-technical users can easily learn how to use Databricks. You can connect your company LDAP to it for login-based access controls, to some extent.

Verified User
Team Lead in Engineering
Financial Services Company, 10,001+ employees

Alternatives Considered
Apache Sqoop
- Sqoop comes preinstalled on the major Hadoop vendor distributions as the recommended product to import data from relational databases. The ability to extend it with additional JDBC drivers makes it very flexible for the environment it is installed within.
- Spark also has a useful JDBC reader; it can manipulate data in more ways than Sqoop and can load into many other systems besides Hadoop.
- Kafka Connect JDBC is more for streaming database updates using tools such as Oracle GoldenGate or Debezium.
- StreamSets and Apache NiFi both provide a more "flow-based programming" approach to graphically laying out connectors between various systems, including JDBC and Hadoop.
Consultant
Avalon Consulting, LLC (Information Technology and Services, 51-200 employees)
Databricks Lakehouse Platform
Easier to set up and get started. Less of a learning curve.

Verified User
Director in Engineering
Financial Services Company, 10,001+ employees

Return on Investment
Apache Sqoop
- When combined with Cloudera's HUE, it can enable non-technical users to easily import relational data into Hadoop.
- Being able to manipulate large datasets in Hadoop, and then load them into a kind of "materialized view" in an external database system, has yielded great insights into the Hadoop data lake without continuously running large batch jobs.
- Sqoop isn't very user-friendly for those uncomfortable with a CLI.
Consultant
Avalon Consulting, LLC (Information Technology and Services, 51-200 employees)
Databricks Lakehouse Platform
- Rapid growth of analytics within our company.
- Cost model aligns with usage, allowing us to make a reasonable initial investment and scale the cost as we realize the value.
- Platform is easy to learn and Databricks provides excellent support and training.
- Platform does not require a large DevOps investment.

Verified User
Strategist in Engineering
Computer Hardware Company, 10,001+ employees

Pricing Details
Apache Sqoop

General
- Free Trial: —
- Free/Freemium Version: —
- Premium Consulting/Integration Services: —
- Entry-level setup fee: No

Apache Sqoop Editions & Modules: —
Additional Pricing Details: —

Databricks Lakehouse Platform

General
- Free Trial: —
- Free/Freemium Version: —
- Premium Consulting/Integration Services: —
- Entry-level setup fee: No
Databricks Lakehouse Platform Editions & Modules

| Edition | Price |
|---|---|
| Standard | $0.07 per DBU |
| Premium | $0.10 per DBU |
| Enterprise | $0.13 per DBU |
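To make the per-DBU pricing model concrete, here is a rough, hypothetical cost illustration; only the listed rate comes from the table above, and the consumption figures are assumed.

```python
# Hypothetical back-of-the-envelope DBU cost estimate (consumption numbers are assumed).
premium_rate_per_dbu = 0.10   # USD, from the Premium edition row above
dbus_per_hour = 4             # assumed DBU consumption of a small cluster
hours_per_month = 160         # assumed monthly runtime

monthly_cost = premium_rate_per_dbu * dbus_per_hour * hours_per_month
print(f"Estimated monthly cost: ${monthly_cost:.2f}")  # Estimated monthly cost: $64.00
```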