Databricks Lakehouse Platform

Score 8.7 out of 10
Databricks Lakehouse Platform (Unified Analytics Platform)

Overview

Recent Reviews

Best in the industry

9
February 08, 2022
This product is used for Data Science project development, from data analysis/wrangling to feature creation, to training, to finetuning …

Data for insights

7
July 12, 2021
[Databricks Lakehouse Platform (Unified Analytics Platform) is] used by a few departments to start off with data warehousing. SQL …

Databricks--a good all-rounder

9
May 28, 2021
We use Databricks Lakehouse Platform (Unified Analytics Platform) in our ETL process (data loading) to perform transformations and to …

Databricks for modern day ETL

9
January 31, 2019
Data from APIs is streamed into our One Lake environment. This one lake is S3 on AWS.
Once this raw data is on S3, we use Databricks to …

Databricks Review

9
August 22, 2018
We leverage Databricks (DB) to run Big Data workloads. Primarily we build a JAR and attach it to DB. We do not leverage the notebooks except …

Databricks Review

6
September 15, 2017
Across the whole organization.

[It's] Used by self-service analysts to quickly do analysis

Pricing

  • Standard: $0.07 per DBU (Cloud)
  • Premium: $0.10 per DBU (Cloud)
  • Enterprise: $0.13 per DBU (Cloud)

Entry-level setup fee?

  • No setup fee

Offerings

  • Free Trial
  • Free/Freemium Version
  • Premium Consulting / Integration Services
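
For a rough sense of what per-DBU pricing means in practice, here is a back-of-the-envelope estimate built from the list prices above. The workload figures (DBUs per hour, hours per day, days per month) are hypothetical, and DBU charges cover Databricks compute only, not the underlying cloud infrastructure.

```python
# Back-of-the-envelope monthly cost estimate using the list prices above.
# The workload (10 DBUs/hour, 8 hours/day, 22 days/month) is a made-up example.
PRICE_PER_DBU = {"Standard": 0.07, "Premium": 0.10, "Enterprise": 0.13}

dbus_per_hour = 10
hours_per_day = 8
days_per_month = 22

for tier, price in PRICE_PER_DBU.items():
    monthly_cost = price * dbus_per_hour * hours_per_day * days_per_month
    print(f"{tier}: ~${monthly_cost:,.2f} per month")
# Standard: ~$123.20, Premium: ~$176.00, Enterprise: ~$228.80 per month
```
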

Features Scorecard

No scorecards have been submitted for this product yet.

Product Details

What is Databricks Lakehouse Platform?

Databricks in San Francisco offers the Databricks Lakehouse Platform (formerly the Unified Analytics Platform), a data science platform and Apache Spark cluster manager. The Databricks Unified Data Service aims to provide a reliable and scalable platform for data pipelines, data lakes, and data platforms. Users can manage the full data journey: ingesting, processing, storing, and exposing data throughout an organization. Its Data Science Workspace is a collaborative environment for practitioners to run all analytic processes in one place and manage ML models across the full lifecycle. The Machine Learning Runtime (MLR) provides data scientists and ML practitioners with scalable clusters that include popular frameworks, built-in AutoML, and optimizations.

Databricks Lakehouse Platform Technical Details

Deployment Types: SaaS
Operating Systems: Unspecified
Mobile Application: No


Frequently Asked Questions

What is Databricks Lakehouse Platform?

Databricks Lakehouse Platform (formerly the Unified Analytics Platform) is a data science platform and Apache Spark cluster manager from Databricks in San Francisco; see the full description under Product Details above.

What is Databricks Lakehouse Platform's best feature?

Reviewers rate Usability highest, with a score of 9.

Who uses Databricks Lakehouse Platform?

The most common users of Databricks Lakehouse Platform are from Enterprises (1,001+ employees) and the Information Technology & Services industry.

Reviews

(1-15 of 15)
Score 8 out of 10
Vetted Review
Verified User
Review Source
We used the Databricks Lakehouse Platform for running all our machine learning workloads as well as for storing large amounts of data in our data lake backend. The data stored in the Databricks lakehouse was used to train state-of-the-art ML and deep learning models on text and image datasets. Databricks' Spark jobs, as well as the Delta Lake lakehouse backend, are well equipped for these kinds of tasks.
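
As an illustration of the workflow this reviewer describes, the minimal sketch below reads a Delta table from the lakehouse and prepares it for model training. The table and column names are hypothetical, and `spark` is the SparkSession that Databricks notebooks provide automatically.

```python
# Minimal sketch (assumptions: a Delta table named "image_captions" exists in the
# lakehouse, and the code runs in a Databricks notebook where `spark` is predefined).
from pyspark.sql import functions as F

# Read the curated Delta table that backs the lakehouse.
captions = spark.read.table("image_captions")

# Basic wrangling before handing the data to an ML framework.
train_df = (
    captions
    .filter(F.col("label").isNotNull())          # drop unlabeled rows
    .select("image_path", "caption", "label")    # keep only model inputs
)

# Collect a manageable sample to the driver for single-node training,
# or keep it distributed for Spark ML-style training.
train_pdf = train_df.limit(100_000).toPandas()
print(train_pdf.head())
```
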
February 08, 2022

Best in the industry

Jonatan Bouchard | TrustRadius Reviewer
Score 9 out of 10
Vetted Review
Verified User
Review Source
This product is used for Data Science project development, from data analysis/wrangling to feature creation, training, fine-tuning, model test and validation, and finally deployment. While Databricks is used by many users, we also use GitHub and code QA to promote code to production. One of the advantages of Databricks is its integration: not only Git, but whether you use it on Azure or AWS, you can also leverage the machine learning integrated into those platforms, such as AutoML or Azure ML.
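
Databricks bundles MLflow in its ML runtime, which is one way to track the train/tune/validate/deploy cycle this reviewer describes. The sketch below is a generic example rather than the reviewer's actual pipeline; the dataset, model, and metric are placeholders.

```python
# Hedged sketch: tracking one training run with MLflow (bundled with the
# Databricks ML runtime). Dataset, model, and metric choices are illustrative only.
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run(run_name="baseline-logreg"):
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))

    mlflow.log_param("max_iter", 1000)        # hyperparameters for later comparison
    mlflow.log_metric("accuracy", acc)        # validation metric
    mlflow.sklearn.log_model(model, "model")  # artifact that can be promoted onward
```
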
Score 8 out of 10
Vetted Review
Verified User
Review Source
It is currently used by our Data and Product teams to perform deep-dive analyses of how our current metrics (KPIs, OKRs) are performing and to develop tools for metric prediction based on data models in languages such as SQL and Python, mixing the two and giving the entire company visibility into the results through graphs in shared workspaces.
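
A minimal sketch of the SQL + Python mix described above, as it might look in a Databricks notebook; the `kpi_daily` table and its columns are invented for illustration, and `spark` is the notebook's built-in SparkSession.

```python
# SQL does the aggregation, Python handles the charting shared in the workspace.
import matplotlib.pyplot as plt

kpis = spark.sql("""
    SELECT date, SUM(signups) AS signups, SUM(revenue) AS revenue
    FROM kpi_daily
    GROUP BY date
    ORDER BY date
""")

pdf = kpis.toPandas()
plt.plot(pdf["date"], pdf["signups"], label="signups")
plt.plot(pdf["date"], pdf["revenue"], label="revenue")
plt.legend()
plt.title("Daily KPIs")
plt.show()   # in a Databricks notebook, display(kpis) would also render a chart
```
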
Score 9 out of 10
Vetted Review
Verified User
Review Source
We currently use the Databricks Lakehouse Platform for a client. My team specifically uses it to mine data and to create reports and analytics for the client. Depending on where the data is stored, various analytics teams in my company use different platforms - GCP, AWS, Databricks, etc.
Score 10 out of 10
Vetted Review
Verified User
Review Source
We use Databricks to replace traditional RDBMSs like Oracle. We have big batch ETL, ingestion, and extraction jobs for big data running across different products, where we leverage the Lakehouse platform to land our raw data in the data lake and build a Delta Lake layer on top of high-performing Parquet.
It is proposed for use across the whole organization and different BUs; Databricks will be our key virtualized platform.
It addresses very fast data ingestion and reduces the overall ETL window. It integrates different data sources and also lets Machine Learning jobs run and scale. The idea is to reduce overall computation time and save on-prem cost.
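
A minimal sketch of the ingestion pattern this reviewer describes: land raw files in the data lake, then build a Parquet-backed Delta table on top. The paths, schema, and `order_date` partition column are hypothetical, and `spark` is the Databricks-provided SparkSession.

```python
# Raw batch extract from a source system lands in the data lake as CSV.
raw = (
    spark.read
    .option("header", "true")
    .csv("s3://raw-zone/orders/")
)

# Write as Delta (Parquet files plus a transaction log) to create the lakehouse layer.
(
    raw.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("order_date")      # assumes an order_date column in the raw data
    .save("s3://delta-zone/orders/")
)

# Register the table so downstream ETL and BI jobs can query it by name.
spark.sql("CREATE TABLE IF NOT EXISTS orders USING DELTA LOCATION 's3://delta-zone/orders/'")
```
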
Surendranatha Reddy Chappidi | TrustRadius Reviewer
Score 9 out of 10
Vetted Review
Verified User
Review Source
The Databricks Lakehouse Platform is used across all departments in my current organization.
It is used to solve various data engineering and data analytics use cases across different teams.
The platform provides seamless integration with the Azure cloud at Maersk, and it uses Spark, MLOps, and Delta for solving current big data engineering problems.
Score 9 out of 10
Vetted Review
Verified User
Review Source
We use Databricks Lakehouse Platform (Unified Analytics Platform) in our ETL process (data loading) to perform transformations and to implement the toughest loading strategies on huge datasets. It is very easy to understand, and it can connect to almost all modern data formats, like Avro, Parquet, and JSON. It supports almost every popular cloud platform, like Azure and AWS, and offers better performance in terms of data processing speed.
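
A minimal sketch of the multi-format connectivity and loading strategies mentioned above; the file paths, table names, and join key are invented for illustration, and `spark` is the Databricks-provided SparkSession.

```python
# Read the same platform's native connectors for JSON, Parquet, and Avro.
events_json    = spark.read.json("/mnt/landing/events/")
clicks_parquet = spark.read.parquet("/mnt/landing/clicks/")
users_avro     = spark.read.format("avro").load("/mnt/landing/users/")

# A typical "tough" loading strategy: upsert the day's data into a Delta target.
from delta.tables import DeltaTable

target = DeltaTable.forName(spark, "analytics.clicks")   # assumes this Delta table exists
(
    target.alias("t")
    .merge(clicks_parquet.alias("s"), "t.click_id = s.click_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```
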
Score 8 out of 10
Vetted Review
Verified User
Review Source
We use Databricks Lakehouse Platform to transform IoT data and build data models for BI tools. It is being used by engineering and IT teams. We use it with a data lake platform, reading the raw data and transforming it into a suitable format for analytics tools. We run daily/hourly jobs to create BI models and save the resulting models back to the data lake or SQL tables.
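
A sketch of an hourly/daily BI modelling job like the one described above; the table paths, columns, and JDBC target are all hypothetical, and `spark` is the Databricks-provided SparkSession.

```python
from pyspark.sql import functions as F

# Read raw IoT readings from the data lake.
readings = spark.read.format("delta").load("/mnt/lake/raw/iot_readings")

# Shape the data into a BI-friendly model (hourly averages per device).
hourly = (
    readings
    .groupBy("device_id", F.date_trunc("hour", "event_time").alias("hour"))
    .agg(F.avg("temperature").alias("avg_temp"),
         F.max("vibration").alias("max_vibration"))
)

# Persist the model back to the lake as Delta...
hourly.write.format("delta").mode("overwrite").save("/mnt/lake/models/iot_hourly")

# ...or to a SQL database for BI tools that read over JDBC.
(
    hourly.write
    .format("jdbc")
    .option("url", "jdbc:sqlserver://bi-db:1433;database=analytics")
    .option("dbtable", "dbo.iot_hourly")
    .option("user", "etl_user")
    .option("password", "***")   # use a Databricks secret scope in practice
    .mode("overwrite")
    .save()
)
```
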
Score 9 out of 10
Vetted Review
Verified User
Review Source
Data from APIs is streamed into our One Lake environment; this One Lake is S3 on AWS.
Once this raw data is on S3, we use Databricks to write Spark SQL queries and PySpark to process it into relational tables and views.

Those views are then used by our data scientists and modelers to generate business value in a lot of places, like creating new models, creating new audit files, exports, etc.
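
A minimal sketch of the S3-to-views flow described above; the bucket names, payload schema, database, and view names are invented for illustration, and `spark` is the Databricks-provided SparkSession.

```python
# Raw API payloads landed in the "one lake" S3 bucket as JSON.
raw_events = spark.read.json("s3://one-lake/raw/api_events/")

# PySpark flattens the payload into a relational shape (schema is hypothetical).
events = raw_events.selectExpr(
    "payload.user_id AS user_id",
    "payload.event_type AS event_type",
    "CAST(payload.ts AS timestamp) AS event_time",
)

# Persist as a table, then expose a view for data scientists and modelers.
events.write.format("delta").mode("overwrite").saveAsTable("analytics.api_events")

spark.sql("""
    CREATE OR REPLACE VIEW analytics.daily_active_users AS
    SELECT DATE(event_time) AS day, COUNT(DISTINCT user_id) AS dau
    FROM analytics.api_events
    GROUP BY DATE(event_time)
""")
```
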
Ann Le | TrustRadius Reviewer
Score 7 out of 10
Vetted Review
Verified User
Review Source
I actually use Databricks for experiments and research for my master's program. I mostly use it to implement Python code and test the viability of the programs I write. Many individuals in the Computer Information Systems department use this software platform to implement programs. It is a good tool for us to learn [and] it includes a community forum that is rather helpful if you are self-learning and have questions.