Positive review for Databricks Lakehouse Platform
August 13, 2021

Anonymous | TrustRadius Reviewer
Score 9 out of 10
Vetted Review
Verified User

Overall Satisfaction with Databricks Lakehouse Platform (Unified Analytics Platform)

We currently use the Databricks Lakehouse Platform for a client. My team uses it for data mining and for creating reports and analytics for the client. Depending on where the data is stored, various analytics teams in my company use different platforms: GCP, AWS, Databricks, etc.
Pros:
  • Scheduling jobs to automate queries
  • User-friendly: a new user can easily navigate SQL/Python queries
  • Options to code in multiple languages (SQL, Python, Scala, R), with easy switching via the % magic commands

Cons:
  • Errors can be difficult to understand at times
  • The session resets automatically at times, which wipes temporary tables from memory
  • Git connections are dicey
  • Very inconsistent with job success/failure notification emails

Key benefits:
  • Ability to schedule jobs
  • Ability to switch between different languages in one script
  • Ability to store permanent tables on data lakes that can be called up at a later date
  • Helps with client deliverables and automated report creation without manually running a script every time
  • Supports BI dashboards for engineering teams
  • Working with multiple datasets is easy
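To illustrate the multi-language point, here is a minimal Databricks notebook sketch. It assumes the notebook-provided `spark` SparkSession, and the table name is hypothetical; `%sql`, `%scala`, `%r`, and `%python` are Databricks notebook magic commands, not plain Python.

```python
# Databricks notebook sketch (assumes the notebook-provided `spark`
# SparkSession; the table name is hypothetical).
# This cell runs in the notebook's default language, Python:
df = spark.table("analytics.monthly_sales")
df.show(5)

# A separate cell can switch languages by starting with a magic
# command on its first line, e.g.:
#   %sql
#   SELECT region, SUM(amount) AS total
#   FROM analytics.monthly_sales
#   GROUP BY region
#
# %scala, %r, and %python switch a cell to those languages in the
# same way, so one notebook can mix all four.
```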

Do you think Databricks Lakehouse Platform delivers good value for the price?


Are you happy with Databricks Lakehouse Platform's feature set?


Did Databricks Lakehouse Platform live up to sales and marketing promises?

I wasn't involved with the selection/purchase process

Did implementation of Databricks Lakehouse Platform go as expected?

I wasn't involved with the implementation phase

Would you buy Databricks Lakehouse Platform again?


Databricks is great for beginners as well as advanced coders. The interface is extremely user-friendly and the learning curve is quite short. It is well suited for automation, where we can have scripts running late at night when the load is lighter and wake up to an email notification of success or failure. It is also well suited for writing code that requires the use of multiple languages (as in some cases of data modeling).

The ability to store temporary/permanent tables on data lakes is a fabulous feature as well. PySpark is an excellent language to learn and it works really fast with large datasets.
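A rough sketch of the PySpark pattern described above, assuming a Databricks notebook with its pre-provisioned `spark` session; the paths, table, and column names here are hypothetical:

```python
# Read a large dataset from the data lake (hypothetical path).
orders = spark.read.parquet("/mnt/datalake/orders")

# PySpark aggregations distribute across the cluster, which is why
# they stay fast on large datasets.
summary = (
    orders
    .groupBy("customer_id")
    .agg({"order_total": "sum"})
)

# Persist the result as a permanent table so a later session (or a
# scheduled job) can pick it up without recomputing:
summary.write.mode("overwrite").saveAsTable("analytics.customer_summary")
```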