Overview
What is Databricks Data Intelligence Platform?
Databricks, headquartered in San Francisco, offers the Databricks Lakehouse Platform (formerly the Unified Analytics Platform), a data science platform and Apache Spark cluster manager. The Databricks Unified Data Service aims to provide a reliable and scalable platform for data pipelines, data…
Pricing
| Plan | Price |
| --- | --- |
| Standard | $0.07 |
| Premium | $0.10 |
| Enterprise | $0.13 |
Entry-level setup fee?
- No setup fee
Offerings
- Free Trial
- Free/Freemium Version
- Premium Consulting/Integration Services
Product Details
- About
- Tech Details
- FAQs
Databricks Data Intelligence Platform Technical Details
| Deployment Types | Software as a Service (SaaS), Cloud, or Web-Based |
| --- | --- |
| Operating Systems | Unspecified |
| Mobile Application | No |
Reviews and Ratings (75)

Community Insights
- Business Problems Solved
- Pros
- Cons
- Recommendations
The Databricks Lakehouse Platform, formerly known as the Unified Analytics Platform, has been widely used across departments to address a range of data engineering and analytics challenges. Users have leveraged the platform for data warehousing, SQL analytics, real-time monitoring, and data governance. Its versatility and openness have allowed users to save a significant amount of time and to manage cloud costs and human resources effectively.
Customers have utilized the Databricks Lakehouse Platform for various use cases, including creating dashboards with tools like Tableau, Redash, and Qlik, as well as integrating with CRM systems like Salesforce and SAP. The platform has also been employed for developing chatbots in Knowledge Management and serving machine learning models behind API endpoints. Furthermore, it is extensively used for data science project development, facilitating tasks such as data analysis, wrangling, feature creation, training, model testing, validation, and deployment.
Databricks' integration capabilities, including Git integration and integration with Azure or AWS, enable users to leverage the power of integrated machine learning features. Additionally, the platform's reliability and excellent technical support make it a preferred choice for building data pipelines and solving big data engineering problems. It is widely used by engineering and IT teams to transform IoT data, build data models for business intelligence tools, and run daily/hourly jobs to create BI models.
Moreover, the Databricks Lakehouse Platform serves as an invaluable learning tool for individuals in the Computer Information System department, and the community forum proves particularly helpful for self-learners with questions. The platform also supports deep-dive analysis on metrics by Data and Product teams, facilitates client reporting and analytics through data mining capabilities, and replaces traditional RDBMSs such as Oracle for big batch ETL jobs on large data sets.
In summary, the Databricks Lakehouse Platform is employed across organizations to solve a variety of data engineering and analytics use cases. Its seamless integration with cloud platforms, support for different data formats, and scalability make it suitable for tasks such as data ingestion and cleansing, interactive analysis, and development of analytic services.
User-Friendly SQL: Users have found the SQL in Databricks to be user-friendly, allowing them to easily write and execute queries. Several reviewers have praised the intuitive nature of the SQL interface, making it accessible for users of different skill levels.
Enhanced Collaboration: The enhanced collaboration between data science and data engineering teams is seen as a positive feature by many users. They appreciate how Databricks facilitates seamless communication and knowledge sharing among team members, ultimately leading to improved productivity and efficiency.
Versatile Integration: The integration with multiple Git providers and the merge assistant is highly valued by users. This feature allows for smooth version control and simplifies the collaborative development process. With this capability, developers can easily manage their codebase, track changes, resolve conflicts, and ensure a streamlined workflow.
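As an illustration of the kind of ad-hoc notebook SQL reviewers describe as user-friendly (the schema, table, and column names below are hypothetical, not taken from any reviewer's environment):

```sql
-- Hypothetical ad-hoc query run from a notebook SQL cell
SELECT region, SUM(revenue) AS total_revenue
FROM sales.orders
WHERE order_date >= '2023-01-01'
GROUP BY region
ORDER BY total_revenue DESC;
```

Queries like this can be written and executed directly in a notebook cell, which is the experience the reviews above are praising.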
Confusing Workspace Navigation: Several users have found the navigation to create a workspace in the Databricks Lakehouse Platform confusing and time-consuming, hindering their productivity. They have expressed frustration over the complex steps involved, resulting in wasted time.
Difficulty Locating Tables: Many reviewers have expressed difficulty in locating tables after they were created, often leading to the need for deletion and recreation. This issue has caused frustration and wasted time for users who struggle to find their data within the platform.
Random Task Failures: Some users have experienced random task failures while using the platform, making it challenging for them to debug and profile code effectively. These unexpected failures undermine confidence in the system's stability and result in delays as users attempt to identify and fix these issues.
Users highly recommend the Lakehouse platform for various data-related tasks, such as building cloud-native lakehouse platforms, ingesting and transforming big data batches/streams, and implementing medallion lakehouse architectures. They find the platform simple to use and appreciate its hassle-free administration and maintenance.
The Lakehouse platform is also highly recommended for setting up Hadoop clusters and dealing with big data, analytics, and machine learning workflows. Users believe that it provides a comprehensive and open solution for these tasks.
Users suggest exploring the platform's features, such as Partner Connect and its advanced analytics, MLOps, data science, and AutoML capabilities. They find these features useful and believe they enhance the platform's core functionality.
Overall, users highly recommend the Lakehouse platform for its ease of use, support for the major cloud providers (AWS, Azure, GCP), and useful features like data sharing (Delta Sharing). However, they also recommend weighing the level of reliance on proprietary technology against industry standards like Spark, SQL, and dbt. It is advised to read through the documentation and gather firsthand experiences from people who have used the Lakehouse platform.
Attribute Ratings
Reviews (1-4 of 4)

Most collaborative Data Science & AI workspace!
- Creating dashboards with Tableau, Redash, and Qlik
- Feeding CRM tools like Salesforce and SAP
- Developing chatbots for Knowledge Management
- Serving ML models behind API endpoints
Databricks Lakehouse Platform is a versatile and open product that saves us a lot of time and helps us control cloud costs and human effort!
- Enhanced Data Science & Data Engineering collaboration
- Complete Infrastructure-as-code Terraform provider
- Very easy streaming capabilities
- Multiple Git providers integration with merge assistant
- VS Code IDE support for local development
- Python SDK for Workflows
- Poetry support
It would be less appropriate for very small data projects, as the entry cost may be high. Yet, if the data is meant to grow, Databricks will scale horizontally without requiring a rewrite of your codebase.
- Unity Catalog
- Collaborative Spark Notebook supporting python, SQL, Scala, R
- Serverless Endpoints
- mlflow integration
- Data Science environment is ready in a matter of minutes, not days.
- Much better cost control
- Easy onboarding for all clouds
It works out-of-the-box but still allows you intricate customisation of the environment.
I find Databricks very flexible and resilient at the same time while Synapse and Snowflake feel more limited in terms of configuration and connectivity to external tools.
Data for insights
- SQL
- User friendly
- Great development environment
- Errors are not explained
- No data backup feature
- Interface can be more intuitive
- Data Warehouse
- Spark computations
- Allows SQL, Scala and R to collaborate on notebooks
- A comprehensive data warehouse for transactions and calculations.
- Cost effective on just using one tool that does most we ask for.
- Fast business insights with data availability
Databricks for modern day ETL
Once this raw data is on S3, we use Databricks to write Spark SQL queries and PySpark to process it into relational tables and views.
Those views are then used by our data scientists and modelers to generate business value in many places, such as creating new models, creating new audit files, and exports.
- Process raw data in One Lake (S3) env to relational tables and views
- Share notebooks with our business analysts so that they can use the queries and generate value out of the data
- Try out PySpark and Spark SQL queries on raw data before using them in our Spark jobs
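The raw-to-views flow this reviewer describes can be sketched in Spark SQL; all bucket, table, view, and column names below are hypothetical illustrations, not the reviewer's actual objects:

```sql
-- Hypothetical: materialize raw JSON landed in S3 as a Delta table
CREATE TABLE IF NOT EXISTS curated.events
USING DELTA
AS SELECT
  CAST(event_time AS TIMESTAMP) AS event_time,
  user_id,
  event_type
FROM json.`s3://example-bucket/raw/events/`;

-- Hypothetical: expose a relational view for analysts and modelers
CREATE OR REPLACE VIEW curated.daily_events AS
SELECT DATE(event_time) AS day, COUNT(*) AS events
FROM curated.events
GROUP BY DATE(event_time);
```

The same statements could equally be run from a shared notebook, which matches the reviewer's point about handing queries to business analysts.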
- Modern-day ETL operations made easy using Databricks, with access mechanisms provided for different sets of customers
- Databricks should come with a fine-grained access control mechanism: if I have tables or views created, it should be able to restrict access to certain tables or columns based on the logged-in user
- Improved graphing and dashboarding should be provided from within Databricks
- Better integration with AWS could help me code jobs in Databricks and run them in AWS EMR more easily using better DevOps pipelines
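The fine-grained access control this reviewer asks for is the kind of gap Unity Catalog's SQL grants are meant to address; a hedged sketch (the catalog, schema, table, and group names are hypothetical):

```sql
-- Hypothetical: allow the analysts group to query a curated table
GRANT SELECT ON TABLE main.sales.orders TO `analysts`;

-- Hypothetical: keep a sensitive table restricted from the same group
REVOKE ALL PRIVILEGES ON TABLE main.sales.customers_pii FROM `analysts`;
```

Grants apply per securable object, so table- and view-level restrictions tied to the logged-in user's group membership are exactly the mechanism being requested here.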
- ROI for us has been tremendous. Time to market by processing raw data in our big data infrastructure has been pretty fast.
- Non engineers can easily use Databricks, hence helping business customers.
- Thousands of different data combinations can easily be joined and used by our data teams.
We could have used AWS products; however, Databricks notebooks and the ability to launch clusters directly from notebooks were seen as very helpful for non-technical users.
You can connect your company LDAP to it for login-based access controls, to some extent.
Databricks Review
- Extremely Flexible in Data Scenarios
- Fantastic Performance
- DB is always updating the system, so we always have the latest features.
- Better Localized Testing
- When it was primarily OSS Spark, it was easier to test and manage releases than with the newer DB Runtime. I wish the Runtime offered more configuration rather than just picking a version.
- Graphing support went nearly non-existent, even though it was once one of the engine's compelling features.
- DB generally fits 95% of what you need to do
- Primarily the ability to transform data and/or do ad-hoc DS work
- The ability to spin up a big data platform with little infrastructure overhead allows us to focus on business value, not administration
- DB has the ability to terminate/time out instances which helps manage cost.
- The ability to quickly access typical hard to build data scenarios easily is a strength.