Likelihood to Recommend
If you need a managed big data platform with native integration with the highly optimized Apache Spark engine and native integration with MLflow, go for the Databricks Lakehouse Platform. The Databricks Lakehouse Platform is a breeze to use, and analytics capabilities are supported out of the box. You will find it a bit difficult to manage code in notebooks, but you will get used to it soon.
Read full review
I've created a number of daisy-chained notebooks for different workflows, and every time, I create my workflows with other users in mind. Jupyter Notebook makes it very easy for me to outline my thought process in as granular a way as I want without using innumerable small, inline comments.
Read full review
Pros
- Process raw data in a One Lake (S3) environment into relational tables and views.
- Share notebooks with our business analysts so that they can use the queries and generate value out of the data.
- Try out PySpark and Spark SQL queries on raw data before using them in our Spark jobs.
- Modern-day ETL operations made easy using Databricks.
- Provide access mechanisms for different sets of customers.
Read full review
- Simple and elegant code-writing ability; easier to understand the code that way.
- The ability to see the output after each step.
- The ability to use a ton of library functions in Python.
- Easy, user-friendly interface.
Read full review
Cons
- Connecting my local code in Visual Studio Code to my Databricks Lakehouse Platform cluster so I can run the code on the cluster: the old databricks-connect approach has many bugs and is hard to set up, and the new Databricks Lakehouse Platform extension for Visual Studio Code doesn't allow developers to debug their code line by line (we can only run the code). Maybe have a specific Databricks Lakehouse Platform IDE that Databricks Lakehouse Platform users can use to develop locally.
- Visualization in MLflow experiments could be enhanced.
Read full review
- Need more hotkeys for creating a beautiful notebook.
- Sometimes we need to download other plugins, which messes [with] its default settings.
- Not as powerful as an IDE, which sometimes makes [the] job difficult and allows duplicate code, as it gets confusing when the number of lines increases. Need a feature where [an] error comes up if duplicate code is found or [if a] developer tries the same function name.
Read full review
Usability
Because it is an amazing platform for designing experiments and delivering deep-dive analysis that requires execution of highly complex queries, and it allows sharing information and insights across the company with shared workspaces while keeping them secure. In terms of graph generation and interaction, it could improve its UI and UX.
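One of the workflows praised above is trying out SQL on raw data before committing it to a Spark job. As a rough, hedged illustration of that prototype-first pattern (using Python's stdlib sqlite3 as a stand-in for a Spark SQL temp view, since Spark is not assumed here; the table and column names are made up):

```python
import sqlite3

# Stand-in for a small sample of raw event data; on Databricks this would
# be a DataFrame read from S3 and registered as a temp view.
rows = [("click", 3), ("view", 10), ("click", 2)]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (event_type TEXT, cnt INTEGER)")
conn.executemany("INSERT INTO raw_events VALUES (?, ?)", rows)

# Prototype the aggregate interactively before pasting it into a job;
# this simple SELECT is also valid Spark SQL syntax.
query = (
    "SELECT event_type, SUM(cnt) AS total "
    "FROM raw_events GROUP BY event_type ORDER BY event_type"
)
for event_type, total in conn.execute(query):
    print(event_type, total)  # click 5 / view 10
conn.close()
```

The point is the loop, not the engine: iterate on the query against a cheap sample, then promote the exact SQL string into the scheduled job.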
Read full review
Jupyter is extremely simple. It took me about 5 minutes to install it and create my first "hello world" without having to look for help. The UI has minimalist options and is intuitive enough for anyone to become a pro in no time. The lightweight nature makes it even more likeable.
Read full review
Support Rating
One of the best customer and technology support experiences I have ever had in my career. You pay for what you get, and you get the Rolls-Royce. It reminds me of SAS customer support in the 2000s, when the tools were reaching some limits and their engineers wanted to know more about what we were doing, long before "data science" was even a name. Databricks truly embraces the partnership with their customers and helps them with any given challenge.
Read full review
I haven't had a need to contact support. However, all the required help is out there in public forums.
Read full review
Alternatives Considered
Compared to Synapse & Snowflake, Databricks provides a much better development experience and deeper configuration capabilities. It works out of the box but still allows intricate customisation of the environment. I find Databricks very flexible and resilient at the same time, while Synapse and Snowflake feel more limited in terms of configuration and connectivity to external tools.
Read full review
With Jupyter Notebook, besides doing data analysis and performing complex visualizations, you can also write machine learning algorithms with the long list of libraries that it supports. You can make better predictions, observations, etc. with it, which can help you reach better business decisions and save the company cost. It stacks up well because Python is more widely used than R in the industry and can be learnt easily. Unlike PyCharm, Jupyter notebooks can be used to create documentation and can be exported in a variety of formats.
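The export workflow mentioned above is typically driven by Jupyter's nbconvert command-line tool; a minimal sketch (the notebook filename is hypothetical):

```shell
# Convert a notebook to shareable HTML documentation
jupyter nbconvert --to html analysis.ipynb

# Or extract just the code cells into a plain Python script
jupyter nbconvert --to script analysis.ipynb
```

Other built-in exporters include markdown and pdf (the latter requires a LaTeX installation).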
Read full review
Return on Investment
- The ability to spin up a big data platform with little infrastructure overhead allows us to focus on business value, not administration.
- DB has the ability to terminate/time out instances, which helps manage cost.
- The ability to quickly access typically hard-to-build data scenarios easily is a strength.
Read full review
- Positive impact: flexible implementation on any OS, for many common software languages.
- Positive impact: straightforward duplication for adaptation of workflows for other projects.
- Negative impact: sometimes encourages pigeonholing of data science work into notebooks versus extending code capability into software integration.
Read full review