Databricks, headquartered in San Francisco, offers the Databricks Lakehouse Platform (formerly the Unified Analytics Platform), a data science platform and Apache Spark cluster manager. The Databricks Unified Data Service aims to provide a reliable and scalable platform for data pipelines, data lakes, and data platforms. Users can manage the full data journey: ingesting, processing, storing, and exposing data throughout an organization. Its Data Science Workspace is a collaborative environment for practitioners to run…
$0.07 per DBU
Snowflake
Score 8.8 out of 10
N/A
The Snowflake Cloud Data Platform, from the company of the same name in San Mateo, is a cloud- and SQL-based data warehouse that aims to let users unify, integrate, analyze, and share previously siloed data in secure, governed, and compliant ways. With it, users can securely access the Data Cloud to share live data with customers and business partners, and connect with other organizations as data consumers, data providers, and data service providers.
Databricks [Lakehouse Platform (Unified Analytics Platform)] can work with all data types in their original format, while Snowflake requires additional structure to fit the data before loading it. Databricks is built on open source, so the potential is far greater.
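To illustrate the "original format" point in the quote above, here is a minimal PySpark sketch of schema-on-read; the file paths and the event_type column are hypothetical placeholders, not details taken from the review.

```python
# Minimal PySpark sketch of reading data in its original format (schema-on-read).
# File paths and the event_type column are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("schema-on-read-demo").getOrCreate()

# Spark reads semi-structured and unstructured files as-is and infers a schema
# at read time, so no table definition is required before the data lands.
events = spark.read.json("/mnt/raw/events/")            # raw JSON, schema inferred
logs = spark.read.text("/mnt/raw/logs/")                # unstructured text lines
clicks = spark.read.parquet("/mnt/raw/clickstream/")    # columnar files, schema embedded

events.printSchema()
events.groupBy("event_type").count().show()
```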
Compared to Synapse & Snowflake, Databricks provides a much better development experience and deeper configuration capabilities. It works out of the box but still allows intricate customisation of the environment. I find Databricks very flexible and resilient at the same …
Databricks is a true all-in-one platform, and at the time of implementation, it had more features available to us, making it a clear choice over Snowflake. Moving our workloads from local computing to the servers in Databricks gave our start-up staff a great quality of life …
We particularly liked Snowflake's security model as well as its unique storage (whereby everything is essentially a pointer to immutable micro-partitions, which is the key behind its zero-copy cloning, its secure sharing, its time travel, etc.), and also how it separates …
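For readers unfamiliar with the features named in that quote, here is a rough sketch of zero-copy cloning and time travel issued through the snowflake-connector-python package; the connection parameters and the ORDERS table are placeholders, not details from the review.

```python
# Rough sketch of Snowflake zero-copy cloning and time travel via
# snowflake-connector-python. Connection parameters and table names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder
    user="my_user",            # placeholder
    password="***",            # placeholder
    warehouse="ANALYTICS_WH",
    database="SALES",
    schema="PUBLIC",
)
cur = conn.cursor()

# Zero-copy clone: the new table points at the same immutable micro-partitions,
# so no data is physically copied.
cur.execute("CREATE TABLE ORDERS_DEV CLONE ORDERS")

# Time travel: query the table as it looked one hour ago.
cur.execute("SELECT COUNT(*) FROM ORDERS AT(OFFSET => -3600)")
print(cur.fetchone())

cur.close()
conn.close()
```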
Snowflake is much faster, and it is easier to write queries and pull data. But the visualization side of Snowflake is not as good. Also, Snowflake only supports SQL queries, not Python or other languages. So basically Snowflake is an expert in its field but not suitable …
I evaluated Redshift and Panoply when making the choice for Snowflake. Panoply is built on Redshift, so the two share the same drawback: Redshift requires a cluster to be running 24/7 for your data to live there. We produce terabytes of data every day, so this was not an option …
Medium to large data-throughput shops will benefit the most from Databricks Spark processing. Smaller use cases may find the barrier to entry a bit too high for casual use. The overhead of kicking off a Spark compute job can actually make workloads take longer, but past a certain point the performance returns cannot be beaten.
Snowflake is well suited when you have to store your data and you want easy scalability, with the ability to increase or decrease storage per your requirements. You can also control the computing cost, and if your computing cost is less than or equal to 10% of your storage cost, then you don't have to pay for computing, which makes it cost-effective as well.
Snowflake scales appropriately, allowing you to manage expense across peak and off-peak times for data retrieval and data-centric processing jobs
Snowflake offers a marketplace solution that allows you to sell and subscribe to different data sources
Snowflake manages concurrency better in our trials than other premium competitors
Snowflake has little to no setup and ramp up time
Snowflake offers online training for various employee types
Connect my local code in Visual Studio Code to my Databricks Lakehouse Platform cluster so I can run the code on the cluster. The old databricks-connect approach has many bugs and is hard to set up. The new Databricks Lakehouse Platform extension for Visual Studio Code doesn't allow developers to debug their code line by line (we can only run the code); see the sketch after these notes.
Maybe have a specific Databricks Lakehouse Platform IDE that users can use to develop locally.
Visualization in MLflow experiments could be enhanced.
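For context on the local-development complaint above, here is a minimal sketch of the newer Databricks Connect approach (the databricks-connect package, version 13 and later), which lets locally written PySpark code execute on a remote cluster; the host, token, and cluster ID values are placeholders.

```python
# Minimal sketch of running local code on a remote Databricks cluster with
# Databricks Connect (databricks-connect >= 13). Host, token, and cluster-id
# values are placeholders.
from databricks.connect import DatabricksSession

spark = (
    DatabricksSession.builder
    .remote(
        host="https://<workspace>.cloud.databricks.com",   # placeholder
        token="<personal-access-token>",                    # placeholder
        cluster_id="<cluster-id>",                          # placeholder
    )
    .getOrCreate()
)

# This DataFrame is evaluated on the remote cluster, not on the local machine.
df = spark.range(1_000_000).selectExpr("id", "id * 2 AS doubled")
print(df.limit(5).toPandas())
```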
Do not force customers to renew for the same or a higher amount to avoid losing unused credits. Already-paid credits should not expire (at least within a reasonable time frame), independent of renewal deal size.
Snowflake is very cost-effective, and we like the fact that we can stop, start, and spin up additional processing engines as we need to. We also like that it's easy to connect our SQL IDEs to Snowflake and write our queries in the environment that we are used to.
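The stop/start/spin-up behaviour this reviewer describes maps onto warehouse commands; below is a small sketch issued through snowflake-connector-python, again with placeholder credentials and warehouse names.

```python
# Small sketch of suspending, resuming, and spinning up Snowflake warehouses.
# Connection parameters and warehouse names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="***"   # placeholders
)
cur = conn.cursor()

# Pause an idle warehouse so it stops accruing credits.
cur.execute("ALTER WAREHOUSE ANALYTICS_WH SUSPEND")

# Resume it when the next batch of queries arrives.
cur.execute("ALTER WAREHOUSE ANALYTICS_WH RESUME")

# Spin up an additional engine for a temporary heavy workload.
cur.execute(
    "CREATE WAREHOUSE IF NOT EXISTS ADHOC_WH "
    "WITH WAREHOUSE_SIZE = 'LARGE' AUTO_SUSPEND = 300"
)

cur.close()
conn.close()
```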
Because it is an amazing platform for designing experiments and delivering deep-dive analyses that require the execution of highly complex queries, and it allows you to share information and insights across the company with its shared workspaces, while keeping everything secure.
In terms of graph generation and interaction, it could improve its UI and UX.
The fact that you can query tons of data in a few seconds is incredible. It also gives you a lot of functions to format and transform data right in your query, which is ideal when building data models in BI tools like Power BI, and it is available as a connector in the most widely used BI tools worldwide.
Some of the best customer and technology support that I have ever experienced in my career. You pay for what you get, and you get the Rolls-Royce. It reminds me of the customer support of SAS in the 2000s, when the tools were reaching some limits and their engineers wanted to know more about what we were doing, long before "data science" was even a term. Databricks truly embraces the partnership with their customers and helps them with any given challenge.
We have had terrific experiences with Snowflake support. They have drilled into queries and given us tremendous detail and helpful answers. In one case they even figured out how a particular product was interacting with Snowflake, via its queries, and gave us detail to go back to that product's vendor because the Snowflake support team identified a fault in its operation. We got it solved without lots of back-and-forth or finger-pointing because the Snowflake team gave such detailed information.
The most important factor differentiating the Databricks Lakehouse Platform from these other platforms is its support for ACID transactions and the time travel feature. Also, native integration with managed MLflow is a plus. EMR, Cloudera, and Hortonworks are not as optimized when it comes to Spark job execution. The other platforms need to be self-managed, which is another huge hassle.
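As a brief sketch of the ACID/time-travel behaviour this reviewer singles out, here is what querying an older committed version of a Delta table and logging to managed MLflow can look like on Databricks; the table path, version number, and metric name are illustrative placeholders.

```python
# Brief sketch of Delta Lake time travel plus a managed-MLflow metric log.
# The table path, version number, and metric name are illustrative placeholders.
from pyspark.sql import SparkSession
import mlflow

spark = SparkSession.builder.appName("delta-time-travel-demo").getOrCreate()

# Delta tables are transactional: every write produces a new committed version,
# and older versions remain queryable.
current = spark.read.format("delta").load("/mnt/delta/transactions")
as_of_v5 = (
    spark.read.format("delta")
    .option("versionAsOf", 5)              # read the table as of commit version 5
    .load("/mnt/delta/transactions")
)

# Rows added since version 5 (both snapshots are consistent, so the diff is exact).
added = current.subtract(as_of_v5)

# Managed MLflow lives in the same workspace; log a metric from the job.
with mlflow.start_run():
    mlflow.log_metric("rows_added_since_v5", added.count())
```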
I have had the experience of using another database management system at my previous workplace. What Snowflake provides is a more user-friendly console, suggestions while writing a query, ease of access to connect to various BI platforms for analysis, [and a] more robust system to store a large amount of data. All these functionalities give Snowflake the edge.