Likelihood to Recommend

Databricks Lakehouse Platform: If you need a managed big data store with native integration with the highly optimized Apache Spark engine and native MLflow integration, go for the Databricks Lakehouse Platform. It is a breeze to use, and analytics capabilities are supported out of the box. You will find it a bit difficult to manage code in notebooks at first, but you get used to it quickly.
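A minimal sketch of the workflow this reviewer describes, assuming a Databricks notebook where the `spark` session is predefined and mlflow is preinstalled; the table, parameter, and run names are illustrative:

```python
# Explore raw data with Spark SQL, then track a run with the
# built-in MLflow integration. Assumes a Databricks notebook
# (predefined `spark`, mlflow preinstalled); names are illustrative.
import mlflow

# Try a query interactively before wiring it into a Spark job.
df = spark.sql("SELECT * FROM raw_events LIMIT 100")
df.show()

# Log the exploration as an MLflow run.
with mlflow.start_run(run_name="exploration-demo"):
    mlflow.log_param("source_table", "raw_events")
    mlflow.log_metric("row_count", df.count())
```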
Denodo: Denodo allows us to create and combine views into a virtual repository and expose APIs without a single line of code. It is excellent because it can present connectors in a view format for downstream consumers, for example by flattening a JSON file. Reading from or connecting to various sources and displaying a tabular view is an excellent feature, and the product's technical data catalog is well organized.
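Denodo does this flattening declaratively inside a view; purely to illustrate the idea, here is a sketch in Python using pandas (the payload and field names are hypothetical, and this is not how Denodo itself is driven):

```python
# Illustration only: flattening nested JSON into a tabular view,
# the operation Denodo performs declaratively. Payload and field
# names are hypothetical.
import pandas as pd

payload = [
    {"id": 1, "customer": {"name": "Acme", "region": "EU"}},
    {"id": 2, "customer": {"name": "Globex", "region": "US"}},
]

# json_normalize turns nested JSON into flat, tabular columns.
df = pd.json_normalize(payload)
print(df)  # columns: id, customer.name, customer.region
```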
Pros

Databricks Lakehouse Platform:
- Process raw data in a One Lake (S3) environment into relational tables and views.
- Share notebooks with our business analysts so that they can use the queries and generate value out of the data.
- Try out PySpark and Spark SQL queries on raw data before using them in our Spark jobs.
- Modern-day ETL operations made easy using Databricks.
- Provides access mechanisms for different sets of customers.

Denodo:
- Database agnostic: you can easily connect to different environments and mash up data sets.
- The "magic" of data virtualization: no data is created, so data is reported in near real time to end users.
- The UI is easy for developers to use: you just connect to a data source, create tables, and join them to other datasets.

Cons

Databricks Lakehouse Platform:
- Connecting local code in Visual Studio Code to a Databricks Lakehouse Platform cluster so it runs on the cluster: the old databricks-connect approach has many bugs and is hard to set up, and the new Databricks Lakehouse Platform extension for Visual Studio Code doesn't let developers debug their code line by line (we can only run it). See the sketch below.
- Maybe offer a dedicated Databricks Lakehouse Platform IDE that users can use to develop locally.
- Visualization in MLflow experiments could be enhanced.

Denodo:
- Caching (though I am sure it has improved by now): there were times when we expected the cache to be refreshed but it was stale.
- Schema generation of endpoints from API responses was sometimes incomplete, since not all API calls returned all the fields. It would be good to be able to load the schema itself (XSD, JSON, SOAP XML, etc.).
- Denodo-exposed web services were at a preliminary stage when we used them; I'm sure they have improved by now.
- Export/import deployment, while helpful, had unexpected issues that produced no errors during deployment and were only identified during testing: some views were not created properly and did not work. If something works in the environment it was exported from, it should work in the environment it is imported into.

Usability

Databricks Lakehouse Platform: It is an amazing platform for designing experiments and delivering deep-dive analyses that require executing highly complex queries, and it lets you share information and insights across the company through shared workspaces while keeping them secure. In terms of graph generation and interaction, it could improve its UI and UX.
Denodo: Denodo is very easy to use. It has a user-friendly drag-and-drop interface. I'm not a fan of the Java platform it resides on.
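Returning to the databricks-connect issue raised under Cons: below is a minimal sketch of pointing local code (for example, in Visual Studio Code) at a remote cluster with the newer Databricks Connect API. It assumes databricks-connect for Databricks Runtime 13+ and a configured "DEFAULT" authentication profile; details vary by version.

```python
# Run locally written code against a remote Databricks cluster.
# Assumes databricks-connect (Runtime 13+) and a configured
# "DEFAULT" auth profile; adjust for your workspace.
from databricks.connect import DatabricksSession

spark = DatabricksSession.builder.profile("DEFAULT").getOrCreate()

# This query is typed locally but executes on the cluster.
spark.sql("SELECT current_version()").show()
```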
Performance

Denodo: Denodo is a tool for rapidly mashing data sources together and creating meaningful datasets. It does have its downfalls, though. When you create larger, more complex datasets, you will most likely need to cache them, regardless of how well your joins are set up. Since data virtualization pulls data from multiple environments, you are taxing the corporate network, so you need to be conscious of how much data you send through the network and truly understand how and when to join datasets.
Support Rating

Databricks Lakehouse Platform: This is some of the best customer and technology support I have experienced in my career. You pay for what you get, and you get the Rolls-Royce. It reminds me of SAS customer support in the 2000s, when the tools were reaching their limits and their engineers wanted to know more about what we were doing, long before "data science" was even a name. Databricks truly embraces the partnership with its customers and helps them with any given challenge.
Alternatives Considered

Databricks Lakehouse Platform: Compared to Synapse and Snowflake, Databricks provides a much better development experience and deeper configuration capabilities. It works out of the box but still allows intricate customisation of the environment. I find Databricks very flexible and resilient at the same time, while Synapse and Snowflake feel more limited in terms of configuration and connectivity to external tools.
Denodo: Denodo is simple and easy to use. Highly recommended unless you have huge volumes of data.
Return on Investment

Databricks Lakehouse Platform:
- The ability to spin up a big data platform with little infrastructure overhead allows us to focus on business value, not administration.
- Databricks can terminate or time out instances, which helps manage cost.
- The ability to quickly and easily access typically hard-to-build data scenarios is a strength.

Denodo:
- It is a huge advantage that we can connect to many different databases to provide data rapidly and accurately.
- It has proven to be a valuable environment for deploying data virtualization solutions, and its user community is active in finding and fixing issues.