Azure Databricks is a data analytics service available on Microsoft's Azure platform. It provides the latest versions of Apache Spark, so users can integrate with open source libraries or spin up clusters and build in a fully managed Apache Spark environment with the global scale and availability of Azure. Clusters are set up, configured, and fine-tuned to ensure reliability and performance without the need for monitoring. The solution includes autoscaling and auto-termination to improve…
Dataiku
Score 8.2 out of 10
The Dataiku platform unifies data work from analytics to Generative AI. It supports enterprise analytics with visual, cloud-based tooling for data preparation, visualization, and workflow automation.
Pricing

Editions & Modules

Azure Databricks: No answers on this topic

Dataiku:
- Discover: Contact sales team
- Business: Contact sales team
- Enterprise: Contact sales team
Pricing Offerings

                                          Azure Databricks   Dataiku
Free Trial                                No                 Yes
Free/Freemium Version                     No                 Yes
Premium Consulting/Integration Services   No                 No
Entry-level Setup Fee                     No setup fee       No setup fee
Additional Details                        —                  —
Community Pulse

Considered Both Products

Azure Databricks: No answer on this topic

Dataiku — Verified User, Engineer, chose Dataiku:
Dataiku was selected for me, but I am happy about that. I like Dataiku for the user experience; it feels less code-y, and I like to demo things to non-technical stakeholders because they can still follow along. When you open some other notebooks, you can see that people's eyes …
Centralised notebooks are put directly into production, which can lead to poorly engineered code. It is very good for fast queries, and our data team are always able to provide what we ask for. It is a big cost to our business, so it is important that it runs efficiently and returns on our investment.
Dataiku is an awesome tool for data scientists. It really makes our lives easier. It is also really good for non-technical users to see and follow along with the process. I do think that people can fall into the trap of using it without any knowledge at all because so much is automated, but I don't think that is the fault of Dataiku.
The integrated frontend and backend windows in web applications are cumbersome for developers.
When dealing with multiple data flows, it becomes really confusing, though they have introduced a feature (Zones) to cater to this issue.
Bundling, exporting, and importing projects sometimes create issues related to the code environment. If the code environment is not available, we should at least be able to import the schema of the flow.
Developers are able to switch between Python and SQL in the notebook, which enables collaboration between SQL analysts and data scientists. The integration of Mosaic AI allows users to write complex code in natural language. Unity Catalog has centralized the security and governance features and simplified the process of maintaining them.
The user experience is very good. Everything feels intuitive and "flows" (sorry, excuse the pun) so nicely, and the level of customization is also appropriate to the tool. Even as a newer data scientist, I found it easy to use, and the explanations/tutorials were very good. The documentation is also at a good level.
The open source user community is friendly, helpful, and responsive, at times even outdoing commercial software vendors. Documentation is also top notch, and usually resolves issues without the need for human interactions. Great product design, with a focus on user experience, also makes platform use intuitive, thus reducing the need for explicit support.
I have found Azure Databricks to be much better than Snowflake for handling bigger, more diverse data types. Snowflake is much simpler and better suited to smaller warehousing. Real-time processing is much better in Azure Databricks, and we have many more language options. Snowflake is more expensive but simpler to use. Both are great for different needs.
Anaconda is mainly used by professional data scientists with profound knowledge of Python coding, primarily for building new algorithm blocks or optimizations; the resulting module is then integrated into the Dataiku pipeline/workflow. Dataiku, by contrast, can be used by other kinds of users as well.