Denodo is the eponymous data virtualization and integration platform from Denodo Technologies, a global company headquartered in Silicon Valley.
ZAP Data Hub
Score 10.0 out of 10
ZAP provides data management and analytics software, with optimized solutions for Microsoft Dynamics, Sage, SAP, and Power BI. ZAP Data Hub is ELT data warehouse automation software that helps deliver accurate, trusted financial and operational reporting in BI tools.
Denodo allows us to create and combine views into a virtual repository, and to expose them as APIs, without writing a single line of code. It can flatten a JSON file and present it as a tabular view to downstream consumers, which is excellent. Reading from or connecting to various sources and displaying them in tabular form works very well, and the product's technical data catalog is well organized.
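To make the flattening the reviewer describes concrete, here is a minimal Python sketch of the idea using pandas. It is a generic illustration with invented field names, not Denodo's own tooling; Denodo does this declaratively when you build a view over a JSON source.

```python
import pandas as pd

# Hypothetical nested API response; the field names are invented
# for the illustration.
response = {
    "orders": [
        {"id": 1, "customer": {"name": "Acme", "region": "EU"}, "total": 120.5},
        {"id": 2, "customer": {"name": "Globex", "region": "US"}, "total": 80.0},
    ]
}

# json_normalize flattens the nested objects into dotted columns,
# yielding the kind of tabular view downstream consumers expect.
df = pd.json_normalize(response["orders"])
print(df.columns.tolist())  # ['id', 'total', 'customer.name', 'customer.region']
```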
ZAP is an incredible tool for IT organizations that need to stay lean. Within days, I was able to implement the ZAP solution instead of hiring developers and ETL specialists to tie into multiple data infrastructures.
Caching: there were times when we expected the cache to have been refreshed but it was still stale - though I am sure it has improved by now.
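One workaround for that kind of silent staleness is for the consumer to guard on the cache's age before trusting results. The sketch below is an illustration of the guard, not a Denodo API; it assumes you can obtain the cache's last-refresh timestamp from your own setup.

```python
import time

# The refresh interval we expect the cache to honor (assumed here).
MAX_AGE_SECONDS = 3600

def is_stale(last_refresh: float, now: float | None = None) -> bool:
    """Return True when the cached copy is older than the expected refresh interval."""
    now = time.time() if now is None else now
    return (now - last_refresh) > MAX_AGE_SECONDS

# Example: a copy refreshed two hours ago is flagged as stale.
print(is_stale(time.time() - 7200))  # True
```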
Schema generation of endpoints from API responses was sometimes incomplete, since not all API calls returned all the fields. It would be good to have the ability to load the schema itself (XSD, JSON Schema, SOAP XML, etc.).
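That incompleteness is inherent to inferring a schema from sampled responses: optional fields only appear in some payloads, so any single sample can miss them. A small Python sketch with invented fields illustrates why loading a declared schema (XSD/JSON Schema) is more reliable than sampling:

```python
# Two hypothetical API responses; 'discount' is an optional field
# that only some payloads include.
samples = [
    {"id": 1, "name": "a"},
    {"id": 2, "name": "b", "discount": 0.1},
]

# Union the keys across all sampled responses.
inferred: set[str] = set()
for record in samples:
    inferred.update(record.keys())

print(sorted(inferred))  # ['discount', 'id', 'name']
# Inferring from samples[0] alone would have missed 'discount', which
# is exactly the gap a declared schema avoids.
```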
Denodo's exposed web services were at a preliminary stage when we used them; I'm sure they have improved by now.
Export/Import deployment, while helpful, sometimes had unexpected issues that surfaced no errors during deployment; they were only identified during testing, when some views turned out not to have been created properly and did not work. A view that works in the environment it was exported from should work in the environment it is imported into.
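One mitigation is a smoke test that queries every imported view immediately after deployment, so broken views surface right away instead of during later testing. Below is a minimal Python sketch; run_query and the view names are hypothetical placeholders for your own Denodo client (JDBC/ODBC/REST) and catalog.

```python
# Views expected to exist after the import (hypothetical names).
VIEWS = ["customer_v", "orders_v", "inventory_v"]

def smoke_test(run_query) -> list[str]:
    """Return descriptions of the views that fail a trivial query.

    run_query is any callable that executes SQL against the target
    environment and raises on failure.
    """
    broken = []
    for view in VIEWS:
        try:
            run_query(f"SELECT COUNT(*) FROM {view}")
        except Exception as exc:
            broken.append(f"{view}: {exc}")
    return broken
```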
I would love to see a strong integration with Google Analytics. ZAP works great with CRM platforms, but I would be interested in metrics on how marketing campaigns tie into organic traffic and PPC.
Denodo is a tool for rapidly mashing data sources together into meaningful datasets. It does have its drawbacks, though. When you create larger, more complex datasets, you will most likely need to cache them, regardless of how well your joins are set up. Since data virtualization (DV) pulls data from multiple environments, you are taxing the corporate network, so you need to be conscious of how much data you send through it and truly understand how and when to join datasets.
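The "how and when to join" point comes down to classic predicate pushdown: filtering a source before the join moves far less data across the network than joining first and filtering later. Here is a small pandas sketch of the idea; the two DataFrames stand in for two remote sources, and in Denodo the optimizer and view design determine where the filter actually runs.

```python
import pandas as pd

# Two frames stand in for two remote sources (invented data).
orders = pd.DataFrame({"cust_id": [1, 2, 3], "total": [10, 20, 30]})
customers = pd.DataFrame({"cust_id": [1, 2, 3], "region": ["EU", "US", "EU"]})

# Naive plan: join everything, then filter - the equivalent of dragging
# every row across the network before discarding most of it.
naive = orders.merge(customers, on="cust_id")
naive = naive[naive["region"] == "EU"]

# Better plan: filter one side first, then join, so less data moves.
eu_customers = customers[customers["region"] == "EU"]
pushed = orders.merge(eu_customers, on="cust_id")

# Both plans return the same rows; only the data volume in flight differs.
assert naive.reset_index(drop=True).equals(pushed.reset_index(drop=True))
```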