The Dataiku platform unifies all data work, from analytics to Generative AI. It can modernize enterprise analytics and accelerate time to insights with visual, cloud-based tooling for data preparation, visualization, and workflow automation.
Informatica PowerCenter
Score 7.6 out of 10
Informatica PowerCenter is a metadata-driven data integration technology designed to form the foundation for data integration initiatives, including analytics and data warehousing, application migration, consolidation, and data governance.
Dataiku DSS is very well suited to handling large datasets and projects that require a large team to deliver results. It allows users to collaborate with each other while working on individual tasks. The workflow is easily streamlined, and every action is backed up, allowing users to revert to specific steps whenever required. While Dataiku DSS works seamlessly with all types of projects dealing with structured datasets, I haven't come across projects that use Dataiku for images or audio signals. A workaround, though, would be to store the images as vectors and perform the necessary tasks, as sketched below.
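To make that workaround concrete, here is a minimal sketch, assuming Pillow and NumPy are available (for example in a Python recipe); the file names and image size are purely illustrative, not part of any Dataiku API.

```python
import numpy as np
from PIL import Image

def image_to_vector(path, size=(64, 64)):
    """Load an image, resize it, and flatten it into a 1-D feature vector."""
    img = Image.open(path).convert("L")   # grayscale keeps the example simple
    img = img.resize(size)
    return np.asarray(img, dtype=np.float32).ravel() / 255.0

# Hypothetical file list; each resulting row can be stored as an ordinary
# tabular record and processed like any other structured dataset.
vectors = np.stack([image_to_vector(p) for p in ["img1.png", "img2.png"]])
```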
1. Scenarios with poor data sources are not recommended (very bad ROI); the solution is for medium-to-large enterprises with many data sources and users.
2. Banking and finance environments, to integrate data from trading, regulatory reporting, decision makers, and fraud and financial crime, because in this kind of scenario data quality is the foundation of the business.
3. Application development and testing departments in enterprises, because you can design environments outside the production systems to develop and test new APIs or updates.
Informatica PowerCenter is an innovative piece of software for ETL-style data integration, with connectivity to almost all database systems.
Great documentation and customer support.
It has various solutions to address data quality issues, such as data masking and data virtualization. It also has various supporting tools, such as MDM, IDQ, Analyst, and Big Data, which can be used to analyze data and correct it.
There are too many ways to perform the same or similar functions, which makes it challenging to trace what a workflow is doing and at which point (e.g., sessions can be designed as static or reusable, and the override can occur at the session, the workflow, or both, which can be counterproductive and confusing when troubleshooting).
The power of the structured design is a double-edged sword: simple tasks for a POC can become cumbersome. For example, if you want to move some data to test a process, you first have to create your sources by importing them, which means an ODBC connection or similar needs to be configured; you then have to define your targets and all of the essential building blocks before actual development can begin. While I am on sources and targets, I think of a table definition as just that, and find it counterintuitive to have to design a table as both a source and a target and manage them as different objects. It would be more intuitive to have a single table definition whose source/target role is determined by where you drag and drop it in the mapping.
There are no checkpoints or data-viewer-type functions without designing an entire mapping and workflow. If you would like to simply run a job up to a point and check the throughput, an entire mapping needs to be completed, and the usual workaround is to create a flat-file target.
As I have described earlier, the intuitiveness of this tool makes it great, as does the variety of users who can work with it. The plugins available in its repository also provide solutions to various data science problems.
Positives:
- Multi-user development environment
- Speed of transformation
- Seamless integration with other Informatica products
Negatives:
- There should be fewer windows to maintain developers' focus; you probably need two big monitors when you start development with Informatica PowerCenter.
- Oracle analytical functions should be usable natively.
- ELT support as well as ETL support.
PowerCenter is robust and fast, and it does a great job of meeting all needs, not just the most commercially vocal ones. In the hands of an expert power user, you can accomplish almost anything with your data. It is not for new or intermittent users; for them, the Cloud version is a better fit. Be prepared for costly connectors (priced differently for each source or destination you are working with), and plan your projects so you are not paying for connectors you no longer need or want.
The support team is very helpful, and even when we discover missing features, after we provide enough rationale and requirements, they put them into their development pipeline for a future release.
Informatica PowerCenter is a leader of the pack of ETL tools and has some great capabilities that make it stand out from other ETL tools. It has been a great partner to its clients over a long time, so it is definitely dependable. For all the great things about Informatica, it carries a bit of technical debt that should be addressed to make it more nimble, reduce the learning curve for new developers, and provide better connectivity with visualization tools.
Strictly for data science operations, Anaconda can be considered a subset of Dataiku DSS. While Anaconda supports the Python and R programming languages, Dataiku provides this as well, but also offers a GUI to create models with the click of a button. This gives flexibility to users who do not wish to tune model hyperparameters in great depth. Writing code to extract meaningful data is time-consuming compared to Dataiku's ability to perform feature engineering and data transformation with a click of a button.
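To illustrate that contrast, here is a sketch of the kind of hand-written feature engineering a visual prep tool can replace with a few clicks; it assumes a pandas/scikit-learn stack and a hypothetical transactions.csv with amount and timestamp columns.

```python
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler

# Hypothetical input file and column names, used only for illustration.
df = pd.read_csv("transactions.csv")

# Steps a visual prep tool typically offers as point-and-click processors:
df["amount_log"] = np.log1p(df["amount"].clip(lower=0))        # skew correction
df["weekday"] = pd.to_datetime(df["timestamp"]).dt.day_name()  # date-part extraction
df = pd.get_dummies(df, columns=["weekday"], drop_first=True)  # one-hot encoding

numeric_cols = ["amount", "amount_log"]
df[numeric_cols] = StandardScaler().fit_transform(df[numeric_cols])
```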
While Talend offers a much more comfortable interface to work with, Informatica's forte is performance. And on that front, Informatica Enterprise Data Integration certainly leaves Talend in the dust. For a more back-end-centric use case, Informatica is certainly the ETL tool of choice. On the other hand, if business users would be using the tool, then Talend would be the preferred tool.
The data pipeline automation capability of Informatica means that few resources are needed to pre-process the data that ultimately resides in a Data Warehouse. Once a workflow is implemented, manual intervention is not needed.
PowerCenter did require more resources and time for installation and configuration than was expected/planned for.
The lack of, or minimal, support for unstructured data means that newer sources of dynamic, changing data cannot be easily processed or transformed through PowerCenter workflows.