The vendor states that Informatica Data Quality empowers companies to take a holistic approach to managing data quality across the entire organization, ensuring the success of data-driven digital transformation initiatives across users, data types, and scale, while also automating mission-critical tasks.
Qlik Talend Cloud
Score 8.8 out of 10
The Qlik Talend Cloud suite of solutions offers data integration, data quality, application integration, and data governance that work with key data sources, targets, architectures, and methodologies to ensure business users always have trusted and accurate data.
SSIS
Score 7.6 out of 10
Microsoft's SQL Server Integration Services (SSIS) is a data integration solution.
Compared to Microsoft SQL Server Integration Services (SSIS), Talend gives developers many more tools and greater flexibility for building different ETL processes. For instance, SSIS separates processing from data management, while Talend mixes both stages so that you can perform …
For effective data collaboration and systematic verification of customer information and addresses, among other uses, Informatica Data Quality is a worthwhile application to consider. It controls quality through a cleansing process, giving the company a professional foundation of clear data profiling and reputable analytics. Finally, Informatica Data Quality allows simple navigation of content, with a dashboard that supports predictability.
This tool fits all kinds of organizations and helps integrate data between many applications. We use this tool because data integration is a key need for every organization. It is also available in the cloud, which makes integration more seamless. A firm can also opt for just the tools it requires if it has fewer data integration needs.
As I mentioned earlier, SQL Server Integration Services is suitable if you want to manage data from different applications. It really helps in fetching the data and generating reports. Its automation makes it very easy and time-efficient, and it works well with large databases. But it doesn't work well with real-time data; it takes some time to gather real-time data, so I would not recommend using it in a real-time, fast-paced environment.
The matching algorithms in IDQ are very powerful if you understand the different types that they offer (e.g., Hamming Distance, Jaro, Bigram, etc.). We had to play around with them to see which best suited our own needs of identifying and eliminating duplicate customers. Setting up the whole process (e.g., creating the Key Generator transformation, setting up the matching threshold, etc.) can be somewhat time-consuming and a challenge if you don't first standardize your data.
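To give a feel for how threshold-based matching works, here is a minimal sketch of a bigram (Dice coefficient) similarity check, one of the algorithm families the reviewer mentions. This is an illustration in Python of the general technique, not IDQ's actual implementation, and the 0.8 threshold is an assumed value you would tune against your own data.

```python
def bigrams(s: str) -> set:
    """Split a lowercased string into its set of character bigrams."""
    s = s.lower()
    return {s[i:i + 2] for i in range(len(s) - 1)}

def bigram_similarity(a: str, b: str) -> float:
    """Dice coefficient over character bigrams: 2*|A & B| / (|A| + |B|)."""
    ba, bb = bigrams(a), bigrams(b)
    if not ba and not bb:
        return 1.0
    return 2 * len(ba & bb) / (len(ba) + len(bb))

# Assumed threshold: pairs scoring at or above it are flagged as
# candidate duplicates for review.
THRESHOLD = 0.8
print(bigram_similarity("Jonathan Smith", "Jonathon Smith") >= THRESHOLD)  # True
print(bigram_similarity("Jonathan Smith", "Mary Jones") >= THRESHOLD)      # False
```

As the reviewer notes, this kind of matching only works well on standardized input: "St." versus "Street" in an address will drag the score down even for the same customer, which is why standardization comes first.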
The integration with PowerCenter is great if you have both. You can either import your mappings directly to PowerCenter or to an XML file. The only downside is that some of the transformations are unique to IDQ, so you are not really able to edit them once in PowerCenter.
The Standardizer transformation was key in helping us standardize our customer data (e.g., names, addresses, etc.). It worked by having us create a reference table containing each standardized value and its associated unstandardized values. What was great was that if you used Informatica Analyst, a business analyst could log in and correct any of the values.
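The reference-table idea described above can be sketched in a few lines: a lookup maps each known unstandardized variant to its standardized form, and unknown tokens pass through unchanged. This Python sketch only illustrates the concept; the table contents are invented, and IDQ's actual Standardizer is configured through its own tooling, not code like this.

```python
# Invented reference table: unstandardized variant -> standardized value.
REFERENCE_TABLE = {
    "st": "Street",
    "str": "Street",
    "street": "Street",
    "av": "Avenue",
    "ave": "Avenue",
    "avenue": "Avenue",
}

def standardize_token(token: str) -> str:
    """Return the standardized form if the token is in the table,
    otherwise pass the token through unchanged."""
    return REFERENCE_TABLE.get(token.lower().rstrip("."), token)

def standardize_address(address: str) -> str:
    """Standardize an address one whitespace-separated token at a time."""
    return " ".join(standardize_token(tok) for tok in address.split())

print(standardize_address("12 Main St."))  # 12 Main Street
```

The appeal of keeping the variants in a table rather than in code, as the reviewer points out, is that a business analyst can maintain the mappings without touching the pipeline itself.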
Talend Data Integration allows us to quickly build data integrations without a tremendous amount of custom coding (some Java and JavaScript knowledge is still required).
I like the UI; it's very intuitive. Jobs are visual, allowing team members to see the flow of the data without having to read through the Java code that is generated.
Connection managers for online data sources can be tricky to configure.
Performance tuning is an art form, and trialing different data flow task options can be cumbersome. SSIS could do a better job of providing performance data, including historical data, for monitoring.
Mapping destinations using the OLE DB Command is difficult because destination columns are unnamed.
Excel or flat file connections are limited by version and type.
As pointed out earlier, due to all the robust features IDQ has, our use of the product is successful and stable. IDQ is being used across multiple sources (from a CRM application and in batch mode). As this is an iterative process, we are looking to improve our system's efficiency using IDQ.
Some features should be revised or improved; some tools in the toolbox (when using it with Visual Studio) should be less schematic and somewhat more flexible. For example, the CSV data import is still very old-fashioned, and if the data format changes it requires a bit of manual labor to accept the new data structure.
We use Talend Data Integration day in and day out. It is the best and easiest tool to jump on to and use. We can build a basic integration super-fast, within an hour. It is also easy to build transformations and use Java to perform some operations.
SSIS is a great tool for most ETL needs. It covers 90% (or more) of use cases, and even in many of the cases where it is not ideal, SSIS can be extended via a .NET language to do the job well, in a supportable way, for almost any performance workload.
SQL Server Integration Services performance depends directly on the resources provided to the system. In our environment, we allocated 6 nodes of 4 CPUs and 64 GB each, running in parallel. Unfortunately, we had to ramp up to such a robust environment to get the performance where we needed it. Most of the reports complete in a reasonable timeframe. However, in the case of slow-running reports, it is often difficult if not impossible to cancel the report without killing the report instance or stopping the service.
Good support, especially when it relates to the PROD environment. The support team has access to the product development team, and issues are internally escalated to development if a bug is encountered. This helps the customer get a quick fix or patch designed for problem exceptions. I have also seen support show willingness to help develop a custom connector for a newly available cloud-based big data solution.
The support, when necessary, is excellent. But beyond that, it is very rarely needed, because the user community is so large, vibrant, and knowledgeable that a simple Google query or forum question can answer almost anything you want to know. You can also get prewritten script tasks with a variety of functionality, which saves a lot of time.
The implementation may be different in each case. It is important to properly analyze all of the existing infrastructure to understand the kind of work needed: the type of software used and the compatibility between components, and the features you want to exploit, so you understand what is possible natively and what requires integration with third-party tools.
IDQ is used by a department at my organisation to ensure and enhance data quality. Usage started with address standardization and has now been brought to a whole new level of quality checking, where it fixes duplicates and junk characters and standardizes names, streets, and product descriptions. In the past we had issues mainly with duplicate customers and products, and these were affecting sales projections and estimates.
In comparison with the other ETL tools I have used, Talend is more flexible than Data Services (where you cannot create complex commands). It is similar to DataStage in terms of commands and interfaces. It is more user-friendly than ODI, which has a metadata point of view of its own, while Talend is more classic. It has both on-prem and cloud approaches, while Matillion is cloud-based only.
I think SQL Server Integration Services is better suited for on-premises data movement, and ADF is more suited for the cloud. Though ADF has more connectors, SQL Server Integration Services is more robust and has better functionality because it has been around much longer.
It's only been a positive ROI with Talend, given that we've interfaced large datasets between critical on-prem and cloud-native apps to efficiently run our business operations.
Without this, we would have to manually update a spreadsheet of our SQL Server inventory.
We would also have poor alerting; if an instance were down, we wouldn't know until it was reported by a user.
We only have one other person who uses SQL Server Integration Services; he's the expert. It would fall to me without him, and I would not enjoy being responsible for it.