Informatica PowerCenter is a metadata-driven data integration technology designed to form the foundation for data integration initiatives, including analytics and data warehousing, application migration or consolidation, and data governance.
TIBCO Data Virtualization
Score 8.3 out of 10
TIBCO Data Virtualization is an enterprise data virtualization solution that orchestrates access to multiple and varied data sources and delivers the datasets and IT-curated data services foundation for nearly any solution.
In its tool selection process, Cloetta used a Value for Money model in Excel, scoring the functions, performance, vendor qualifications, support, and consultancy required against the total cost of ownership for the first five years of usage. The company went with TIBCO® Data …
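A scoring model of that kind is straightforward to reproduce. Below is a minimal sketch in Python of weighted criterion scores set against five-year TCO; every criterion weight, score, and cost figure is an invented placeholder, not Cloetta's actual data.

```python
# Hypothetical sketch of a "Value for Money" model like the one described:
# weighted criterion scores divided by five-year TCO. All numbers are
# illustrative placeholders, not Cloetta's real inputs.

criteria = {
    # criterion: (weight, vendor score on a 1-10 scale)
    "functions": (0.30, 8),
    "performance": (0.25, 9),
    "vendor_qualifications": (0.15, 8),
    "support": (0.15, 7),
    "consultancy_required": (0.15, 6),
}

five_year_tco = 500_000  # licences + infrastructure + services over 5 years

weighted_score = sum(weight * score for weight, score in criteria.values())
value_for_money = weighted_score / (five_year_tco / 100_000)  # points per 100k spent

print(f"Weighted score: {weighted_score:.2f}")
print(f"Value for money: {value_for_money:.3f} points per 100k of 5-year TCO")
```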
1. Scenarios with poor data sources are not recommended (very bad ROI). The solution is for medium-to-large enterprises with many data sources and users.
2. Banking and finance environments, to integrate different data from trading, regulatory reporting, decision makers, and fraud and financial crime, because in this kind of scenario data quality is the foundation of the business.
3. Application development and testing departments in enterprises, because you can design environments, outside of the production systems, to develop and test new APIs or updates.
TIBCO Data Virtualization is well suited for customers who are challenged with extracting data from dozens of different sources and systems, and do not have the time or liberty to hire data engineers and/or ETL developers to write dozens or hundreds of complex ETLs. However, there are situations where TIBCO Data Virtualization severely underperforms, namely when dealing with large volumes of data in terabyte- or petabyte-scale systems. For example, a messaging queue that sends 200 million messages every hour will choke TIBCO Data Virtualization if the technology is chosen to route that data.
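For a sense of the scale the reviewer describes, 200 million messages per hour works out to roughly 55,000 messages per second, as the quick calculation below shows.

```python
# Back-of-the-envelope throughput implied by the reviewer's example.
messages_per_hour = 200_000_000
messages_per_second = messages_per_hour / 3600
print(f"{messages_per_second:,.0f} messages/second")  # ~55,556 msgs/sec
```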
Informatica PowerCenter is innovative software for ETL-style data integration, with connectivity to almost all database systems.
Great documentation and customer support.
It has various solutions to address data quality issues, such as data masking and data virtualization. It also has various supporting tools, such as MDM, IDQ, Analyst, and Big Data, which can be used to analyze data and correct it.
There are too many ways to perform the same or similar functions, which in turn makes it challenging to trace what a workflow is doing and at which point (e.g., sessions can be designed as static or reusable, and the override can occur at the session, the workflow, or both, which can be counterproductive and confusing when troubleshooting).
The power in structured design is a double-edged sword: simple tasks for a POC can become cumbersome. For example, if you want to move some data to test a process, you first have to create your sources by importing them, which means an ODBC connection or similar needs to be configured; you then have to develop your targets and all of the essential building blocks before actual development can begin. While I am on sources and targets, I think of a table definition as just that, and find it counterintuitive to have to design a table as both a source and a target and manage them as different objects. It would be more intuitive to have a single table definition, with its source/target properties determined by where you drag and drop it in the mapping.
There are no checkpoints or data-viewer-type functions available without designing an entire mapping and workflow. If you would like to simply run a job up to a point and check the throughput, an entire mapping needs to be completed, and the workaround is to create a flat file target.
Performance of the TDV repository database is rather poor for larger numbers of objects. (Note: we have approx. 9,000 objects introspected in TDV and approx. 20,000 objects generated in the upper DV layers.)
Propagation of privileges to parent/child dependencies does not work when applied recursively on a folder. (This is a huge setback when working with a large number of objects organized semantically into subfolders.)
Lack of a command-line client interface for scripting as of version 8.4. (I had to write my own CLI; a rough sketch of such a wrapper appears after this list.)
TDV Studio's own code editors do an absolutely horrible job of handling indentation. The editor is also brutally slow and feature-poor.
Tracking privileges at the level of table/view columns causes occasional problems when regranting.
The compiler for TDV's stored programs ("SQL scripts" in its own terminology) leaves out many syntactic and semantic checks, making them hugely prone to run-time errors.
TDV Server's REST API is a feature-poor and flawed cousin of its SOAP API (as of version 8.4).
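On the missing CLI mentioned above: a homegrown wrapper of the kind the reviewer describes could be as simple as the sketch below. This is a rough illustration only; the endpoint path, default port, and SOAP payload are hypothetical placeholders, and a real client would be built against the TDV server's published WSDL rather than these stand-ins.

```python
# Hypothetical sketch of a homegrown CLI wrapper around a TDV web-service API.
# The URL path, port, and payload below are illustrative placeholders, not
# TDV's real contract -- consult the server's published WSDL for that.
import argparse
import requests

def call_tdv(host: str, port: int, user: str, password: str, body: str) -> str:
    """POST a SOAP envelope to a (hypothetical) TDV endpoint and return the response."""
    url = f"https://{host}:{port}/services"  # placeholder path
    envelope = f"""<?xml version="1.0"?>
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
  <soapenv:Body>{body}</soapenv:Body>
</soapenv:Envelope>"""
    resp = requests.post(url, data=envelope,
                         headers={"Content-Type": "text/xml"},
                         auth=(user, password), verify=False, timeout=30)
    resp.raise_for_status()
    return resp.text

if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Toy TDV CLI sketch")
    parser.add_argument("--host", required=True)
    parser.add_argument("--port", type=int, default=9402)  # placeholder default
    parser.add_argument("--user", required=True)
    parser.add_argument("--password", required=True)
    parser.add_argument("body", help="SOAP body fragment to send")
    args = parser.parse_args()
    print(call_tdv(args.host, args.port, args.user, args.password, args.body))
```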
Positives:
- Multi-user development environment
- Speed of transformation
- Seamless integration with other Informatica products
Negatives:
- There should be fewer windows, to maintain developers' focus; you will probably need two big monitors when you start development with Informatica PowerCenter.
- Oracle analytical functions should be usable natively.
- E-LT support as well as ETL support.
TDV's interface is a bit dated and not entirely intuitive. I would recommend a UX design review, as the interface is hard to understand for users without prior knowledge of TIBCO. Overall, I'd suggest more improvement here to ensure usability by a less technical audience.
PowerCenter is robust and fast, and it does a great job meeting all the needs, not just the most commercially vocal ones. In the hands of an expert power user, you can accomplish almost anything with your data. It is not for new or intermittent users; for them, the Cloud version is a better fit. Be prepared for costly connectors (priced differently for each source or destination you are working with), and plan your projects carefully so you are not paying for connectors you no longer need or want.
This product's performance is very consistent. It is extremely rare for templates to fail. I've been using this software for 5 years and find it to be both simple and powerful. The impact within the company has been very positive, as different processes in different areas, such as data analysis, development, and integrations, have been improved, and, best of all, users have not been affected. It connects to various systems in order to obtain information.
Informatica PowerCenter is a leader of the pack of ETL tools and has some great capabilities that make it stand out from other ETL tools. It has been a great partner to its clients over a long time, so it's definitely dependable. For all the great things about Informatica, it carries a bit of a technical burden that should be addressed to make it more nimble, reduce the learning curve for new developers, and provide better connectivity with visualization tools.
I have asked TIBCO technical support for help on only a few occasions, because I have adapted well to their tools, but on those few occasions when I have contacted their technical team I have received personalized, attentive, responsible service, and I am always assisted by staff who are experts on the topic. A TIBCO technical support technician spent more than an hour helping me solve a problem in the initial stage of implementation in my department, and this is something I always appreciate.
The training was helpful. I was able to understand how to use TIBCO for the data load process that we implemented and how to perform various troubleshooting steps based on the training I received. The technician was thorough and took the time to answer any questions. Once we were shown how to use TIBCO in the test environment, we were able to configure the production environment ourselves.
Other vendors have clearer, more visual implementation documentation. We also did not have our data architect and server administrator available full-time for implementation. In the future, we will secure the necessary internal resources.
While Talend offers a much more comfortable interface to work with, Informatica's forte is performance. And on that front, Informatica Enterprise Data Integration certainly leaves Talend in the dust. For a more back-end-centric use case, Informatica is certainly the ETL tool of choice. On the other hand, if business users would be using the tool, then Talend would be the preferred tool.
We did not need to evaluate another technology in the data virtualization category, since we are 100% sure of the capabilities and benefits we would get with TIBCO Data Virtualization, based both on its market positioning and on success stories from other companies of great renown worldwide. From the first day of use, it has met our needs and provided the expected solutions.
The data pipeline automation capability of Informatica means that few resources are needed to pre-process the data that ultimately resides in a Data Warehouse. Once a workflow is implemented, manual intervention is not needed.
PowerCenter did require more resources and time for installation and configuration than was expected/planned for.
The lack of (or minimal) support for unstructured data means that newer sources of dynamic, changing data cannot be easily processed or transformed through PowerCenter workflows.