Reviews (1-15 of 15)
- Converts data from multiple sources into one target format using business logic rules, and integrates with many DBMS types.
- Transformed data from DB2, SQL Server, and other Oracle databases into flat files, then used ETL jobs to load them into the Oracle DB target.
- Data Integrator and GoldenGate were used together to accomplish the data movement needed for business and data consolidation in a live environment. Data Integrator helped with development and reduced the lead time to convert data into the target format.
- It is still a complex product for ETL developers to use. It needed a lot of training and testing before it could be implemented in production data conversion.
- The initial setup was resource-intensive, both in machine capacity and in the manpower used on this project.
- A better GUI with fewer options should be designed for the big data integration tooling.
ODI helped me reduce a Slowly Changing Dimension Type 2 load, with the same output, from 22,000 seconds to 168 seconds. [I] managed to load 10,000+ files into a table from 1,000+ different sources in under 20 minutes, with approximately 300 GB of data per day.
- ODI is a tool that can talk, or learn how to talk, with any database or operating system in its own language; this is the power of ODI!
- I managed to connect to an Ingres database within 3 days (it is not supported out of the box).
- When a new version of a source and/or target database supports new data types, it takes me 5 minutes to implement them in ODI and immediately start using them.
- Flexibility, ease of customization, extensive features, ease of deployment, and the ability to access all kinds of different source system technologies. No extra hardware is needed for the transformation step.
- Easy to learn and develop with. It takes you about 3 days to learn ODI and become an "intermediate ODI developer" if you already know how to write SQL.
- Big data connectors have shipped out of the box since ODI 12c, with later versions supporting many well-known big data architectures.
- The Knowledge Module architecture helps you build your data integration activities with less effort.
- They need to work on the multiuser development environment and include the ability to comply with different kinds of SDLCs.
- You can switch execution to the source, staging area, or target to improve query performance. If you have to join data from different source systems, you can decide which data to move where and choose the placement that gives the best output.
- Variables can help you perform loops and conditional statements in packages to support ETL. What else do you need?
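The SCD Type 2 speedup mentioned in this review refers to the classic pattern of expiring a dimension row when a tracked attribute changes and inserting a new current version. A minimal in-memory sketch of that logic (field and function names are illustrative, not ODI's API, where a dedicated IKM handles this):

```python
from datetime import date

def scd2_merge(dimension, incoming, key, tracked, load_date):
    """Expire changed rows and insert new current versions (SCD Type 2)."""
    # index the current (open-ended) version of each business key
    current = {row[key]: row for row in dimension if row["end_date"] is None}
    for rec in incoming:
        existing = current.get(rec[key])
        if existing is None:
            # new business key: insert as the current version
            dimension.append({**rec, "start_date": load_date, "end_date": None})
        elif any(existing[c] != rec[c] for c in tracked):
            # a tracked attribute changed: expire the old row, add a new one
            existing["end_date"] = load_date
            dimension.append({**rec, "start_date": load_date, "end_date": None})
    return dimension

dim = [{"cust_id": 1, "city": "Oslo",
        "start_date": date(2020, 1, 1), "end_date": None}]
scd2_merge(dim,
           [{"cust_id": 1, "city": "Bergen"}, {"cust_id": 2, "city": "Lyon"}],
           key="cust_id", tracked=["city"], load_date=date(2021, 6, 1))
# dim now holds the expired Oslo row plus two current rows (Bergen, Lyon)
```

The dramatic runtime difference the reviewer saw typically comes from ODI generating this as set-based SQL (MERGE/UPDATE plus INSERT) instead of processing rows one at a time.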
- It helped me reduce a Slowly Changing Dimension Type 2 load, with the same output, from 22,000 seconds to 168 seconds.
- Loaded 10,000+ files into a table from 1,000+ different sources in under 20 minutes, with approximately 300 GB of data per day.
- It supports all platforms, hardware and OSs with the same software.
- It utilizes the source and target servers to perform complex transformations without needing an ETL server sitting between them, which makes the architecture simpler and more efficient.
- Data Connectivity is very good as it supports all RDBMSs.
- When you develop in ODI Studio, you periodically need to restart the program, as it occupies a lot of RAM.
- It is not meant for real time data integration.
- It is hard to maintain the security setup.
In my organization, my department develops business intelligence solutions for our worldwide customers, whose business can be in any kind of sector. In this scenario, ODI is being used to integrate data among all the pieces of the software architecture, connecting Oracle and non-Oracle technologies.
In my 6 years of experience, I used ODI to:
- feed relational environments, like Data Warehouse or Data Mart models, for reporting purposes
- feed Hyperion Planning, Hyperion Essbase, Hyperion Financial Management, or INFOR applications, for our customers' controllers
- read data from Oracle and non-Oracle technologies (e.g. SAP, SQL Server, INFOR, and many others), thanks to all of ODI's connectors and interpreters
- schedule tasks and workflows, independently of the operating system, thanks to ODI's agent (the service that controls all the tool's tasks).
ODI helps integrate data across different sources and targets, execute operating system commands, manage errors or discarded records, log all operations performed by the integration flows, and schedule those flows.
- It is simple and easy to learn, thanks to the small number of development object types it has (mainly interfaces, procedures, and packages, plus variables).
- It allows you to integrate data from and to any kind of technology. This is a strong point, since it can connect to practically any technology, Oracle or not.
- It allows you to create integration flows that carry any task through a series of steps.
- It provides both automatic and customizable handling of integration errors.
- If you are stronger at developing in hand-written code, ODI allows you to integrate anything through procedures.
- It's easy to back up, since it provides a native way to export all developments; its relational repositories (master and work) can also be exported from the database where they reside.
- The newest version (ODI 12c) was released with some minor bugs related to environment stability (e.g. connection loss).
- After developing in ODI Studio for a while, you need to restart the program, since it tends to occupy a lot of RAM.
- ODI Studio may need a machine with a lot of resources; otherwise it may become slow.
Oracle Data Integrator is well suited to all situations where you need to integrate data between different systems/technologies/environments or to schedule tasks. I've used it on Oracle Database (Data Warehouses or Data Marts), with great loading and transformation performance, to accomplish any kind of relational task. This is true for all Oracle applications (like Hyperion Planning, Hyperion Essbase, Hyperion Financial Management, and so on).
I've also used it to manage files on different operating systems, to execute procedures in various languages, and to read and write data from and to non-Oracle technologies, and I can confirm that its performance has always been very good.
It can become less appropriate depending on the expense the customer can afford, since its license costs are quite high.
ODI: Flexibility, reusability and performance for data integration with various source and target technologies.
- The E-LT approach, which first loads data into the target data server before doing the transformation, is a great architectural improvement over standard ETL tools that use a staging area and usually process the data in Java. With ODI, almost all the work is pushed down to the underlying technology, for instance the Oracle Database or a Spark server.
- The Knowledge Module approach provides an easy and reusable way to create our own integration strategies. It's easy to create these Knowledge Modules to connect to new technologies, for instance.
- ODI is really the tool for any kind of integration, because it speaks the language of whatever technology we connect to. We can work with RDBMSs, but also with Hadoop, cloud services, flat files, web services, etc.
- Continuous integration is missing and would be a really nice feature to enforce a good development lifecycle.
- Better handling of files and folders is needed, to be able to easily iterate through all the files in a folder.
- Security setup is not easy to maintain.
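The E-LT pushdown praised above can be sketched in miniature: the transformation is expressed as a single set-based SQL statement executed inside the target engine, rather than row by row in a middle-tier server. SQLite stands in for the target database here, and the table names are illustrative:

```python
import sqlite3

# staging and target tables live in the same target engine
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_orders (id INTEGER, amount REAL, status TEXT)")
conn.execute("CREATE TABLE fact_orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO stg_orders VALUES (?, ?, ?)",
                 [(1, 10.0, "OK"), (2, 5.0, "CANCELLED"), (3, 7.5, "OK")])

# the "transform" step is pushed down as one set-based statement,
# executed entirely inside the target database
conn.execute("""INSERT INTO fact_orders (id, amount)
                SELECT id, amount FROM stg_orders WHERE status = 'OK'""")
result = conn.execute("SELECT COUNT(*), SUM(amount) FROM fact_orders").fetchone()
# result == (2, 17.5)
```

The data never leaves the database engine during the transform, which is why no separate ETL server (or its hardware) is needed.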
- It's fast. If you have used OWB before, you will notice how much faster ODI is.
- The Knowledge Modules (KMs) are extremely useful.
- Integrates really well with the rest of the Oracle Technology Stack (GoldenGate, WebLogic, OEM, and, of course, the Oracle Enterprise Database).
- In ODI 12c, Oracle added a migration tool for OWB clients. It works, but not as well as we thought. We knew we would have to recreate the process flow - but then some of our OWB scripts didn't migrate well - and we ended up spending a lot more time than we initially thought [while] going through the entire migration.
- The installation isn't trivial. It took us a while to figure out which components need to reside where, how WebLogic fits in, and how to set it all up.
- The Oracle training for ODI is outrageously expensive - over 4K for a 5-day class.
- For us, it is particularly good for batch loading processes, because it is quite easy to connect different sources.
- Great for ETL processes, since it allows you to extract data from very different sources, perform some transformation, and load it to your final system.
- It was not difficult for our technical team to learn and adopt the tool.
- The main problem for us is that, from time to time, and for reasons we were never able to discover, ODI loses its connection and integration processes fail.
- I think there’s some room to improve the controlling of this process, but I don’t think they will perfect ODI in this way because Oracle has other licensed tools.
Oracle Data Integrator (ODI) is great for integration between different Oracle databases, and it also works well taking data from sources like Excel or SQL Server.
We're not so sure about ODI's performance when origins are "very far away", for instance on different servers connected through an MPLS link, or between an on-premise server and a cloud server. But perhaps that is due to a lack of knowledge on our part.
- Oracle Data Integrator is a very powerful tool. The graphical user interface simplifies the generation of complex SQL statements which can be used to extract and transform large amounts of data.
- ODI allows users to structure and schedule packages of code. It allows you to combine data extraction and transformation sequences based on business area, relationships, or whichever design technique best suits your organization.
- ODI is able to provide detailed logging information and send out alerts via email, simplifying the process of monitoring and debugging issues.
- ODI does not have an intuitive user interface. It is powerful, but difficult to figure out at first; there is a significant learning curve from basic usability to proficiency and mastery of the tool.
- ODI contains some frustrating bugs. It is Java based and has some caching issues, often requiring you to restart the program before you see your code changes stick.
- ODI does not have a strong versioning process. It is not intuitive to keep an up to date repository of versioned code packages. This can create versioning issues between environments if you do not have a strong external code versioning process.
The great quality of ODI is that it can connect to any database and any source (XML, TXT, SOZ), as both source and target. With ODI anything is possible; we have never been in the position where a customer asks us for something that ODI cannot do. Because we can modify its core, we can do anything.
ODI allows multidisciplinary development, its own versioning, and integration with any language (Python, Groovy, T-SQL).
- It connects to any source (database, files, web services) and target.
- Modify the logic of SQL with knowledge modules.
- Easy development.
- The product sometimes has small bugs, which are fixed with patches.
- Very complex queries are difficult to build in graphical mappings.
- It has versioning functionality, but you cannot develop the same object in parallel.
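The Knowledge Modules mentioned above are essentially parameterized code templates that ODI renders at runtime; customizing one means editing the template once rather than every mapping. A standalone sketch of the substitution idea (the template text and names are illustrative, not ODI's actual KM syntax):

```python
# a KM-style template: placeholders are filled in per mapping at run time
KM_TEMPLATE = "INSERT INTO {target} ({cols})\nSELECT {cols} FROM {source}{filter}"

def generate_load_sql(source, target, columns, where=None):
    """Render the loading SQL a customized KM step might produce."""
    return KM_TEMPLATE.format(
        target=target,
        source=source,
        cols=", ".join(columns),
        filter=f"\nWHERE {where}" if where else "",
    )

sql = generate_load_sql("stg_cust", "dim_cust", ["id", "name"], where="active = 1")
# sql == "INSERT INTO dim_cust (id, name)\nSELECT id, name FROM stg_cust\nWHERE active = 1"
```

Swapping in a different template (say, a MERGE-based incremental update) changes the strategy for every mapping that uses it, which is what makes the KM approach reusable.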
- Has many different source and destination connectors
- Manages workflow for data movement
- Many options in the data transformation toolbox
- Error handling is quite complicated
- Very cumbersome user interface
- Too much complexity makes easy jobs harder to maintain
ODI - not necessarily an enterprise solution, but good enough for smaller financial systems implementations
- Transforms and loads large amounts of data (gigabyte and up).
- Stores complex transformation logic in an understandable way that is easily updated after the fact.
- Manages multiple ETL user groups/credentials to segment data availability and execution pathways.
- ODI has a very arduous process for migrating components from one environment to another. I've found it to be error prone unless migrating a full schema which is not always a viable option.
- ODI has no web-based administration panel; all admin tasks must be handled through the locally installed client.
- The scheduler tool is difficult to initially set up and not intuitive to manage. Takes a while to stand up properly.
- Oracle Data Integrator addresses nearly every data issue one can expect. It is tightly integrated with the Oracle suite of products, which is one of its major strengths. Oracle Data Integrator is part of the Oracle Business Intelligence Applications Suite, which is widely used across industries; this tool replaced Informatica ETL in that suite.
- Oracle Data Integrator comes with many pre-written data packages. If one has to load data from Excel to Oracle Database, there is a package readily available, cutting down a lot of effort in writing code. Similarly, there are packages for Oracle to SQL Server, SQL Server to Oracle, and all other possible combinations. Developers love this feature.
- Oracle Data Integrator relies heavily on the database for processing. It is actually an ELT tool rather than an ETL tool: it first loads all the data into the target instance and then transforms it at the expense of database resources. This light footprint makes the tool very special.
- The other major advantage of Oracle Data Integrator, like any other Oracle product, is the readily available developer pool. As all Oracle products are free to download for demo environments, many organizations prefer to play around with a product before purchasing it. Also, Oracle's support and community are a big advantage compared to other vendors.
- The Java framework is not up to the mark. It crashes frequently. There is a long [list] of improvements required in the client tools framework.
- Though Oracle Data Integrator lets people log in from a web browser to monitor different activities, it crashes and performs very poorly when accessed from outside the network. Even the Oracle Data Integrator client tools don't perform well over a remote desktop. Increasing the Java buffer is one workaround that might help, so better client tools and interfaces would be a big improvement.
- Navigation is another pain point. It is very difficult to find the precise error in case of a failure in Oracle Data Integrator. In Informatica, it is a cakewalk to pull up a session log that shows the exact reason for a failure. Although Oracle Data Integrator is tightly integrated with the database, it is very poor at surfacing the exact error at a high level; one has to dig through to get it.
1. When you want to process structured data from different databases: Teradata, Exadata, DB2, SQL Server, Oracle, etc.
2. Oracle Data Integrator supports all platforms, hardware, and OSs. This is a major advantage compared to other leading tools.
3. The ELT architecture gives it cutting-edge performance over leading ETL tools. There is no need to place Oracle Data Integrator between source and target; ODI uses the source and target servers themselves to perform complex transformations.
4. Speeds up the development and maintenance by reducing the code that developers need to write.
- You need not worry about the technology; it supports basically anything under the sun that has a JDBC or ODBC driver.
- It comes with enough Knowledge Modules (KMs) out of the box, and we can change and customize them as required by the organization.
- It supports batch loads to most databases, which improves time and efficiency.
- It's not like other ETL tools that live in their own world. The transformations happen on either the source or the target, and we can use whichever best suits the technology we are working with.
- It does not have all the bells and whistles. It needs more integration with mainframes.
- Currently it does not have any KMs that load binary files in batch mode to any database.
- The UI could use a lot of changes, though those are routine improvements.
- Incremental data loads
- Complex Transformations
- Detailed documentation of LKM/IKM
- Troubleshooting of failed jobs
- More user friendly interface
- Hierarchy flattening transformations
- Integration issues with SAP HANA
- ELT functionality to improve the performance of the data load
- Able to work with different RDBMSs and file structures without any issues.
- Adaptability to learning the tool and implementing the projects required by the company.
- The GUI (ODI Studio) has garbage-collector problems, like freezing up if used for a long time without closing the studio.
- ODI 12c has addressed it, but 11g has a restriction on loading data into multiple targets in the same interface.
Which objects do what in ODI?
How are objects migrated between different environments?
How are agents set up and what are their uses? How do you schedule scenarios?
What different types of repositories does ODI use?