Oracle Data Integrator is an ELT data integration tool designed for interoperability with other Oracle programs. The product focuses on high-performance capacity to support Big Data use within Oracle.
N/A
Oracle Warehouse Builder
Score 8.7 out of 10
N/A
Oracle Warehouse Builder (OWB) is a data-warehousing centered data integration solution, from Oracle. It offers basic ETL functionality for building a simple data warehouse, as well as advanced ETL functionality supporting enterprise data integration projects, along with connectivity for Oracle and SAP applications.
N/A
HPE Zerto Software
Score 8.9 out of 10
Enterprise companies (1,001+ employees)
HPE Zerto Software aims to enable customers to run an always-on business by simplifying the protection, recovery, and mobility of on-premises and cloud applications.
N/A
Pricing
Oracle Data Integrator (ODI)
Oracle Warehouse Builder
HPE Zerto Software
Editions & Modules
No answers on this topic
No answers on this topic
No answers on this topic
Offerings
Pricing Offerings
Oracle Data Integrator (ODI)
Oracle Warehouse Builder
HPE Zerto Software
Free Trial
No
No
Yes
Free/Freemium Version
No
No
No
Premium Consulting/Integration Services
No
No
No
Entry-level Setup Fee
No setup fee
No setup fee
Optional
Additional Details
—
—
—
More Pricing Information
Community Pulse
Oracle Data Integrator (ODI)
Oracle Warehouse Builder
HPE Zerto Software
Considered Multiple Products
Oracle Data Integrator (ODI)
Verified User
Consultant
Chose Oracle Data Integrator (ODI)
Initially, Oracle's own ETL tool was Oracle Warehouse Builder. When Oracle built the Oracle Business Intelligence Applications Suite, it needed a strong ETL tool. As Oracle Warehouse Builder was not a strong ETL tool that customers preferred, and as Informatica had already captured …
ODI is the natural successor of OWB, adopting the same EL-T approach but supporting many more technologies as sources and targets. The overall product is much more stable and not tied to the Oracle Database. Unlike Informatica, ODI generates all the code in the native underlying …
Depending on investment, I would prefer Oracle Data Integrator, since it supports all versions of all types of databases, including Big Data, Hadoop, and NoSQL environments as well as the cloud, and is Oracle's strategic heterogeneous data integration product, as it is …
I think that Oracle Warehouse Builder is easier to use than Oracle Data Integrator (ODI). The connections in Oracle Warehouse Builder (OWB) are easier to understand when troubleshooting. Anything that makes troubleshooting easier gets higher marks in my book. ODI wins in the …
Oracle Data Integrator is well suited in all the situations where you need to integrate data from and to different systems/technologies/environments, or to schedule some tasks. I've used it on Oracle Database (Data Warehouses or Data Marts), with great loading and transforming performance, to accomplish any kind of relational task. This is true for all Oracle applications (like Hyperion Planning, Hyperion Essbase, Hyperion Financial Management, and so on). I've also used it to manage files on different operating systems, to execute procedures in various languages, and to read and write data from and to non-Oracle technologies, and I can confirm that its performance has always been very good. It can become less appropriate depending on the customer's budget, since its license costs are quite high.
The best place for Oracle Warehouse Builder is at the business IT level. It's not suited for business-level users, who are easily confused by it. One way to reduce the confusion for the developers is to set up the workspaces based on the requirements that are discovered in design sessions. Once this is complete, the implementation of Oracle Warehouse Builder can take flight and be successful.
Zerto is well suited for disaster recovery and virtual machine replication between multiple data centers. DR testing for audits or regulations is much easier with Zerto, with great reporting, dashboards, etc. It is not well suited for physical server replication for disaster recovery or as a primary backup solution.
Oracle Data Integrator addresses nearly every data issue that one can expect. Oracle Data Integrator is tightly integrated with the Oracle suite of products, which is one of its major strengths. Oracle Data Integrator is part of the Oracle Business Intelligence Applications Suite, which is widely used across various industries. This tool replaced Informatica ETL in the Oracle Business Intelligence Applications Suite.
Oracle Data Integrator comes with many pre-written data packages. If one has to load data from Excel to Oracle Database, there is a package readily available, cutting down a lot of effort on writing code. Similarly, there are packages for Oracle to SQL, SQL to Oracle, and all other possible combinations. Developers love this feature.
Oracle Data Integrator relies heavily on the database for processing. It is actually an ELT tool rather than an ETL tool: it first loads all the data into the target instance and then transforms it there, using the database's resources. This light footprint makes the tool very special.
The other major advantage of Oracle Data Integrator, like other Oracle products, is a readily available developer pool. As all Oracle products are free to download for demo environments, many organizations prefer to play around with a product before purchasing it. Also, Oracle's support and community are a big advantage compared to other vendors.
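The EL-T pattern described above can be sketched in a few lines. The following is an illustrative Python/SQLite sketch of the load-then-transform idea only, not ODI-generated code; the table and column names are hypothetical, and SQLite stands in for the target database engine.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Extract + Load: raw source rows land in a staging table untransformed.
cur.execute("CREATE TABLE stg_orders (order_id INTEGER, amount TEXT)")
cur.executemany(
    "INSERT INTO stg_orders VALUES (?, ?)",
    [(1, "10.50"), (2, "3.25"), (3, "10.50")],
)

# Transform: the work happens inside the target database engine,
# not in a separate ETL server process.
cur.execute("CREATE TABLE dw_orders (order_id INTEGER, amount REAL)")
cur.execute(
    "INSERT INTO dw_orders SELECT order_id, CAST(amount AS REAL) FROM stg_orders"
)
conn.commit()

total = cur.execute("SELECT SUM(amount) FROM dw_orders").fetchone()[0]
print(total)  # 24.25
```

The point of the pattern is the second step: the transformation is plain SQL executed by the target database itself, which is why the tool's own footprint stays light.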
Anyone with a large disk (VMDK) knows the issues with VMware snapshots. Most backup software performs a "point in time backup" that uses snapshots. Even when such backups run multiple times per day, the stress the snapshots place on the host and storage is eliminated by the continuous protection of Zerto log replication.
A client had the disks on a VM go missing for some reason. We had them "flip the switch" for a real failover and press the failover button. The VM on our DR site started to come alive as the VM at the customer site was brought down. When the DR VM was fully up, automatic reverse replication started. The DR machine was available in a few minutes (taking into account different host hardware) for access. Once the VMs at both sites were in sync, we had the customer repeat the failover process, and the DR-site VM was turned off and the production-site VM was brought back online. This was a 200 GB VM, and the whole process was finished in about 3 hours.
Zerto also allows for "test" failovers that can be configured across many different functions, such as host, datastore, network, and IP usage. Configuring the IPs is crucial to avoid inadvertent cross-site contamination from the same VM.
Zerto can also retrieve files from any VM disk on the DR site without starting a VM. Very handy for retrieving files or directories.
Since Zerto is running continuous log replication, changes on the production VM are nearly instantaneously copied to the DR site. As with any data process, having sufficient bandwidth for "churn" peaks minimizes the delay in updating the DR site.
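The bandwidth point above can be made concrete with a back-of-the-envelope calculation. The numbers below are hypothetical, not Zerto sizing guidance: if peak churn is 20 GB/hour, the replication link must sustain at least that rate or the DR site falls behind during the peak.

```python
# Illustrative sizing check: convert a churn rate (GB/hour) into the
# minimum sustained link throughput (Mbps) needed to keep up with it.
peak_churn_gb_per_hour = 20          # hypothetical peak churn
seconds_per_hour = 3600
bits_per_gb = 8 * 1024**3            # 1 GiB = 2**30 bytes, 8 bits per byte

required_mbps = peak_churn_gb_per_hour * bits_per_gb / seconds_per_hour / 1e6
print(round(required_mbps, 1))  # → 47.7
```

In other words, a 20 GB/hour churn peak needs roughly a 48 Mbps link just to break even, before protocol overhead or any headroom.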
ODI does not have an intuitive user interface. It is powerful, but difficult to figure out at first. There is a significant learning curve between usability, proficiency, and mastery of the tool.
ODI contains some frustrating bugs. It is Java based and has some caching issues, often requiring you to restart the program before you see your code changes stick.
ODI does not have a strong versioning process. It is not intuitive to keep an up-to-date repository of versioned code packages. This can create versioning issues between environments if you do not have a strong external code versioning process.
What I noticed is that sometimes OWB doesn't generate the best SQL in its packages, especially when there is a high number of source tables in the ETL. It would be nice if ETL developers were allowed to update the generated packages in the database directly.
Another thing: moving OWB ETLs from one database to another could be easier. For example, it would be nice to just copy the generated packages from one database to the other without deploying the ETLs through OWB.
It is maturing and over time will have a good pool of resources. Each new version has addressed the issues of the previous ones. It's getting better and bigger.
We really like the easy setup of this replication solution, as well as the ease of management. Not to mention, our internal IT economist determined that the Zerto solution would provide the best ROI out of the competing solutions we analyzed. So far, his calculations have been spot on, and we have saved substantially.
Zerto is very easy to implement and support. Its uses are broad; the only issue is that once something doesn't sync, it is difficult to get assistance until you reach tier 2 or tier 3 support. Basic file and folder recovery is great. Live and test failovers are also easy to implement without issue.
Overall support is very good. We sometimes get pushback when asking Level 1 support to escalate to Level 2. This causes undue frustrations when you need a more knowledgeable support person to get involved. We've had to escalate to account reps a few times for this scenario. Zerto is very responsive and normally handles our requests very quickly.
I have used Trifacta Google Data Prep quite a bit. We use Google Cloud Platform across our organization. The tools are very comparable in what they offer. I would say Data Prep has a slight edge in usability and a cleaner UI, but both of the tools have comparable toolsets.
We started out using Backup Exec which was in service until we virtualized our environment where it didn't perform as well at the time. Then we switched to Veeam which worked well, but then as we started needing to do migrations and off-site DR, we found ourselves relying on Zerto more often.
For my organization, the pricing model was an upfront investment for the Zerto licenses. My organization prefers to pay upfront and not deal with month-to-month or year-to-year pricing models that most companies are moving to. But for some, the investment may be more than they can afford, and would prefer the year-to-year pricing model.
I mean, it was 6 years ago, but we were up and going with all applications synchronizing in short order. The longest task was getting the 30 TB of application data synchronized between the datacenters.
Zerto is like having the best possible insurance ... it just works, and often provides the recovery points that are key to recovering data/work between overnight backups.
Zerto easily enabled the move of primary datacenters by allowing easy failover to a secondary site, and failback to the primary site.