Amazon S3 is a cloud-based object storage service from Amazon Web Services. Its key features are storage management and monitoring, access management and security, data querying, and data transfer.
Dell Avamar
Score 5.5 out of 10
Dell Avamar is a hardware and software data backup and deduplication product. When paired with Dell Data Domain, it provides protection and recovery through a complete software and hardware solution for virtual environments, remote offices, enterprise apps, NAS servers, and desktops/laptops.
Pricing

                                          Amazon S3 (Simple Storage Service)   Dell Avamar
Editions & Modules                        No answers on this topic             No answers on this topic
Free Trial                                No                                   No
Free/Freemium Version                     No                                   No
Premium Consulting/Integration Services   No                                   No
Entry-level Setup Fee                     No setup fee                         Optional
Additional Details                        —                                    —
Features
Data Center Backup
Comparison of Data Center Backup features of Product A and Product B
Amazon S3 (Simple Storage Service): 8.8 (11 Ratings), 2% above category average
Dell Avamar: 6.6 (23 Ratings), 25% below category average

Feature                              Amazon S3           Dell Avamar
Universal recovery                   8.7 (10 Ratings)    8.0 (21 Ratings)
Instant recovery                     8.2 (10 Ratings)    8.0 (20 Ratings)
Recovery verification                8.3 (7 Ratings)     7.0 (18 Ratings)
Business application protection      8.5 (7 Ratings)     8.0 (18 Ratings)
Multiple backup destinations         8.7 (10 Ratings)    7.0 (21 Ratings)
Incremental backup identification    9.2 (4 Ratings)     8.0 (22 Ratings)
Backup to the cloud                  8.9 (11 Ratings)    3.6 (10 Ratings)
Deduplication and file compression   8.8 (5 Ratings)     8.0 (23 Ratings)
Snapshots                            9.1 (7 Ratings)     5.0 (17 Ratings)
Flexible deployment                  9.1 (11 Ratings)    4.0 (13 Ratings)
Management dashboard                 7.9 (10 Ratings)    2.0 (14 Ratings)
Platform support                     8.7 (10 Ratings)    8.0 (15 Ratings)
Retention options                    9.5 (7 Ratings)     8.0 (16 Ratings)
Encryption                           9.7 (8 Ratings)     8.0 (15 Ratings)
Amazon S3 is a great service for safely backing up your data: redundancy is guaranteed and the cost is fair. We use Amazon S3 for data that we back up and hope we never need to access, but in the case of a catastrophic failure, or even a small slip of the finger with the delete command, we know our data and our clients' data is safely backed up in Amazon S3. Transferring data into Amazon S3 is free, but transferring data out has an associated, albeit low, cost per GB. Keep this in mind if you plan on transferring out a lot of data frequently; there may be more cost-effective options, although Amazon S3 prices per GB are really low. Transferring 150 TB would cost approximately $50 per month.
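As a rough sketch of how such a backup might be pushed to S3 with boto3 (the bucket name, file path, and storage class below are assumptions for illustration, not anything taken from this review):

    # Minimal sketch: push a local backup archive to S3 using boto3.
    # Bucket name and paths are placeholders; transfers *in* are free,
    # but retrieving (transferring out) the object later is billed per GB.
    import boto3

    s3 = boto3.client("s3")
    s3.upload_file(
        Filename="/backups/clients-archive.tar.gz",      # local backup archive (hypothetical)
        Bucket="example-backup-bucket",                   # placeholder bucket name
        Key="daily/clients-archive.tar.gz",
        ExtraArgs={"StorageClass": "STANDARD_IA"},        # cheaper class for rarely-read backups
    )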
This software is well suited for companies that want to be very much in control of their backups but need a simple tool. It can be convenient for them to just buy their own node, locate it at a different site, and set up jobs for their machines to back up to the cloud with the specific plugins. However, it would not be a good fit for companies that need a tool that is secure and compliant, and that need additional options for business continuity.
Fantastic developer API, including the AWS command line and library utilities (see the sketch after this list).
Strong integration with the AWS ecosystem, especially with regard to access permissions.
It's astoundingly stable: you can trust it to stay online and available from anywhere in the world.
Its static website hosting feature is a hidden gem: it provides perhaps the cheapest, most stable, highest-performing static web hosting available as a platform service.
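As an illustration of the developer API and the static hosting feature mentioned above, here is a minimal boto3 sketch; the bucket name and object keys are hypothetical placeholders, not anything taken from these reviews:

    # Minimal sketch of the S3 developer API via boto3.
    # "example-site-bucket" and the object keys are placeholders.
    import boto3

    s3 = boto3.client("s3")

    # Upload an object, then hand out a time-limited download link.
    s3.upload_file("index.html", "example-site-bucket", "index.html",
                   ExtraArgs={"ContentType": "text/html"})
    url = s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": "example-site-bucket", "Key": "index.html"},
        ExpiresIn=3600,  # link valid for one hour
    )
    print(url)

    # Enable static website hosting on the bucket (the "hidden gem" noted above).
    s3.put_bucket_website(
        Bucket="example-site-bucket",
        WebsiteConfiguration={
            "IndexDocument": {"Suffix": "index.html"},
            "ErrorDocument": {"Key": "error.html"},
        },
    )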
Avamar performs data deduplication on the remote host. This greatly reduces the amount of traffic that each backup requires. This even applies to the virtual environment through changed block tracking. Backup times are reduced from hours to minutes.
The management interface makes it easy to configure and maintain data retention periods. Many times certain data must be kept for an extended period of time. There is a specific menu for managing retention periods.
The system is able to recover itself from a hard failure with virtually no loss of backups. A checkpoint is taken each day that provides a recovery point in the event of a catastrophic failure. Since this is a node-based system, the loss of more than one node could require a recovery to be performed.
While another grid must be purchased, the replication utility allows all backups to be replicated to another grid at a remote location. This ensures the resilience of the backups in the event there is the loss of the primary data center.
It also works on HCI devices, performing image-level backups, as in our primary data center environment.
There is also now an All-in-One appliance for smaller locations.
The web console can be very confusing and challenging to use, especially for new users.
Bucket policies are very flexible, but the composability of the security rules can be very confusing to get right, often leaving buckets with effective security rules other than the ones you believe are in place (see the sketch after this list).
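To illustrate the kind of policy composition that trips people up, here is a hedged sketch: an explicit Deny in a bucket policy overrides any Allow granted elsewhere, so a statement like the one below silently blocks non-TLS access even for principals you thought you had allowed. The bucket name and the exact statement are assumptions for illustration only, not a recommended policy.

    # Sketch of a bucket policy where an explicit Deny overrides other Allows.
    import json
    import boto3

    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                # Deny any request that does not use TLS; this wins over every Allow,
                # which is easy to forget when other policies grant broad access.
                "Sid": "DenyInsecureTransport",
                "Effect": "Deny",
                "Principal": "*",
                "Action": "s3:*",
                "Resource": [
                    "arn:aws:s3:::example-backup-bucket",
                    "arn:aws:s3:::example-backup-bucket/*",
                ],
                "Condition": {"Bool": {"aws:SecureTransport": "false"}},
            }
        ],
    }

    boto3.client("s3").put_bucket_policy(
        Bucket="example-backup-bucket",
        Policy=json.dumps(policy),
    )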
The client interface has constant Java issues and can be slow and clunky. We have often had issues with current versions of Java breaking it so that it will not even run.
The backup clients are split out by function. Although this makes them lightweight, it also makes upgrading clients cumbersome. The naming scheme for the clients can also be confusing.
I have been using the product for over five years. It has performed so well that, with the current system reaching its End-of-Life with EMC next year, I have proposed replacing it with the latest version of the product. Now that it integrates with Data Domain, the cost has been greatly reduced: instead of needing to purchase many nodes, one Data Domain can replace them, creating significant cost savings.
It is tricky to get everything set up correctly, from bucket policies to IAM settings. There is also a lot of lifecycle configuration you can do in terms of moving data to cold/Glacier storage. It is also not to be confused with a OneDrive or SharePoint replacement; they each have their own place in our environment, and S3 is used more by the IT team and accessed by our PHP applications. It is not necessarily used by an average everyday user for storing their pictures or documents.
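For the lifecycle configuration mentioned above, a hedged boto3 sketch might look like the following; the bucket name, prefix, and day counts are assumptions chosen only to show the shape of the API:

    # Sketch of a lifecycle rule that tiers older objects to Glacier and
    # eventually expires them. Bucket name, prefix, and day counts are placeholders.
    import boto3

    s3 = boto3.client("s3")
    s3.put_bucket_lifecycle_configuration(
        Bucket="example-backup-bucket",
        LifecycleConfiguration={
            "Rules": [
                {
                    "ID": "tier-old-backups",
                    "Filter": {"Prefix": "daily/"},
                    "Status": "Enabled",
                    "Transitions": [
                        {"Days": 30, "StorageClass": "GLACIER"},  # move to cold storage after 30 days
                    ],
                    "Expiration": {"Days": 365},                  # delete after a year
                },
            ]
        },
    )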
The system overall is easy to monitor, and it is easy to see your backup/restore status. The user interface could use updating: it relies on Java, and any update to Java can cause the interface to stop working and need to be reinstalled.
AWS has always been quick to resolve any support ticket we have raised, and S3 is no exception. We have only ever used support once, to get clarification regarding the costs involved when data is transferred between S3 and other AWS services or the public internet. We got a response from the AWS support team within a day.
Support is very responsive and always eager to solve issues at the root rather than with a workaround. They don't hesitate to get on a WebEx call, dig into the issue, and recommend configuration changes to avoid further problems. We can also ask a few questions beyond the main issue, and they don't hesitate to answer.
Overall, we found that Amazon S3 provided a lot of backend features that Google Cloud Storage (GCS) simply couldn't compare to. GCS was far more expensive and really did not live up to expectations. In terms of setup, Google Cloud Storage may have Amazon S3 beat; however, as it is more of a pseudo-advanced version of Google Drive, that was not a hard feat for it to achieve. Overall, evaluating GCS in comparison to S3 was an utter disappointment.
Avamar has simplified the backup approach in its VE edition and is much easier to use than Data Protector. Backing up multiple VMs now takes minutes instead of hours. Creating policies, retentions, and schedules is vastly improved and much easier.
It practically eliminated some really heavy storage servers from our premises and reduced maintenance costs.
The excellent durability and reliability ensure a return on the money you invest.
Objects that are no longer active or have gone stale need to be removed; otherwise they keep adding cost to each billing cycle. If you are handling a really big infrastructure, this can sometimes create quite a large bill just for preserving unnecessary objects/documents.
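One hedged way to keep such stale objects from accumulating is a small cleanup script like the sketch below; the bucket name, prefix, and 180-day threshold are assumptions for illustration. A lifecycle expiration rule, as sketched earlier, is the more hands-off alternative.

    # Sketch: delete objects older than a cutoff so they stop accruing storage charges.
    from datetime import datetime, timedelta, timezone
    import boto3

    s3 = boto3.client("s3")
    cutoff = datetime.now(timezone.utc) - timedelta(days=180)

    stale = []
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket="example-backup-bucket", Prefix="daily/"):
        for obj in page.get("Contents", []):
            if obj["LastModified"] < cutoff:
                stale.append({"Key": obj["Key"]})

    # delete_objects accepts at most 1000 keys per call
    for i in range(0, len(stale), 1000):
        s3.delete_objects(
            Bucket="example-backup-bucket",
            Delete={"Objects": stale[i:i + 1000]},
        )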