The Amazon S3 Glacier storage classes are purpose-built for data archiving, providing low-cost archive storage in the cloud. According to AWS, the S3 Glacier storage classes provide virtually unlimited scalability, are designed for 99.999999999% (11 nines) of data durability, and combine fast access to archive data with low cost.
Microsoft System Center
Score 8.2 out of 10
Microsoft System Center Suite is a family of IT management software for network monitoring, updating and patching, endpoint protection with anti-malware, data protection and backup, ITIL-structured IT service management, remote administration, and more.
It is available in two editions: Standard and Datacenter. Datacenter provides unlimited virtualization for high-density private clouds, while Standard is for lightly virtualized or non-virtualized private cloud workloads.
If your organization has a lot of archival data that needs to be backed up for safekeeping, where it won't be touched except in a dire emergency, Amazon Glacier is perfect. In our case, we had a client that generates many TB of video and photo data at annual events and wanted to retain ALL of it, pre- and post-edit, for potential use in a future museum. Using the Snowball device, we were able to move hundreds of TB of existing media data, previously housed on multiple Thunderbolt drives, external RAIDs, etc., to Amazon Glacier in an organized manner. Then we were able to set up CloudBerry Backup on their production computers to continually back up any new media they generated during their annual events.
We previously used a product designed to prevent users from making changes and saving files to the desktop computer, and it required a license renewal. By using SCCM in our environment, we were able to discontinue that product, because SCCM allows us to completely restore a machine to its original configuration. We have taught our users to save their individual work on either a network drive or a cloud drive; that way, if we re-image their machine they lose no data, and it makes for a faster resolution. In some instances, having a computer in our SCCM environment can become cumbersome when creating new users for very specific purposes. It can be done by creating new organizational units and applying new policies, but in a pinch it can be frustrating. For the most part, we have tried to make "new"-purpose images and groups to at least accommodate a quick install.
It provides our users with the ability to deploy and manage our own software-defined datacenter, with understandable solutions for storage, compute, networking, and security.
We are able to update all the computers from all departments at once, without having to install the OS on every computer.
It allows us to have everything in one place for database management and datacenter inspection as well.
Accessing data stored in Glacier is slow. That shouldn't be a surprise, but it is undesirable nonetheless.
Retrieving a large amount of data can be expensive; Glacier's intended use is as an archive of rarely-accessed data.
Some users regard Glacier with fear and uncertainty. Slow retrieval times and high retrieval costs are the greatest risks of using Glacier, and they are also the aspects of Glacier that most users have the least experience with.
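One way to take the uncertainty out of retrieval cost is to estimate it before requesting a job. The sketch below is illustrative only: the per-GB prices and tier names are placeholder assumptions, not current AWS rates, so check the AWS pricing page for your region before relying on the numbers.

```python
# Illustrative per-GB retrieval prices in USD by tier.
# These figures are assumptions for the example, NOT actual AWS pricing.
RETRIEVAL_PRICE_PER_GB = {
    "expedited": 0.03,   # fastest, most expensive (assumed rate)
    "standard": 0.01,    # hours-scale retrieval (assumed rate)
    "bulk": 0.0025,      # cheapest, slowest (assumed rate)
}

def estimate_retrieval_cost(size_gb: float, tier: str) -> float:
    """Return an estimated retrieval cost in USD for the given tier."""
    return round(size_gb * RETRIEVAL_PRICE_PER_GB[tier], 2)

# Pulling 500 GB back out is not free at any tier:
for tier in RETRIEVAL_PRICE_PER_GB:
    print(tier, estimate_retrieval_cost(500.0, tier))
```

Running an estimate like this before choosing a retrieval tier makes the slow-and-cheap versus fast-and-expensive trade-off explicit instead of a surprise on the bill.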
Needs a web-based storefront for requesting new software
Needs the ability to manage the packaging workflow better
It is sometimes slow to download, and there is no indication that the entire catalog is still loading, leaving confused users unable to find common software in the available list.
It is not user-friendly for the most part. On the IT infrastructure side, it sometimes cannot handle excess requests: every few months you will need to upgrade server resources to keep up with incoming alerts and requests. This does not happen all of the time, but it does happen when there are too many requests.
If I had to dislike something about the system, it would be how much it changes once you upgrade. This may be more of a problem of mine, since I get used to one way of working and don't like it when it changes so much. I am enjoying the newest update, but it is a mess while you are actually going through the upgrades.
Since the rest of our infrastructure is in Amazon AWS, coding for sending data to Glacier just makes sense. The others are great as well for their specific needs and uses, but having *another* piece of third-party software to manage, be billed for, and learn can be costly in both money and time.
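For an AWS-native setup like the one described above, sending data straight to the Glacier storage class can be a single S3 PutObject call. The sketch below is a minimal example, not the reviewer's actual code: the bucket and key names are made up, and the boto3 call itself is left in a comment so the snippet stays runnable without AWS credentials.

```python
def glacier_put_kwargs(bucket: str, key: str, body: bytes) -> dict:
    """Build keyword arguments for an S3 PutObject that lands in Glacier."""
    return {
        "Bucket": bucket,
        "Key": key,
        "Body": body,
        # "DEEP_ARCHIVE" is cheaper still, at the cost of slower retrieval.
        "StorageClass": "GLACIER",
    }

# Hypothetical bucket/key names for illustration only.
kwargs = glacier_put_kwargs("example-archive-bucket", "events/2023/raw.mov", b"...")

# With AWS credentials configured, the actual upload would be:
#   import boto3
#   boto3.client("s3").put_object(**kwargs)
print(kwargs["StorageClass"])
```

Keeping the archive path inside the same SDK the rest of the infrastructure already uses is exactly the "no extra third-party software" advantage the review describes.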
We previously used a mix of FOG and Clonezilla to image machines. The biggest issue with these products was that changing one piece of the image required rebuilding the entire image. They also did not allow you to manage applications and Windows Updates, forcing IT to constantly touch machines after they were imaged and to manage them with a much more hands-on approach.
We have been able to automate our patch management, firmware updates, and other security tasks.
We have a standardized "image" ensuring our setup is consistent across the enterprise. This alone has saved us support time and time spent learning how to use our desktops.