Mirantis Kubernetes Engine

Formerly Docker Enterprise

What is Mirantis Kubernetes Engine?

The Mirantis Kubernetes Engine (formerly Docker Enterprise, acquired by Mirantis in November 2019) aims to let users ship code faster. Mirantis Kubernetes Engine gives users one set of APIs and tools to deploy, manage, and observe secure-by-default, certified, batteries-included Kubernetes clusters on any infrastructure: public cloud, private cloud, or bare metal.

Pricing

Entry-level set up fee?

  • No setup fee
  • Free Trial
  • Free/Freemium Version
  • Premium Consulting / Integration Services

Starting price (does not include set up fee)

  • $500 per year per node

Product Details

Mirantis Kubernetes Engine Technical Details

Deployment Types: Software as a Service (SaaS), Cloud, or Web-Based
Operating Systems: Unspecified
Mobile Application: No

Frequently Asked Questions

The Mirantis Kubernetes Engine (formerly Docker Enterprise, acquired by Mirantis in November 2019) aims to let users ship code faster. Mirantis Kubernetes Engine gives users one set of APIs and tools to deploy, manage, and observe secure-by-default, certified, batteries-included Kubernetes clusters on any infrastructure: public cloud, private cloud, or bare metal.

Mirantis Kubernetes Engine starts at $500 per year per node.

Reviewers rate Support Rating highest, with a score of 7.8.

The most common users of Mirantis Kubernetes Engine are from Enterprises (1,001+ employees).

Reviews and Ratings


Fazeel Usmani | TrustRadius Reviewer
Score 9 out of 10
Vetted Review
Verified User
It is used in a project in our enterprise, where it helps us scale individual components independently. We only rarely faced issues with Mirantis Cloud Native Suite (Docker Enterprise); the rest of the time it worked properly. We can also keep it on autoscale, and it works really well at peak times.
  • Scale small components individually
  • Scale Independent native components
  • Creating docker images
  • Support for high-level components needs to be improved
  • Skewed scaling of components can be improved
  • Offline support
Mirantis Cloud Native Suite (Docker Enterprise) is well suited for extracting Docker images of large applications and testing them across various operating systems. It is well suited for business applications, customer applications, etc., but not for financial applications, where there may be security concerns, so use it wisely.
Anuj Rai | TrustRadius Reviewer
Score 6 out of 10
Vetted Review
Verified User
Docker Enterprise is quite a handy solution when it comes to containerizing your application, making it lightweight and easy to spin up and access. Currently, it is being used across the whole organization and [is a] solution for every kind of complex problem.
  • Easy to control.
  • Setting up network across different containers is quite easy.
  • Mapping of resources with host machine is easy.
  • Setting up networking from scratch is painful.
  • Resources required for setting up Docker Enterprise are huge.
  • User interface needs to be improved and made more user friendly.
Mirantis solution is really helpful when your critical application is containerised and if you are facing any kind of problems related to containers. You don't have to rely on the community for your issue, you can raise a ticket with the vendor and the resolution is quite fast.
It [is] quite expensive when it comes to pricing, and almost all the features can be utilized using the community edition, which is free.
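The "setting up a network across different containers" strength mentioned above boils down to a user-defined bridge network. A minimal sketch, assuming hypothetical network, container, and image names; if no Docker daemon is reachable, each command is printed instead of executed:

```shell
#!/bin/sh
# Sketch: connect two containers over a user-defined bridge network.
# "app-net", "api", "web", and the image names are hypothetical examples.
run() {
  if docker info >/dev/null 2>&1; then "$@"; else echo "+ $*"; fi
}

run docker network create app-net
run docker run -d --name api --network app-net example/api:latest
run docker run -d --name web --network app-net -p 8080:80 example/web:latest
# On app-net, "web" can reach the API simply as http://api/ by container name.
status=done
```

Containers on the same user-defined network resolve each other by container name, which is what makes this setup "quite easy" in practice.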

Carol Aleman | TrustRadius Reviewer
Score 9 out of 10
Vetted Review
Verified User
Through [Mirantis Cloud Native Suite (Docker Enterprise)] I have gotten rid of improperly signed and even unsigned container workloads. This tool is easy to configure, and I can easily tag workloads for execution on appropriate node types. Furthermore, it is very convenient for me to use specialized bare-metal hardware, including FPGAs and GPUs, which is great for machine learning and helps in scientific computing.
  • It is an advanced tool for balancing loads and managing routes.
  • It is easy to edit container contents.
  • Alerts are very useful and help us handle the entire network.
  • I am mostly satisfied with all of its features, but I have faced issues with continuous data storage; they offer features like Docker data volumes, but there is still much room for improvement.
  • Moreover, I am a happy user of this platform.
[Mirantis Cloud Native Suite (Docker Enterprise)] is the most advanced tool so far; it works like a VM and isolates any single application from its dependencies. This tool also helps me with the agile development of processes. It is strongly recommended to almost all major organizations.
March 15, 2021

Save space and time!

Score 9 out of 10
Vetted Review
Verified User
Docker is heavily used to containerize projects and deploy them to Kubernetes. It is helpful when developing microservices: due to Docker's isolation and portability features, it is easy to deploy, run, and get a microservice up in no time. Docker is being used across the whole organization and addresses the following business problems: building independent microservices, isolation, and easy portability.
  • Manage software application easily
  • Distribute apps within the team or organization
  • Saves space
  • Security is still a concern
  • Docker is difficult to use when using different operating system
  • Docker is an evolving technology which involves a learning curve
I would definitely recommend Docker to my colleagues if they are planning to build a microservice. The containers save not only space but also time. The ease of portability helps pass setups among the team and gets everyone ready in no time. It is a great way to save developers from repetitive tasks.
Score 7 out of 10
Vetted Review
Verified User
We use Swarm to deploy internal applications in a managed cluster. These applications are mission-critical and used across all departments in the organization. Swarm solves the problem of scaling and load balancing traffic and resources during peak business hours. It allows fast and simple automated deployment which drastically reduces the time spent on this task by developers.
  • Managing a cluster of micro-services.
  • Redundancy, fault-tolerance, and load balancing.
  • Small community - not a lot of information available outside of the documentation.
  • Logging - centralized logging for all applications is something we wish was available.
Swarm is well suited to deploying an infrastructure of distributed micro-services and orchestrating them from a central platform. It does this job very well, and if you're looking for this functionality, I highly recommend Swarm. However, if you're simply looking for load balancing and horizontal scaling, I think there are far better and even simpler tools more suited to that task.
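The scaling and load balancing this review describes maps onto a handful of Swarm commands. A hedged sketch (the service name, image, ports, and replica counts are illustrative, not the reviewer's actual setup); commands are printed rather than run if no Docker daemon is reachable:

```shell
#!/bin/sh
# Sketch: initialize a Swarm and run a replicated, load-balanced service.
# Service name, image, ports, and replica counts are illustrative.
run() {
  if docker info >/dev/null 2>&1; then "$@"; else echo "+ $*"; fi
}

run docker swarm init
# Three replicas behind Swarm's built-in ingress load balancer on port 8080:
run docker service create --name web --replicas 3 --publish 8080:80 nginx:alpine
# Scaling up for peak business hours is a one-liner:
run docker service scale web=6
status=done
```

Swarm routes traffic arriving on the published port to any healthy replica, which is where the fault tolerance and load balancing praised above come from.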
Score 10 out of 10
Vetted Review
Verified User
Docker is everywhere; there just isn't a server or an application of ours that is not present in Docker. It forms an integral part of our whole infrastructure. The beauty of Docker comes from its amazing qualities of being robust, easy to start, and very easy to blow away completely. It's the most powerful tool, and it just does magic for us.
  • Robust.
  • Easy to setup.
  • The kernel cannot be changed.
For CI/CD it is the best tool to use. If you need to manage an infrastructure with millions of machines, you need to start using Docker if you are not already.
August 14, 2019

Productivity Booster

Score 10 out of 10
Vetted Review
Verified User
Docker is used by most of our teams as part of their development and deployment practice. For development, it enables engineers to build applications in the same environment without worrying about local configuration issues. Nearly all of our CI/CD and infrastructure runs in Docker containers as well, which makes debugging production issues (especially around deployment) as simple as pulling down the right image and poking at it locally.
  • Isolation of dependencies.
  • "Black box" services like databases and packaged applications.
  • Infrastructure as code.
  • The CLI can take some getting used to if you aren't familiar with it.
  • For running many Docker containers locally, you'll need a lot of disk and RAM.
  • There are a lot of concepts to be familiar with when learning Docker, and the documentation could be more beginner-focused.
Docker is the best widely-used solution for isolated development environments and predictable deployments. However, for teams that work with only one specific technology stack, using Docker for local development can introduce more complexity. It shines when teams have to move between many different types of projects, but is "overkill" for a single application with minimal dependencies.
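The debugging workflow the reviewer describes, "pulling down the right image and poking at it locally", is a two-command affair. A sketch with a hypothetical registry, image, and tag; if no Docker daemon is reachable, the commands are printed instead of run:

```shell
#!/bin/sh
# Sketch: reproduce a production issue locally by pulling the deployed image
# and opening a shell inside it. Registry, image, and tag are hypothetical.
run() {
  if docker info >/dev/null 2>&1; then "$@"; else echo "+ $*"; fi
}

run docker pull registry.example.com/myapp:1.4.2
# -it gives an interactive shell; --rm cleans the container up on exit.
run docker run --rm -it registry.example.com/myapp:1.4.2 sh
status=done
```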
Matt James | TrustRadius Reviewer
Score 8 out of 10
Vetted Review
Verified User
We are currently using Docker in a test environment to deploy and monitor all of our servers/firewalls/switches/etc. throughout our company. We have a single server instance that houses all the containers and images. My department, the technology services department, is the only one that uses it, and as it is still only being tested, only one user is using/deploying/managing it—me. But it allows me to glance at each location to see if there are any issues that could potentially take down a site.
  • Usability is great after the initial setup.
  • Installation is a breeze.
  • The ability to knock down a container and rebuild it from scratch is fantastic.
  • It would be nice if Docker had its own frontend GUI.
  • The CLI is very difficult unless you have a decent amount of Linux experience.
  • Stacks are still a mystery to me.
It would really all depend on what they are looking to do. We are planning on using it as a monitoring tool for our locations. There are tons of different ways Docker can be used, so as I said, it depends on the use case. I use Docker not only for my company but at home as well, where it is a beautiful and amazing tool for HTPC users; I just wish I had found it sooner.
July 13, 2019

Linux everywhere!

David Tanner | TrustRadius Reviewer
Score 9 out of 10
Vetted Review
Verified User
Docker is used by our company to build our server deployment files and to run tests. This allows us to have confidence that our deployments will work correctly in our pull request tests. Developers can also be confident that the build will run the same every time no matter where the code is being run.
  • The OSX management tool is simple to use.
  • It is nice to be able to use custom repositories.
  • The service runs mostly in the background now, and I don't have to tinker with it.
  • Sometimes issues arise when running images that are only cleared by removing the cache and restarting the OSX app.
  • It is easy to build up a lot of containers that aren't being used, and you have to manually clear them up.
  • It would be nice to have a better graphical interface to see what is going on internally.
Docker gives developers flexibility and repeatable outcomes. It is very useful for developing with confidence and knowing that all environments will behave the same. Not all developers like to use Linux for developing, so being able to run a Linux instance on Windows allows team members to develop on their OS of choice.
Score 7 out of 10
Vetted Review
Verified User
We use Docker for WordPress development. It has replaced Vagrant on our development systems. We like the tighter integration with Windows Hyper-V and the performance is better than VirtualBox-based solutions. We are able to jump-start many projects with the Docker Compose files people contribute and publish on GitHub. Using Docker Compose we are able to create reliable, consistent, development scaffolding.
  • High performance.
  • Easy to configure consistent development systems.
  • Eliminates the "It works on my machine" problem.
  • Tighter integration with Windows 10 and WSL.
  • Better support for MacOS.
  • Better documentation.
It's excellent when you need to run several simultaneous containers and has much better performance than Vagrant or VirtualBox based solutions. It's easier to configure than VMWare. It doesn't seem appropriate if you need GUI access inside the container, I'm not sure that's even possible. SSH access is possible, but a little cumbersome.
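The Compose-based scaffolding this review relies on can be sketched in a few lines. This is a generic example in the spirit of the contributed Compose files mentioned, not the reviewer's actual configuration; image tags, ports, and credentials are illustrative. The `up` command is printed rather than run if no Docker daemon is reachable:

```shell
#!/bin/sh
# Sketch: a minimal docker-compose scaffold for WordPress development.
# Image tags, ports, and credentials are illustrative only.
tmp=$(mktemp -d)
cat > "$tmp/docker-compose.yml" <<'EOF'
services:
  db:
    image: mariadb:10
    environment:
      MYSQL_ROOT_PASSWORD: example
      MYSQL_DATABASE: wordpress
  wordpress:
    image: wordpress:latest
    ports:
      - "8000:80"
    environment:
      WORDPRESS_DB_HOST: db
      WORDPRESS_DB_PASSWORD: example
    depends_on:
      - db
EOF
if docker info >/dev/null 2>&1; then
  docker compose -f "$tmp/docker-compose.yml" up -d
else
  echo "+ docker compose -f $tmp/docker-compose.yml up -d"
fi
status=done
```

One file in version control gives every developer the same stack, which is exactly how Compose eliminates the "it works on my machine" problem cited above.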
Score 9 out of 10
Vetted Review
Verified User
Instead of using VMs for our testing environments in our automated pipelines, we use Docker containers to simplify and increase the efficiency of our testing. We needed a testing environment that worked both for Windows and Linux, so Docker was the best choice for our scenario. It is being used on a team-by-team basis.
  • Containers - Docker is the go-to when using containers, which are super useful if you need an environment that works for both Windows and Linux.
  • Efficiency - Docker is very lightweight and doesn't demand too much from your CPU or server.
  • CI/CD - Docker is excellent for plumbing into your build pipeline. It integrates nicely, is reliable, and has an easy setup.
  • Security - Since there's no true operating system, you're pretty limited when it comes to security in Docker. But that's the case with all containers.
  • Not totally isolated - Docker containers all use the same kernel, so if you've got multiple Docker containers up on one server, you could run into some issues.
  • Network connectivity - There's a fine line between limiting network access and allowing proper communication where needed, since you don't have a full OS with Docker.
Docker is great for when you would want to use a VM for any given application, but don't need the overhead of the whole OS. Docker containers use very little computing resources, boot up very quickly, and are very easy to set up. An instance where Docker may not be appropriate would be for an application that requires good security. If in this situation, a true VM would probably be your best bet.
Ben Lachman | TrustRadius Reviewer
Score 7 out of 10
Vetted Review
Verified User
We use Docker as part of a rapid deployment project that allows a service to be easily deployed directly onto VMs automatically during staging and production. It makes the management of the VM a parallel task to the deploy process. Traditionally the provision of a VM would be intertwined with the deploy process and containerization allows for these things to be decoupled.
  • Containerization - allowing multiple micro-services to function together without in-depth orchestration at the VM level.
  • Rapid deployment - a developer with appropriate access can simply push to the correct remote, and the deploy happens automatically from there.
  • Decouples provisioning from VM administration - allows containers to be deployed (mostly) regardless of VM setup.
  • Containers are often opaque - if a container doesn't work out of the box, it's messy to fix.
  • Logging is complicated by the multiple containers, and logs are often not piped to the places you expect them to be.
  • Networking is complicated by the internal port mapping between containers, etc.
Docker is great for staging and quickly deploying small to medium projects. With larger projects, it can become a significant challenge to manage all of the containers used for multiple microservices, keeping them up to date, secure and portable to other platforms. One of the goals of Docker is to allow the macro service to be platform agnostic and this can sometimes be more of a challenge than its long-term benefit.
Score 9 out of 10
Vetted Review
Verified User
We use Docker mainly for testing purposes. To avoid issues with local environments while testing our site, we use Docker images. This has many benefits: you can easily add/remove configurations and extras. For example on PHP you can try different images with different versions of PHP. Trying to achieve this on your localhost (for example with XAMPP or something more friendly) is very time-consuming.

I should say I know Docker is meant for something more pro and I'm a light user; I don't push a Docker image completely to a server, but for testing purposes it has been extremely useful. You can use CLI for changing things, you can create different databases, alter them and load them again, etc.
  • Creating and deleting "server" images is way easier than normal. You can change configurations and it basically creates a virtual machine on your computer, but WAY easier than using VMWare yourself. It's a layer on top of that.
  • Getting images is pretty easy; there are many on the internet, and you can get help from the community in cases where you are not sure what to do.
  • The commands in Docker work pretty well. There is good documentation and you can achieve almost anything considering a virtual machine.
  • Maintaining stability between environments thanks to the Docker app. You can have the SAME exact app on different systems (MacOS vs Win) and it will behave 99% the same.
  • As a NON-heavy user, I definitely find it a bit intimidating in the onboarding phase. It's hard to understand what everything is for and how to use it appropriately. As I wrote before, this could be because I'm not a hardcore developer myself.
  • At least on Windows 10, I always have problems turning it on. It has problems starting; I need to quit and start it again, and then it works. I'm supposed to be on a stable version, so I'm not sure if it's only me.
My use case is different from the "main" use case, but for me, Docker is great if you want to test different apps easily in local environments. I have never pushed an app to a server, but testing, creating, and deleting servers with different configurations with 2 commands is DEFINITELY easier than how I did it before. Creating environment variables and many configurations that can be shared across a simple file definitely makes things easier.

If you, like me, know something about developing but very little about Linux and distributions, be ready to test a lot of things and have a hard time achieving what you want. That's not Docker's fault, it's because it's meant for other users who are more "experts" in that field.
Score 10 out of 10
Vetted Review
Verified User
Docker is truly an amazing tool that is used across our organization. It gives developers tools to easily set up environments, deploy code, and build a CI pipeline. Open-source images and community support make it a great choice.
  • Setting up Docker containers helps developers replicate the production environment from their local machine in a virtual box. This helps keep development and debugging simple.
  • Portability is really helpful. You can easily shift from AWS to GCP within minutes.
  • Docker images are version-controlled just like GitHub commits.
  • User-friendliness - creating the virtual environment takes a lot more time than running a shell script to set up the environment.
  • Docker containers are for running applications and not for data containers. Having that feature would be awesome.
  • A prune command for Docker images and containers to force-delete everything as a cleanup.
It is best managed with cloud providers when setting up your CI pipeline. You would probably set up your images with access to the file system, volumes, and environment variables.
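Worth noting: Docker does ship `prune` commands close to the cleanup wished for above, and the file-system/volume/environment-variable setup mentioned maps onto plain `docker run` flags. A sketch (the image name `myapp:ci` and the test script are hypothetical); commands are printed rather than run if no Docker daemon is reachable:

```shell
#!/bin/sh
# Sketch: force-clean stopped containers and unused images, then run a CI
# image with a host volume and environment variables. "myapp:ci" and
# run-tests.sh are hypothetical examples.
run() {
  if docker info >/dev/null 2>&1; then "$@"; else echo "+ $*"; fi
}

run docker container prune -f     # remove all stopped containers
run docker image prune -a -f      # remove all images not used by a container
run docker run --rm \
  -v "$PWD/data:/data" \
  -e APP_ENV=ci \
  myapp:ci ./run-tests.sh
status=done
```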
Score 9 out of 10
Vetted Review
Verified User
Docker is being used across our organization for product development and deployment. We switched to Docker to replace a custom build-and-deploy solution. We have used it as part of our build-and-deploy system to make the process more flexible and decentralized. It has made it much easier to build out new data centers and is part of our internationalization strategy.
  • Docker provides encapsulation of our deployed software. This allows us to consistently deploy each of our services and webs in a customizable manner.
  • Docker makes it easy to build and release software in both development and production environments
  • Docker allows us to build common baseline environments for consistency across apps while adding app-specific customizations.
  • We have had problems using Docker for local development. There are issues with how it works with our asset build system.
Docker is well suited for providing a simple, standardized deployment of our systems that can be reliably reproduced in both development and production environments. Since using Docker, we have been able to decentralize the deployment reducing the load on our dev ops team and making continuous integration processes easier to establish and more flexible to use.
Nic Raboy | TrustRadius Reviewer
Score 10 out of 10
Vetted Review
Verified User
Docker is being used as a quick way for organizations to deploy Couchbase as a container, both independently and in an orchestrated environment. Because of how easy Docker containers are to create and destroy, it makes developer and operations work incredibly easy for prototyping in a consistent environment using the Docker runtime.
  • Container environment consistency
  • Lightweight deployments
  • Cross-platform
  • Hyper-V can cause problems for configuration on Windows environments
Docker is great for deploying headless applications like web applications, databases, etc., because it gives you a container environment that can be easily managed with a shell client. If you need to use a UI, it might make sense to use a VM instead.
Vlad VARNA | TrustRadius Reviewer
Score 10 out of 10
Vetted Review
Verified User
We are using Swarm for our analytics gathering service. Using swarm allows for quick workload scaling and using less hardware than was needed before.
  • Creating complex containers using docker files which automate a lot of DevOps manual labor
  • Having some preconfigured containers to do fast tests
  • The swarm takes away a lot of the work you would need to do for high availability
  • Kitematic UI is still very limited in functionality
  • Containers on Windows are somewhat hit and miss, Linux is strongly recommended
  • Swarm interface is mostly command line
  • Some network limitations (like remote client IP passthrough)
Great for light frontends and (REST) microservices that don't depend on hardware/drivers and just do DB/file IO. Not so great for dev virtual machines and testing complex network configurations.
Brian Dentino | TrustRadius Reviewer
Score 8 out of 10
Vetted Review
Verified User
Docker is used across our whole engineering organization. It is used to simplify packaging and deployment of the apps and services we develop. Using docker allows us to match our development environment more closely to production and run polyglot applications without worrying about cross-cutting software dependencies and server configuration.
  • Simple interface for defining and building an application runtime environment. This makes applications easy to inspect because aspects like exposed ports and environment variables can be defined declaratively and consistently.
  • Local environment parity with production. Docker manages dependency installation and allows you to easily run apps locally in the same environment as they run in production, giving you confidence that your app will work as expected when deployed and making configuration-related bugs easier to reproduce.
  • Makes applications easy to publish and distribute. Docker's image registry makes it extremely easy to publish your applications and distribute them securely. This makes deployment much simpler and provides version control for your application artifacts, making rollbacks very easy.
  • Docker has a bit of a learning curve, and it takes some time to become familiar with the tooling and syntax. Transitioning an existing architecture to docker can represent a significant investment.
  • Docker attempts to provide some level of cross-host container orchestration via swarm, but it falls short of third-party solutions like Kubernetes.
  • We occasionally run into stability issues when the docker daemon is subjected to high load (many applications starting/stopping frequently). In these cases, docker hangs and we have to restart or replace the node.
If you have an architecture that requires the use of multiple languages or many different microservices, docker is a great tool for managing the development and deployment of these services. It is also excellent for designing fault-tolerant production environments because 3rd-party orchestrators can be used to automatically replace failing applications with minimal server configuration. It may not be the best choice if you have a single monolithic application and a well-defined deployment pipeline.
January 30, 2018

Quick Docker Review

Score 10 out of 10
Vetted Review
Verified User
Docker is used across our whole engineering organization in order to have a consistent dev environment for local testing. We also use Docker for our microservices on Rancher. Docker is extremely useful, as we can easily spin up any sort of environment we want and create/test new features. The use of Docker also helps prevent those "it works on my computer" type of issues.
  • Flexibility
  • Ease of Use
  • Very powerful
  • Can be seen as a black box
  • Hard to debug if unfamiliar with it
  • Semi-steep learning curve
Docker is well suited if you want to test new technologies or just have a consistent test environment across different machines. Docker also allows you to easily share your current local environment with anyone else, regardless of their system. One drawback of Docker is the need to learn some of its quirks, such as how to map ports and IPs so they are accessible from your local machine. In cases where you don't need strict environment control and only need to do some quick tests, Docker can be overkill.
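The port-mapping quirk mentioned above is worth a concrete line: `-p` always reads host:container. A sketch with illustrative image and ports; the command is printed rather than run if no Docker daemon is reachable:

```shell
#!/bin/sh
# Sketch: -p HOST:CONTAINER port mapping. Image and ports are illustrative.
run() {
  if docker info >/dev/null 2>&1; then "$@"; else echo "+ $*"; fi
}

# Host port 8080 forwards to port 80 inside the container; binding to
# 127.0.0.1 keeps it reachable only from the local machine.
run docker run -d --name site -p 127.0.0.1:8080:80 nginx:alpine
# Then browse or curl http://localhost:8080 on the host.
status=done
```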
August 16, 2017

Docker FTW!

Score 10 out of 10
Vetted Review
Verified User
We use Docker to containerize our applications, we get many benefits from this such as:
  • consistent, repeatable deployment environments across dev, QA, prod - the same image used in dev is the exact same image deployed to production
  • better utilization of server resources
  • cross cloud compatibility
  • the ease of scaling applications
  • Docker makes deployments easier across environments.
  • Docker allows better utilization of server resources by easily allowing multiple applications (images) to run on the same server.
  • Docker makes it easy to scale our applications out.
  • Docker is somewhat new and new functionality comes with each release, sometimes it can be hard to stay on top of all the new features.
  • It would be nice if a full GUI based container management system came with Docker.
Docker is best suited for deploying Linux based apps. Eventually, it should (or will) be suited for Windows based apps as well.
Score 10 out of 10
Vetted Review
Verified User
Docker is used by both the dev team and the QA team on my project. For the dev team it's really useful, as they had a lot of issues prior to using Docker with the different setups the devs had: Win/Linux/Mac. After switching to Docker these issues disappeared.

For me as an automation QA lead, it's mainly used for our Selenium Grid. Our grid runs on AWS, and I configured it via Docker. I use docker-compose to start it up and to scale how many browsers should be started. Using only Docker was already a huge help, as we didn't really have to worry about the configurations and it was easy to use the same setup for more instances, but combined with the scaling option of docker-compose it proved to be really convenient.
  • Develop on multiple platforms. The same Docker image can be used on Linux/Mac/Windows.
  • Ease of configuration. It's very easy to create a base image for your project. There are a lot of already existing images you can use to start with.
  • Scalability. If you need more than just one instance of the same image, it's just a command to spin up more.
  • Finding the perfect configuration: it's very easy to find some basic configurations, but fine-tuning it can be challenging.
  • Understanding the concept can be difficult at first. Most of the questions I get from colleagues are along the lines of: what's happening inside the container, how can we see the logs of what happens inside, etc. Once you have the concepts, you can easily do these things, but it can be a rough beginning.
  • Sometimes difficult to set it up. I'm mainly hearing about this from colleagues using Windows.
I most certainly would encourage everyone to try it. It might not be a good fit for their needs, but knowing about it definitely helps. For me it's very useful because of the way we can set up Selenium Grid with it. As official images are released for it, setting up a working Selenium Grid can be done in 1-2 single commands. If you use Docker Compose it's even easier to spin it up, just create a YML file describing the browsers you want to use, and with one single line you can spin up a grid with X number of different browsers and browser instances.
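The "one YML file plus one command" Grid setup described above can be sketched as follows. This is a generic example, not the reviewer's actual file; the image tags and the `HUB_HOST` variable follow the Grid 3-era official Selenium images, and newer Grid versions use different settings. The `up` command is printed rather than run if no Docker daemon is reachable:

```shell
#!/bin/sh
# Sketch: a Selenium Grid via docker-compose, scaled to several Chrome
# nodes with one flag. Tags and env vars assume Grid 3-era images.
tmp=$(mktemp -d)
cat > "$tmp/docker-compose.yml" <<'EOF'
services:
  hub:
    image: selenium/hub:3.141.59
    ports:
      - "4444:4444"
  chrome:
    image: selenium/node-chrome:3.141.59
    environment:
      HUB_HOST: hub
    depends_on:
      - hub
EOF
# Spin up the hub plus five Chrome nodes:
if docker info >/dev/null 2>&1; then
  docker compose -f "$tmp/docker-compose.yml" up -d --scale chrome=5
else
  echo "+ docker compose -f $tmp/docker-compose.yml up -d --scale chrome=5"
fi
status=done
```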
Score 7 out of 10
Vetted Review
Verified User
Barclays, a large global financial services provider based in London, faced increasing regulatory pressure and market demands, led by industry disruptors offering modern, digital services. Looking to increase innovation and productivity, Barclays set out to build an Application Platform-as-a-Service as part of its cloud program. It used Red Hat OpenShift Container Platform, which incorporates Docker, along with other Red Hat solutions to update its IT infrastructure and adopt an agile, DevOps approach to application development, giving its developers on-demand, self-service capabilities. As a result, the bank improved its efficiency and agility to innovate faster and stay competitive.
  • Docker brings in an API for container management, an image format and a possibility to use a remote registry for sharing containers. This scheme benefits both developers and system administrators.
  • Docker allows for portability across machines. The application and all its dependencies can be bundled into a single container that is independent of the host version of the Linux kernel, platform distribution, or deployment model. This container can be transferred to another machine that runs Docker and executed there without compatibility issues.
  • Docker has a lightweight footprint and minimal overhead. Docker images are typically very small, which facilitates rapid delivery and reduces the time to deploy new application containers.
  • Docker allows for sharing. You can use a remote repository to share your container with others.
  • Docker provides great version control and component reuse. You can track successive versions of a container, inspect differences, or roll-back to previous versions. Containers reuse components from the preceding layers, which makes them noticeably lightweight.
  • Docker has got into the bad habit of wrapping open source Linux technologies and promoting them in a way that makes it feel like Docker invented it. They did it to LXC and they are doing it to aufs and overlayfs.
  • Docker is not very developer friendly.
  • Docker containers are currently for software, not for data.
  • New Docker versions cause breakage. You get all kinds of subtle regressions between Docker versions; it's constantly breaking unpredictable stuff in unexpected ways.
  • Docker does not have a command to clean up older images, i.e., no lifecycle management.
  • Lack of kernel support.
Each Docker container's purpose is to run a single application. As such, the scope of a Docker container is built around a particular application, as opposed to an entire operating system. The file system inside a Docker container is isolated to provide an environment similar to a VM. Docker further incorporates a container management solution that allows for easy scripting and automation. There is a strong focus on execution time for containerized applications and on the ease of scripting. For developers looking for a performance comparison between a Docker container and a virtual machine, the container will win every time. That being said, some applications don't respond well to running in a container, such as those that have high IO and need high-performance persisted data mounted across multiple nodes.
June 26, 2017

Testing with Docker

Score 7 out of 10
Vetted Review
Verified User
Docker is being used by us to create and throw away spaces as needed for testing. Instead of managing a huge hardware lab, we are able to "spin up" configurations as needed. If we need a new configuration to test against, we just build a new container. It makes life simpler.
  • Docker is fast.
  • Docker is well documented.
  • Docker has public container registries.
  • Docker storage is still hard.
  • Docker has poor monitoring.
  • Docker is platform-dependent.
One of the coolest things about Docker that people tend to overlook, I think, is the way it has made public repositories the go-to way to distribute and install software. I’m referring to Docker Hub, which hosts thousands of container images that anyone can grab in just a single command.
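That single-command workflow looks like this in practice (the image name is just a common public example from Docker Hub; this assumes a running Docker daemon):

```shell
# docker run pulls the image from Docker Hub automatically
# if it is not already cached locally, then starts a container.
docker run --rm -p 8080:80 nginx:alpine
```

One command fetches, unpacks, and runs the software, which is what makes the registry the default distribution channel.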
Eric Mann | TrustRadius Reviewer
Score 10 out of 10
Vetted Review
Verified User
We use Docker as the backbone of our hosted app infrastructure. Every element of our application is broken down into a microservice; these miniature services are then built into Docker containers and deployed directly to the AWS cloud. Docker lets us deterministically build, distribute, and deploy all of our services without any ambiguity as to what's being deployed and why.
  • Deterministic application state and deployments.
  • Consistent version history for previous builds.
  • Easy distribution mechanisms across the team.
  • Docker does update quickly, and sometimes updates to the engine break older container images.
  • Some of the changes to the Dockerfile structure are confusing and incompatible with older versions, which can be challenging for teams.
Docker is incredibly useful if you're deploying and hosting your own application infrastructure. It leads to reusable components that can be linked together in order to build a fully-functional, reusable system.

However, if your application is simpler and hosted on something like Elastic Beanstalk or AWS Lambda or RedShift, Docker might be overkill for the application development team.
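The deterministic build-and-distribute step this review describes might look roughly like the following (the registry hostname, service name, and tag are hypothetical; it assumes Docker is configured with push access to the registry):

```shell
# Build an immutable, versioned image and push it to a registry,
# so every deploy refers to one exact, reproducible artifact.
docker build -t registry.example.com/billing-service:1.4.2 .
docker push registry.example.com/billing-service:1.4.2
```

Pinning an explicit version tag, rather than reusing `latest`, is what removes the ambiguity about what is being deployed.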
Anudeep Palanki | TrustRadius Reviewer
Score 9 out of 10
Vetted Review
Verified User
Recently at Monsanto, there has been a big push toward a DevOps model and microservices. As our first step toward moving to the cloud, we started using Docker to spin up new databases for various microservices. When moving toward microservices, we need a simple and consistent way to spin up database instances that do not affect each other. We needed consistency because we want the instances to be the same across different environments.
  • Simple and reliable way to replicate instances.
  • Not needing to worry about the internal workings of an instance, as Docker takes care of managing it.
  • Very well documented API with large community support.
  • Verifiable Dockerfiles that let us see exactly what goes into an instance.
  • Managing backups of Docker instances does not scale well as the size of an instance grows; the entire instance needs to be stopped for the backup to happen.
  • While there are a lot of useful commands in the CLI, it is slow to evolve, leaving much to be desired. For example, executing commands on Docker instances and maintaining them requires hacks using the CLI.
  • Writing a Dockerfile and debugging it is not always intuitive, and requires some trial and error to get right.
Well suited for:

  • Small scale persistent databases.
  • Replicating the runtime environment.
  • It's also well suited for use with micro-services, where multiple small size databases need to spin up easily and consistently across environments.
It's less appropriate for database instances where backing up is not always scalable. It also does not fit well where monitoring instances is important, since that requires a lot of additional code to manage them.
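Spinning up a small, disposable database per microservice, as this review describes, can be a one-liner (the image, container name, port mapping, and password below are illustrative only; this assumes a running Docker daemon):

```shell
# Start an isolated PostgreSQL instance for one microservice;
# the same command yields an identical instance in every environment.
docker run -d --name orders-db \
  -e POSTGRES_PASSWORD=devonly \
  -p 5433:5432 \
  postgres:15-alpine
```

Because the image tag pins the exact database version, every environment gets the same instance, which is the consistency the review calls out.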