Container Orchestration

In this tutorial, we are going to discuss what container orchestration is.

Why Orchestrate?

So far we have seen that with Docker you can run a single instance of an application with a simple docker run command.

For example, to run a Node.js based application, you run the docker run nodejs command.

$ docker run nodejs

But that’s just one instance of your application on one Docker host. What happens when the number of users increases and that one instance is no longer able to handle the load?

You deploy additional instances of your application by running the docker run command multiple times.

$ docker run nodejs

$ docker run nodejs

$ docker run nodejs

So that’s something you have to do yourself: keep a close watch on the load and performance of your application and deploy additional instances as needed. And not just that, you also have to keep a close watch on the health of these applications.

And if a container were to fail, you should be able to detect that and run the docker run command again to deploy another instance of that application.
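As a rough sketch of what that manual work looks like (assuming the same nodejs image as above), you could list the containers that have stopped and, at best, lean on Docker’s built-in restart policy so a crashed container comes back on its own:

$ docker ps -a --filter "status=exited"    # list containers that have stopped

$ docker run --restart=always nodejs       # restart policy: Docker restarts this container if it ever exits

Even with a restart policy, this only covers a single container on a single host, which is exactly the limitation discussed next.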

What about the health of the Docker host itself? What if the host crashes and becomes inaccessible? The containers hosted on that host become inaccessible too. So what do you do to solve these issues?

You would need a dedicated engineer who can sit and monitor the state, performance, and health of the containers and take the necessary actions to remediate the situation.

But when you have large applications deployed across tens of thousands of containers, this is not a practical approach.

You could build your own scripts to tackle these issues to some extent. Container orchestration is simply a ready-made solution for exactly that.

Container orchestration

It is a solution that consists of a set of tools and scripts that can help host containers in a production environment.

Typically, a container orchestration solution consists of multiple Docker hosts that can host containers, so that even if one host fails, the application is still accessible through the others.

A container orchestration solution allows you to easily deploy hundreds or thousands of instances of your application with a single command. This is the command used with Docker Swarm:

$ docker service create --replicas=100 nodejs

We will look at the command itself in a bit. Some orchestration solutions can help you automatically scale up the number of instances when users increase and scale down the number of instances when the demand decreases.
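For illustration, assuming the service above was created with the name nodejs (for example with --name nodejs), changing the number of replicas in Docker Swarm is also a single command:

$ docker service scale nodejs=200    # scale up when the load increases

$ docker service scale nodejs=50     # scale back down when demand drops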

Some solutions can even help you automatically add additional hosts to support the user load. And it’s not just clustering and scaling: container orchestration solutions also provide support for advanced networking between containers across different hosts, as well as load balancing of user requests across those hosts. They also provide support for sharing storage between the hosts, along with configuration management and security within the cluster.
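As a rough Docker Swarm example of these networking and load-balancing features (the network name, service name, and port numbers here are only illustrative), you can create an overlay network that spans all hosts in the cluster and publish a port that is load balanced across every replica:

$ docker network create --driver overlay my-overlay-net    # overlay network spanning all hosts in the cluster

$ docker service create --name nodejs --replicas=100 --network my-overlay-net --publish 80:3000 nodejs    # requests to port 80 on any host are spread across the replicas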

Container orchestration solutions

There are multiple container orchestration solutions available today: Docker Swarm from Docker, Kubernetes from Google, and Mesos from Apache.

While Docker Swarm is really easy to set up and get started with, it lacks some of the advanced autoscaling features required for complex, production-grade applications.

Mesos, on the other hand, is quite difficult to set up and get started with, but supports many advanced features.

Kubernetes, arguably the most popular of them all, is a bit difficult to set up and get started with, but provides a lot of options to customize deployments and has support for many different vendors.

Kubernetes is now supported on all public cloud service providers like GCP, Azure, and AWS, and the project is one of the top-ranked projects on GitHub.
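For comparison, the Kubernetes equivalent of the Docker Swarm commands shown earlier would look roughly like this (the deployment name and image name are just placeholders):

$ kubectl create deployment nodejs --image=nodejs --replicas=100    # deploy 100 instances of the application

$ kubectl scale deployment nodejs --replicas=200                    # scale the deployment up to 200 instances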
