Your network is continually growing in complexity. And while that means you have more resources for “doing,” you also have more to protect. As you add more applications and endpoints to your network, you need to know that you’re going to be able to manage and control them. But how can you do that without spending an unrealistic amount of time and money?

Common Deployment Scenario

To run, an application needs multiple services (e.g., a web server, a frontend, caching, and orchestration), all running on some machine, whether a virtual machine or a physical server in a data center. Running these services side by side can present a number of problems. For example:

  • Compatibility of each service with the underlying OS and its libraries and dependencies
  • OS compatibility issues that surface when a service needs an update
  • Conflicts between programs running on the same machine, such as clashing library dependencies or ports
  • Costs for running services on virtual machines or servers

Using containers can alleviate most of these problems.
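For instance, two services that depend on conflicting runtime versions can each ship with their own dependencies in separate containers. The following Compose file is an illustrative sketch (the service names, images, and commands are assumptions, not from a real project):

```yaml
# docker-compose.yml (illustrative): each service carries its own
# runtime and libraries, so version conflicts on the host disappear.
services:
  legacy-api:
    image: python:3.8-slim     # older runtime this service still needs
    command: python -m http.server 8000
    ports:
      - "8000:8000"
  frontend:
    image: node:20-alpine      # newer runtime for the frontend
    command: node server.js
    ports:
      - "3000:3000"
  cache:
    image: redis:7-alpine      # shared cache, isolated from both
```

Because each container is isolated, the two runtimes never see each other's libraries, and each service binds only the port it declares.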

Benefits of Containers

Containers are a streamlined way to build, test, deploy, and redeploy applications across multiple environments, from a developer's laptop to an on-premises data center and the cloud. Benefits of containers include:

  • Less overhead. Containers require fewer system resources than traditional or hardware virtual machine environments because they share the host's kernel instead of bundling a full operating system image.
  • Increased portability. Applications running in containers can be deployed easily to multiple different operating systems and hardware platforms.
  • More consistent operation. DevOps teams know applications in containers will run the same, regardless of where they are deployed.
  • Greater efficiency. Containers allow applications to be more rapidly deployed, patched, or scaled.
  • Better application development. Containers support agile and DevOps efforts to accelerate development, test, and production cycles.

Using Docker

Docker is a tool that allows you to easily create, run, and deploy your applications in containers. These lightweight containers can be deployed on a server without concern for the underlying system. Docker gives developers the power and flexibility to deploy in different environments without worrying about dependencies. Because Docker is an open-source technology built directly on top of Linux containers, it is much faster and more lightweight than virtual machines (VMs).
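As a sketch of how this looks in practice, here is a minimal Dockerfile for a hypothetical Python web service (the application file and dependency list are assumptions for illustration):

```dockerfile
# Start from a minimal official Python base image.
FROM python:3.12-slim

# Install dependencies first so this layer is cached between builds.
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code and declare how to run it.
COPY . .
EXPOSE 8080
CMD ["python", "app.py"]
```

The image is then built with `docker build -t myapp .` and started with `docker run -p 8080:8080 myapp`; the same image runs unchanged on a laptop, a data-center server, or a cloud host.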

Application Security

But what about security? With all these applications and their duplicated dependencies, how do you keep your system secure? One way to prevent security breaches is to regularly scan your images and compare their dependencies to a known list of common vulnerabilities and exposures (CVEs). The automatic detection of vulnerabilities helps increase awareness and best security practices across developer and operations teams. It encourages action to patch and address the vulnerabilities.

A more effective approach when using containers is static analysis of vulnerabilities in container images with CoreOS Clair.

Clair Vulnerability Scanning

Clair is an open-source vulnerability scanning platform by CoreOS that provides static analysis of Docker images. It's an API-driven analysis engine that inspects containers layer by layer for known security flaws. Clair scans each container layer and provides a notification of vulnerabilities that may be a threat, based on the CVE database and similar data feeds from Red Hat, Ubuntu, and Debian.
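As a rough sketch of what running Clair involves, the fragment below assumes the current Clair v4 release running all of its components in one process, backed by a PostgreSQL database. The hostname, credentials, and field values are illustrative, and the exact configuration keys depend on the Clair version you deploy:

```yaml
# config.yaml (illustrative): indexer, matcher, and notifier in one
# process, all sharing a single Postgres database.
http_listen_addr: ":6060"
log_level: "info"
indexer:
  connstring: "host=db user=clair dbname=clair sslmode=disable"
  migrations: true
matcher:
  connstring: "host=db user=clair dbname=clair sslmode=disable"
  migrations: true
notifier:
  connstring: "host=db user=clair dbname=clair sslmode=disable"
  migrations: true
```

Once the service is up, an image can be submitted for analysis with the `clairctl` helper (e.g. `clairctl report myapp:latest`), which indexes each layer and returns the vulnerabilities matched against the CVE feeds.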