How To Mitigate Container Security Risks With Next-Gen Technology?


Are you planning to migrate your enterprise to the public cloud or a multi-cloud environment? Before you do, you should understand the benefits and responsibilities that come with whichever service model you choose. In this article, we analyze containerization technology to answer your questions.

What is containerization?

Containerization is the underlying technology that has accelerated the adoption of new methodologies to develop, deploy, and run applications in the cloud. In contrast to virtual machines (VMs), containers improve agility, scalability, portability, and cost efficiency by better utilizing the IT infrastructure.

Whether your goal is cost optimization, scalability, or elasticity, the cloud can allow your enterprise to adopt newer, cutting-edge technologies to innovate your business without the burden of having to own and manage everything in-house.

One of the areas that has seen exponential growth is application modernization. Cloud technology has enabled companies to speed up their application development processes by leveraging newer technologies and platforms, giving rise to so-called cloud-native applications. This is where containers come in.

How to migrate smoothly?

Enterprises often struggle to adapt to new management paradigms as they migrate their workloads to public clouds. When on-premises, they are solely responsible for everything (hardware, operating system, middleware, software, security, etc.), but once they migrate to the cloud, cloud service providers (CSPs) take over varying responsibilities depending on the service model (e.g., IaaS, PaaS, or SaaS). Although CSPs, particularly the hyperscalers like Amazon Web Services (AWS), Microsoft, and Google, provide some cloud-native security controls, they might not be enough to meet your security and compliance needs. It isn’t always clear where their security responsibilities begin and end.

When it comes to containers, it isn’t any different: some responsibilities fall to the provider, and you are accountable for the rest. One responsibility that falls squarely under your purview is container security. The question you should ask is: what exactly are the CSPs securing (and, therefore, what are they not)?

Common Container Security Risks

When it comes to securing your container environment, it’s important to look at all container components and understand what security controls, if any, the CSP will apply to each layer of your container environment. As specified by the National Institute of Standards and Technology (NIST), there are five risk areas that need to be secured in any given container environment:

  1. Container images
  2. Container registries
  3. Orchestrators
  4. Runtime security
  5. Host operating system (OS) security

The CSPs are typically responsible for securing their infrastructure (risk area #5), including deployment, configuration, and management of the underlying hardware and OS on which the containers will be running. Keeping the OS up to date and properly patched, hardening the hardware and other infrastructure maintenance tasks are usually part of their responsibilities. This is also true for typical Kubernetes or platform-as-a-service (PaaS) solutions (risk area #3), where the CSP not only provides the infrastructure security but also applies some security controls for the orchestration platform itself.

Your container environment must be secured layer by layer, and you must be absolutely clear about what is and isn’t your responsibility. The universal truth in cloud security is that cloud customers are ultimately responsible for securing their business data.

Container Security Responsibilities

In most cases, providing security for container images (risk area #1), container registries (risk area #2) and runtime containers (risk area #4) will fall under your responsibility.
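The split described above can be sketched as a simple responsibility map. The division shown here is purely illustrative, based on a typical managed-Kubernetes scenario; the actual split varies by CSP and service model:

```python
# Illustrative shared-responsibility map for the five NIST risk areas in a
# managed Kubernetes / PaaS scenario. This is a sketch, not a statement
# about any specific provider's terms.

RESPONSIBILITY = {
    "container images":     "customer",  # risk area 1
    "container registries": "customer",  # risk area 2
    "orchestrators":        "csp",       # risk area 3 (managed platform)
    "runtime security":     "customer",  # risk area 4
    "host os security":     "csp",       # risk area 5
}

def customer_owned(responsibility=RESPONSIBILITY):
    """Return the risk areas the cloud customer must secure."""
    return sorted(area for area, owner in responsibility.items()
                  if owner == "customer")

print(customer_owned())
```

Running the sketch lists the three customer-owned areas (images, registries, runtime), matching the split described above.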

With that in mind, let’s take a typical PaaS solution based on Kubernetes to illustrate this better. In this scenario, the provider is responsible for the infrastructure-as-a-service (IaaS) components (compute, network, storage), as well as managing the Kubernetes solution. The customer is then responsible for deploying, running, and securing their applications. Some of these security responsibilities are:

How does one perform Vulnerability Management?

A classic misconception in the market is that vulnerability scanning should only be done during the build phase — i.e., at the continuous integration/continuous deployment (CI/CD) level — but it is critical to use those same vulnerability scanning capabilities throughout the container life cycle, from the CI/CD tooling (build phase) to the registry (ship phase) and runtime (run phase).

When you cover the entire life cycle, you can easily be looking at thousands of vulnerabilities spanning your applications, containers, and the underlying host OS, especially once you break down the container images and discover that the base image, middleware, libraries, and dependencies are outdated. Now scale that to hundreds of images.

Then the question arises: what should we fix first? Identifying, analyzing, and prioritizing all vulnerabilities before remediation requires skilled resources, and performing true vulnerability management can be a daunting, time-consuming exercise.
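One common way to decide what to fix first is risk-based ranking. The sketch below sorts scanner findings by exposure, severity, and fix availability; the finding fields, IDs, and scores are hypothetical, and real scanners emit far richer data:

```python
# Illustrative vulnerability prioritization: sort scanner findings so that
# high-severity, internet-exposed issues come first. Field names and the
# sample findings are hypothetical, not from any specific scanner.

def priority(finding):
    # Higher tuple sorts first with reverse=True: exposed workloads,
    # then CVSS score, then whether a fix is already available.
    return (finding["exposed"], finding["cvss"], finding["fix_available"])

def prioritize(findings):
    return sorted(findings, key=priority, reverse=True)

findings = [
    {"id": "CVE-A", "cvss": 9.8, "exposed": False, "fix_available": True},
    {"id": "CVE-B", "cvss": 7.5, "exposed": True,  "fix_available": True},
    {"id": "CVE-C", "cvss": 9.1, "exposed": True,  "fix_available": False},
]

for f in prioritize(findings):
    print(f["id"])  # exposed findings first, highest CVSS among them on top
```

In this example the internet-exposed CVE-C outranks the technically more severe but unexposed CVE-A, which is the kind of judgment call a skilled analyst makes at scale.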

Protection of Application Container Workloads

Traditional security solutions like firewalls or intrusion prevention systems (IPS) do not provide the level of visibility needed in container environments, so additional specialized tooling may be required. This also calls for skilled resources with a keen understanding of cloud-native technologies to design, implement, and manage the proper policies to secure your applications during runtime.

With that said, it is important to establish security policies that block any activity deviating from standard behavior, including configuration drift and rogue container behavior, for example a container consuming more resources than it should or establishing network connections to unknown destinations. Having the correct policies in place should prevent all of this.
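A runtime policy check of this kind can be sketched in a few lines. The policy structure, thresholds, and observed values below are hypothetical, chosen only to mirror the two examples above (resource overuse and unknown network destinations):

```python
# Illustrative runtime policy check: flag a container that exceeds its
# expected resource budget or talks to destinations outside an allowlist.
# Policy fields, hostnames, and observations are hypothetical.

POLICY = {
    "max_cpu_percent": 80,
    "allowed_destinations": {"db.internal", "cache.internal"},
}

def violations(observation, policy=POLICY):
    found = []
    if observation["cpu_percent"] > policy["max_cpu_percent"]:
        found.append("cpu_over_budget")
    unknown = set(observation["destinations"]) - policy["allowed_destinations"]
    if unknown:
        found.append("unknown_destinations")
    return found

obs = {"cpu_percent": 95, "destinations": ["db.internal", "203.0.113.9"]}
print(violations(obs))  # both rules trip for this observation
```

Real policy engines evaluate many more signals (syscalls, file access, process trees), but the pattern of comparing observed behavior against a declared baseline is the same.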

Likewise, container security policies need to keep up with the application as it evolves. A management framework that anticipates application changes is desirable, with development and security teams sharing a direct line of communication so they can work proactively and prevent application misbehavior.

Continuous Event Monitoring

Applying container security is not as simple as running a tool. It requires active monitoring of real-time events to identify and prevent actual incidents, for example an attacker exploiting open ports to spread malware across your environment, build botnets, or mine cryptocurrency.

Whatever the malicious activity, without a well-established process for analyzing these events it could take days, weeks, or even months to detect. Hence the need for a threat monitoring solution that helps keep the environment compliant and secure.
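A minimal event-monitoring rule can be sketched as follows. The event shape, container names, and the choice of "suspicious" ports are all hypothetical, standing in for the open-port abuse described above:

```python
# Illustrative event monitor: scan a stream of runtime events and raise
# alerts for connections on ports associated with malicious activity.
# The event schema and port list are assumptions, for sketching only.

SUSPICIOUS_PORTS = {4444, 3333}  # ports often abused by reverse shells / miners

def alerts(events):
    out = []
    for e in events:
        if e["type"] == "connection" and e["port"] in SUSPICIOUS_PORTS:
            out.append(("suspicious_port", e["container"], e["port"]))
    return out

events = [
    {"type": "connection", "container": "web-1", "port": 443},
    {"type": "connection", "container": "web-1", "port": 4444},
]
print(alerts(events))
```

A production threat-monitoring solution correlates many event types over time rather than matching single events, but even this sketch shows why automation matters: no analyst can read raw event streams at container scale.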

How To Identify and Fill Security Gaps?

Container security involves many layers, so it is critical to understand the security capabilities of your CSP, identify whether they meet your organization’s container security and compliance needs, determine what gaps exist, and build a future-state road map to close them. Acquiring a few software licenses and purchasing a product that supports container security might not be good enough; you also need the resources, processes, and controls to continuously manage and monitor the security of your container environment.

What will its impact be on the shipping?

The impact of technology on the shipping industry will grow in the coming years as companies increasingly introduce cloud-based software and apps into their businesses. Cloud computing services make operations, communications, and collaboration easier for workforces across the planet.

It’s no surprise that more shipping businesses are now willing to use cloud-based systems, through which staff can access operations software on any computer with a web browser.

Proponents believe the technology solves a major problem facing the maritime industry because it facilitates increased ship capacity without requiring expensive infrastructure investments or personnel training.

For example, by keeping your data in the cloud, you could outsource tasks such as ship inspection and have the test results uploaded to the cloud instead of sending them via email. Email is no longer the best way to communicate within a company; it is error-prone and a relic of the 1990s.

Another change we will see is that shipowners will require employees who specialize in data and technology. If your job consists of sending emails and forwarding information from one point to another, you will no longer be needed, so you will need a different set of skills, for instance as a data analyst who reviews all the data to figure out what needs to change on the ship.


Source: SecurityIntelligence & Ship-Technology
