Load Testing A Dockerized Application

7 min read
Jul 13, 2023 8:42:00 AM

The number of production services that are deployed and running in containerization platforms such as Docker has grown dramatically in recent years. With this radical shift in the deployment architecture, the need for robust load testing of applications and services while running in containers is crucial.

In this article, we’ll discuss some of the key considerations to keep in mind when load testing Docker containers and talk about how we might approach load testing an application that’s deployed in one or more containers.

What is Docker?

Docker is the leading containerization platform. Although there are other platforms available in the containerization market, Docker is by far the most widely used today.

Docker, like other containerization platforms, will package your application source code and all the required dependencies needed for it to run into a single container. The containers that Docker creates are standalone, lightweight packages of code that encompass everything needed for an application to run in isolation.

The main advantage of this approach is that the subsequent container should be able to run in any environment without considering the operating system or other host configuration.
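As an illustration, the packaging step is typically described in a Dockerfile. The sketch below shows a minimal multi-stage build for a hypothetical Java service; the image tags, paths, and artifact name are assumptions for illustration, not taken from any specific project:

```dockerfile
# Build stage: compile the application and its dependencies (image tags are illustrative)
FROM maven:3.9-eclipse-temurin-17 AS build
WORKDIR /app
COPY pom.xml .
COPY src ./src
RUN mvn -q package

# Runtime stage: ship only the JRE and the packaged application
FROM eclipse-temurin:17-jre
WORKDIR /app
COPY --from=build /app/target/app.jar app.jar
EXPOSE 8080
ENTRYPOINT ["java", "-jar", "app.jar"]
```

The resulting image contains everything the application needs to run, which is exactly why the container behaves the same on a developer laptop, a CI runner, or a production host.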


What are the Key Features and Benefits of a Docker Container?

Leveraging Docker containers enables software developers to focus on writing application code without the burden of worrying about how it will run in different environments. The performance and behavior of the containerized application running in a local environment should be close to production as long as the host computer where the containers are running is of a similar specification.

Some of the other key features of Docker include:

  • Portability – the same image runs on any host with a container runtime, regardless of the underlying operating system
  • Isolation – each container has its own processes, filesystem, and network stack
  • Lightweight footprint – containers share the host kernel, so they start faster and consume fewer resources than virtual machines
  • Versioned, layered images – images can be tagged, shared, and rolled back through registries such as Docker Hub

Considerations when Load Testing Docker Containers

As more and more web applications shift to a containerized landscape, the need to conduct thorough performance testing of the Docker containers that the application runs in becomes ever more apparent. 

We need to test our application as it will be running in production – i.e., inside a Docker container – to have confidence that it will perform as expected.


What are the Challenges of Load Testing Docker Containers?

In comparison to traditional load testing approaches, there are some challenges that we need to be aware of when it comes to load testing Docker containers.

Docker containers are designed to be scalable, whereby multiple instances of the same container will run in our environment simultaneously. We need to ensure that the container orchestration, which handles the scaling logic, is configured the same way in both the test and production environments. A few popular Docker orchestration solutions include:

  • Kubernetes
  • Docker Swarm
  • Amazon ECS
  • HashiCorp Nomad

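As a sketch of what that parity can look like with Docker Compose, the production replica count can be pinned in the compose file so the test environment runs the same topology (the service and image names here are hypothetical):

```yaml
# docker-compose.yml (fragment) – replica count matches production
services:
  web:
    image: my-app:latest    # hypothetical image name
    deploy:
      replicas: 3           # run the same number of instances as production
```

Depending on your Compose version, the same effect can be achieved ad hoc with `docker compose up --scale web=3`.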
As well as traditional load testing metrics such as response time and throughput, there are other container-specific metrics that should be monitored during the load test. These include container restart rates, resource usage per container, and container-specific logs.


We also need to remember that Docker containers share the host system’s resources – such as CPU, memory, and network bandwidth. Ideally, no other containers should be running and sharing resources on the host system during the test; otherwise, the load test results can become skewed in ways that we don’t expect.
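One way to keep results reproducible is to give the container under test explicit resource limits that mirror production, for example with `docker run` (the image name and limit values are illustrative; `--cpus` and `--memory` are standard Docker flags):

```shell
# Cap the container at 2 CPUs and 1 GiB of RAM so host
# contention doesn't silently skew the load test results
docker run -d --cpus="2" --memory="1g" -p 8080:8080 my-app:latest
```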

With that being said, another interesting test could be to assess the container’s performance when it is sharing resources with other containers on the same host. If you know that your containers will be sharing resources in the production environment, it can be worthwhile to simulate the same landscape in the test environment.

It’s common for applications with existing performance issues to be ported to containers without any significant refactoring taking place. Doing this means that the existing underlying performance problems of the application are transferred to the container, where they can actually become significantly more difficult to discover due to the additional layer of abstraction that has been added.


What are the Benefits of Load Testing Docker Containers?

Investing time and resources in the load testing of the Docker containers that will run our production application has some significant benefits.


Whilst we are busy developing our application, it typically won’t be running in a Docker container. When the application is deployed into production and ready to take live traffic, the application will likely be running in a container, which represents a different environment to the one it was developed in. We can’t be confident that the application will perform as expected unless we have explicitly run performance tests of the application while running in a Docker container.


Since Docker containers, by their nature, are designed to be scalable, load testing validates the scalability of our application when it is running in a containerized environment. By analyzing the performance metrics and resource utilization patterns during load tests, we can estimate the infrastructure requirements for scaling containerized applications.

Docker containers are often deployed in distributed environments and managed by container orchestration platforms. By introducing failures, such as container crashes during the load test, we can assess the application’s fault tolerance and behavior in adverse conditions.


Load Testing Docker Containers in Practice

How do we Prepare for Load Testing of Docker Containers?

We prepare for load testing of Docker containers in mostly the same way as we do for traditional performance or load tests, i.e.:

  • Defining clear load testing objectives and success criteria, such as the expected response time of transactions or the throughput the system should handle
  • Procuring and establishing a test environment that is representative of the production environment the application will eventually go live in
  • Choosing an appropriate load testing tool that is capable of generating sufficient traffic to test the system
  • Creating realistic user journey scenario scripts


In addition to the above, we must ensure that container-specific monitoring and logging tools are in place and configured for real-time insights into the containers’ performance during the load test. Container monitoring can be achieved using APM (Application Performance Monitoring) tools. A few popular choices include:

  • Datadog
  • Dynatrace
  • New Relic
  • Prometheus with Grafana dashboards

The strategy for scaling the containers during the load test (i.e., spinning up additional containers when traffic meets a certain threshold) should also be determined before we execute the load test.
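For example, if the containers are orchestrated with Kubernetes, that scaling threshold can be captured declaratively in a HorizontalPodAutoscaler. The deployment name and thresholds below are assumptions for illustration:

```yaml
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: my-app-hpa            # hypothetical name
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: my-app              # hypothetical deployment under test
  minReplicas: 2
  maxReplicas: 10
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70   # add pods when average CPU exceeds 70%
```

Pinning the scaling rules in configuration like this makes it easy to verify during the load test that new containers actually spin up at the expected threshold.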

Gatling is an excellent choice for load testing your application running in a Docker container. You can use Gatling to create user journey scripts through your application and then execute a stress test once the application is running in a container.

Here’s an example of a simple user journey Gatling script written in the Java version of Gatling that calls two different API endpoints with a 5-second gap between the calls:


import io.gatling.javaapi.core.*;
import io.gatling.javaapi.http.*;

import static io.gatling.javaapi.core.CoreDsl.*;
import static io.gatling.javaapi.http.HttpDsl.*;

public class MyFirstTest extends Simulation {

    // 1 Http Configuration
    private HttpProtocolBuilder httpProtocol = http
            .baseUrl("http://localhost:8080")
            .acceptHeader("application/json");

    // 2 Scenario Definition
    private ScenarioBuilder scn = scenario("My User Journey")  
            .exec(http("First API Call")
                    .get("/foo"))
            .pause(5)
            .exec(http("Second API Call")
                    .get("/bar"));

    // 3 Load Scenario
    {
        setUp(
                scn.injectOpen(atOnceUsers(1))
        ).protocols(httpProtocol);
    }

}
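The `atOnceUsers(1)` profile above is really only a smoke test. For an actual load test, the injection profile could be swapped for a ramped open model, as in the sketch below; the user counts and durations are illustrative, and the snippet assumes the same Gatling Java SDK as the script above:

```java
import java.time.Duration;

import io.gatling.javaapi.core.*;
import io.gatling.javaapi.http.*;

import static io.gatling.javaapi.core.CoreDsl.*;
import static io.gatling.javaapi.http.HttpDsl.*;

public class MyRampTest extends Simulation {

    private HttpProtocolBuilder httpProtocol = http
            .baseUrl("http://localhost:8080")   // assumed container port mapping
            .acceptHeader("application/json");

    private ScenarioBuilder scn = scenario("My User Journey")
            .exec(http("First API Call").get("/foo"))
            .pause(5)
            .exec(http("Second API Call").get("/bar"));

    {
        setUp(
                scn.injectOpen(
                        rampUsers(50).during(Duration.ofMinutes(2)),            // ramp up gradually
                        constantUsersPerSec(10).during(Duration.ofMinutes(5))   // then hold a steady arrival rate
                )
        ).protocols(httpProtocol);
    }
}
```

A gradual ramp like this is particularly useful against containers, because it gives the orchestrator time to react and lets you observe exactly when scaling kicks in.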


When you’re ready to start executing more realistic distributed load tests against your containerized application, Gatling Enterprise is a natural choice. The Enterprise version of Gatling enables you to deploy a distributed test environment in the cloud with multiple load injectors in different availability zones in just a few clicks. You can then execute a load test against your containerized application using the same Gatling user journey scripts created previously.


What Metrics do we Monitor when Load Testing Docker Containers?

We mentioned above the importance of monitoring during load testing of Docker containers, but what are the actual metrics that we need to look out for?

As with traditional performance testing, we need to monitor the resource utilization of the containers that are being tested. Standard metrics to monitor will include:

  • CPU usage
  • Memory usage
  • Network bandwidth consumed
  • Disk I/O operations

Docker containers additionally offer metrics specific to their runtime environment. These include metrics such as container start time, restart rate, and resource usage per container. It may also be necessary to monitor the specific logs and events of individual containers if we need to investigate any unexpected behavior. 
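At the most basic level, per-container figures can be sampled during the test run with Docker’s built-in `docker stats` command (this requires a running Docker daemon; the format string below is just one possible choice of columns):

```shell
# One-shot snapshot of per-container resource usage
docker stats --no-stream \
  --format "table {{.Name}}\t{{.CPUPerc}}\t{{.MemUsage}}\t{{.NetIO}}\t{{.BlockIO}}"
```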

The application performance monitoring tool that you utilize when executing load testing of Docker containers should be able to provide all of the above capabilities as standard. These APM tools work by instrumenting the Docker containers with a lightweight agent and then reporting the resource utilization metrics to a real-time dashboard.

Gatling facilitates load-testing APIs by simulating vast amounts of traffic from single or multiple load injectors. Whilst the stress test is executing, you can utilize the Gatling Enterprise advanced reporting dashboard to monitor the performance of your application in real-time, in conjunction with your chosen container monitoring tool that provides information on resource utilization metrics.


What are the Best Practices for Load Testing Docker Containers?

One of the most important aspects of load testing Docker containers is to ensure that we start load testing early on in the software development lifecycle. The best way to do this is to integrate load testing into your CI/CD process. Gatling’s load testing as code methodology means that it fits perfectly into your CI/CD automation pipeline.
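As a sketch of what that integration can look like using the Gatling Maven plugin, a CI job might build the image, start the container, and then run the simulation against it. The job structure, image name, and runner below are assumptions, not a prescribed setup:

```yaml
# Hypothetical GitHub Actions job: build the container, start it, load test it
jobs:
  load-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build and start the container
        run: |
          docker build -t my-app:ci .        # image name is illustrative
          docker run -d -p 8080:8080 my-app:ci
      - name: Run the Gatling simulation
        run: mvn gatling:test                # gatling-maven-plugin goal
```

Failing the pipeline when the simulation’s assertions are not met turns performance regressions into build failures rather than production incidents.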

By deploying your application into a Docker container and running load tests early on, you’ll be able to find and resolve any performance-related issues before they become too cumbersome. 


As well as closely monitoring the performance and health of the Docker containers during load testing, another important aspect of testing is the failover and recovery of your application. You should introduce failure scenarios into your load testing, such as container crashes, network failures, or other infrastructure disruptions, and then assess the impact of these events on the application and whether the system can recover gracefully.
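Container crashes are straightforward to simulate mid-test with the Docker CLI (the container name here is hypothetical):

```shell
# Kill one instance mid-test to observe failover behaviour,
# then check whether the orchestrator restarts or replaces it
docker kill my-app-instance-1
docker ps --filter "name=my-app"
```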

Your approach to load-testing applications running in Docker containers should be based on benchmarking load-test results. Gatling Enterprise enables you to quickly execute repeatable distributed load tests and compare the results with those of previous test runs in detail, directly in the advanced graphical reporting interface.


Conclusion

The requirement for performance testing of applications running in Docker containers is more important than ever. With so many applications now running on containerization platforms, load testing these containers is crucial to ensure scalability and performance.

Gatling enables you to create load-testing scripts that are capable of simulating high levels of traffic against applications running in containers. Our comprehensive documentation has everything you need to get started right away.

There are many advantages to containerizing applications, but ensuring that your application still performs optimally while running inside a container is critical to meeting the demands of your customers and users.