Workload characterization for microservices

Status:: 🟩
Links:: Microservices vs. Monolith

Metadata

Authors:: Ueda, Takanori; Nakaike, Takuya; Ohara, Moriyoshi
Title:: Workload characterization for microservices
Date:: 2016
URL:: http://ieeexplore.ieee.org/document/7581269/
DOI:: 10.1109/IISWC.2016.7581269

Notes & Annotations

Color-coded highlighting system used for annotations

📑 Annotations (imported on 2024-03-23#21:00:37)

ueda.etal.2016.workloadcharacterizationmicroservices (pg. 4)

In comparison with the monolithic implementations, the Node.js microservice implementation degraded throughput by up to 79.2% and the Java microservice implementation degraded throughput by up to 70.2%. This is a noticeably worse performance penalty than previously reported [3]. Given this result, we argue that Web-service developers should carefully consider the impact on performance before transforming a monolithic implementation into a microservice implementation.
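
The paper attributes most of this gap to inter-service communication (see the pg. 9 annotation below). A minimal Java sketch of the two call styles, assuming a hypothetical product lookup and service URL (not the paper's benchmark code):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class CallStyles {
    // Monolith: the lookup lives in the same process, so a request handler
    // reaches it with a plain method call.
    static String getProductInProcess(String id) {
        return "{\"id\":\"" + id + "\"}"; // placeholder business logic
    }

    // Microservice: the same lookup sits behind another service, so every call
    // pays for serialization, the HTTP stack, and a network round trip.
    static String getProductOverHttp(HttpClient client, String id) throws Exception {
        HttpRequest request = HttpRequest
                .newBuilder(URI.create("http://product-service:9080/products/" + id)) // hypothetical URL
                .GET()
                .build();
        return client.send(request, HttpResponse.BodyHandlers.ofString()).body();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(getProductInProcess("42"));
        // The remote variant only works with a product-service actually listening:
        // System.out.println(getProductOverHttp(HttpClient.newHttpClient(), "42"));
    }
}
```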

ueda.etal.2016.workloadcharacterizationmicroservices (pg. 4)

The Docker network configurations exhibited a non-negligible impact on performance. The bridge network that Docker uses by default exhibited up to 33.8% performance degradation in the Node.js implementations compared to the bare-process configuration. As expected, the Docker-host configuration always exhibited better performance than the Docker-bridge configuration because Docker host uses the interface of the host machine without virtualization. However, when the host interface is used, applications may conflict over network ports if they use the same port. From this performance trend, we argue that developers should select an appropriate Docker network configuration depending on whether they have to avoid port conflicts.
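
A sketch of that trade-off, assuming a hypothetical Java service on port 9080 and a hypothetical image name (the docker run flags in the comments are standard Docker options; none of this is the paper's code):

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

public class PortBindingDemo {
    public static void main(String[] args) throws Exception {
        // Docker-bridge (default): the container has its own network namespace,
        // so port 9080 must be published, e.g. `docker run -p 9080:9080 my-service`
        // (hypothetical image name), and traffic pays for the extra bridge hop.
        //
        // Docker-host: with `docker run --network host my-service` the process
        // binds directly on the host interface; faster, but the create() call
        // below fails with a BindException if anything else on the host
        // already occupies port 9080.
        HttpServer server = HttpServer.create(new InetSocketAddress(9080), 0);
        server.createContext("/", exchange -> {
            byte[] body = "ok".getBytes(StandardCharsets.UTF_8);
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
    }
}
```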

ueda.etal.2016.workloadcharacterizationmicroservices (pg. 4)

There are other interesting performance results. Java exhibited better performance than Node.js with 4 and 16 cores, except in the 16-core bare-process configuration. Java exhibited super-linear scalability on four cores. In contrast, Node.js outperformed Java on a single core.

ueda.etal.2016.workloadcharacterizationmicroservices (image) (pg. 5)

Throughput comparison of the Node.js and Java implementations under the bare-process, Docker-host, and Docker-bridge configurations. Values are throughput ratios relative to the Node.js monolithic implementation with the bare-process configuration.

ueda.etal.2016.workloadcharacterizationmicroservices (pg. 9)

The results show that the Liberty layer is the main time-consuming part of the implementations. The com.ibm.ws package includes the main implementation of Liberty Core. The java.util package also generated a long path because the application server uses the java.util package intensively. The org.jboss package consumed much more time, which indicates that the communication in the microservice architecture caused the major bottleneck.
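
For orientation: org.jboss is the namespace of JBoss libraries such as RESTEasy, a JAX-RS implementation. Assuming that is what handles the inter-service REST calls here (an assumption; the annotation only names the package), the time charged to it accumulates in calls of roughly this shape (hypothetical endpoint; requires a JAX-RS client implementation on the classpath):

```java
import javax.ws.rs.client.Client;
import javax.ws.rs.client.ClientBuilder;
import javax.ws.rs.core.MediaType;

public class InterServiceCall {
    public static void main(String[] args) {
        // Every inter-service request runs through the JAX-RS client stack
        // (connection handling, marshalling, HTTP I/O) before the downstream
        // service executes any application logic.
        Client client = ClientBuilder.newClient();
        try {
            String json = client
                    .target("http://customer-service:9080/customers/42") // hypothetical endpoint
                    .request(MediaType.APPLICATION_JSON)
                    .get(String.class);
            System.out.println(json);
        } finally {
            client.close();
        }
    }
}
```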

ueda.etal.2016.workloadcharacterizationmicroservices (image) (pg. 9)

Breakdown of Java implementations

ueda.etal.2016.workloadcharacterizationmicroservices (pg. 10)

Even though microservices can accelerate agile development, we must recognize the negative impact on performance.