Project Loom: New Java Virtual Threads

With Java 21's lightweight virtual threads, developers can build high-throughput concurrent applications with less code, easier maintenance, and improved observability.

For many years, the primary way to propose changes to the Java language and the JVM has been through documents called JDK Enhancement Proposals (JEPs). These documents follow a specific format and are submitted to the OpenJDK website.

While JEPs represent individual proposals, they are frequently adopted as groups of related enhancements that form what the Java team refers to as projects. These projects are named somewhat whimsically, sometimes after objects (Loom, the device that weaves threads into cloth), places (Valhalla, the fabled hall of Norse mythology), or the technology itself (Lambda).

Project Loom’s main objective is to enhance the capabilities of Java for concurrent programming by offering two key features: efficient virtual threads and support for structured concurrency.

Java Platform Threads

Every Java program starts with a single thread, called the main thread. This thread is responsible for executing the code within the main method of your program.

Tasks are executed one after another: the program waits for each task to complete before moving on to the next. This can lead to a less responsive user experience if tasks take a long time (e.g., network requests).
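
As a minimal sketch of this sequential model, the example below uses hypothetical fetchOrders and fetchInvoices methods as stand-ins for slow network calls; the main thread blocks on each one before starting the next:

```java
// Sequential execution: the main thread blocks on each call in turn.
public class SequentialExample {

    public static void main(String[] args) throws InterruptedException {
        String orders = fetchOrders();     // main thread waits here first...
        String invoices = fetchInvoices(); // ...and only then starts this call
        System.out.println(orders + ", " + invoices);
    }

    // Hypothetical slow operation, e.g. a network request.
    static String fetchOrders() throws InterruptedException {
        Thread.sleep(1_000);
        return "orders";
    }

    static String fetchInvoices() throws InterruptedException {
        Thread.sleep(1_000);
        return "invoices";
    }
}
```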

Both asynchronous programming and multithreading are techniques used to achieve some level of concurrency in your code, but they work in fundamentally different ways:

Asynchronous Programming focuses on non-blocking execution of tasks. It initiates tasks without waiting for them to finish and allows the program to continue with other work. This doesn’t necessarily involve multiple threads. It can be implemented even in a single-threaded environment using mechanisms like callbacks and event loops.
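
A rough sketch of this style in Java, using CompletableFuture with callback-like stages (fetchOrders here is a hypothetical slow operation), might look like this:

```java
import java.util.concurrent.CompletableFuture;

// Callback-style asynchronous code: the task is started, and follow-up work
// is attached as callbacks instead of blocking the calling thread.
public class AsyncExample {

    public static void main(String[] args) {
        CompletableFuture<Void> pipeline = CompletableFuture
                .supplyAsync(AsyncExample::fetchOrders)    // runs on a pool thread
                .thenApply(String::toUpperCase)            // callback when the result arrives
                .thenAccept(result -> System.out.println("Got: " + result));

        System.out.println("Main thread keeps going...");  // printed before the result

        pipeline.join(); // demo only: wait so the JVM doesn't exit before the callbacks run
    }

    // Hypothetical slow operation, e.g. a network request.
    static String fetchOrders() {
        try {
            Thread.sleep(1_000);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return "orders";
    }
}
```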

While asynchronous programming offers advantages, it can also be challenging. Asynchronous calls disrupt the natural flow of execution, and a task that would take 20 lines of straightforward synchronous code can end up scattered across callbacks in multiple files and threads. This complexity can significantly increase development time and make the actual program behavior harder to follow.

Multithreading focuses on concurrent execution of tasks. It creates multiple threads, each running its own instructions, allowing them to potentially execute at the same time (depending on available resources), so work can be divided and executed truly in parallel, as in the sketch below.
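
A minimal sketch of this traditional model, starting a platform thread by hand:

```java
// Traditional multithreading: each task runs on its own platform (OS-backed) thread.
public class PlatformThreadExample {

    public static void main(String[] args) throws InterruptedException {
        Thread worker = new Thread(() -> {
            // Runs concurrently with the main thread.
            System.out.println("Working on: " + Thread.currentThread().getName());
        });

        worker.start(); // the JVM asks the OS to create and schedule a native thread
        worker.join();  // main thread waits for the worker to finish
    }
}
```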

While the Java Virtual Machine (JVM) plays a crucial role in creating and running them, traditional Java platform threads map one-to-one to operating system threads and are ultimately scheduled by the OS scheduler.

As a result, creating and managing platform threads introduces overhead: startup time (around 1 ms per thread), memory (roughly 2 MB of stack per thread), and context switching whenever the OS scheduler changes which thread is running. If a system spawns thousands of threads, this adds up to a significant slowdown.

While multithreading offers potential performance benefits, it introduces additional complexity due to thread management and synchronization.

The question that arises is: how to get the simplicity of synchronous operations with the performance of asynchronous calls?

Java Virtual Threads

Enter Java virtual threads (JEP 425). Previewed in Java 19 and finalized in Java 21 (JEP 444) as part of Project Loom, virtual threads aim to reduce this overhead by being scheduled within the JVM itself, potentially offering much better scalability for workloads that spend most of their time waiting.

It’s important to note that Project Loom’s virtual threads are designed to be backward compatible with existing Java code. This means your existing threading code will continue to work seamlessly even if you choose to use virtual threads.
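
As a short sketch, assuming Java 21, the standard Executors.newVirtualThreadPerTaskExecutor() factory starts one virtual thread per submitted task, closely following the example shown in JEP 444; ten thousand virtual threads are cheap where ten thousand platform threads would not be:

```java
import java.time.Duration;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.stream.IntStream;

// One virtual thread per task: 10,000 mostly-blocking tasks, no thread pool tuning.
public class VirtualThreadExample {

    public static void main(String[] args) {
        try (ExecutorService executor = Executors.newVirtualThreadPerTaskExecutor()) {
            IntStream.range(0, 10_000).forEach(i ->
                    executor.submit(() -> {
                        Thread.sleep(Duration.ofSeconds(1)); // blocking call; the carrier thread is freed
                        return i;
                    }));
        } // closing the executor waits for all submitted tasks to complete
    }
}
```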

With traditional Java threads, when a server thread was waiting on a request, the underlying operating system thread was blocked and sitting idle as well.

Since virtual threads are managed by the JVM and decoupled from the operating system, the JVM can reassign the underlying compute resources while a virtual thread is waiting for a response.

This significantly improves the efficiency of computing resource usage.
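
The sketch below makes this visible: a virtual thread prints its name (which includes the carrier thread) before and after a blocking sleep. While it sleeps it is unmounted, and it may be resumed on a different carrier thread, though with an otherwise idle JVM it will often be the same one:

```java
// A virtual thread unmounts from its carrier while blocked and may resume on another one.
public class CarrierSwitchExample {

    public static void main(String[] args) throws InterruptedException {
        Thread vt = Thread.ofVirtual().start(() -> {
            System.out.println("Before sleep: " + Thread.currentThread());
            try {
                Thread.sleep(100); // unmounts; the carrier goes back to the scheduler's pool
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
            System.out.println("After sleep:  " + Thread.currentThread());
        });

        vt.join();
    }
}
```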

This new approach to concurrency is made possible by two ideas: continuations and structured concurrency.

A continuation is a programming technique that allows a program to pause its execution at a specific point and later resume at that same point, carrying the necessary context.

The continuation object is used to restore the thread's state, allowing it to pick up exactly where it left off without losing any information or progress.
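
Under the hood the JDK implements this with the internal jdk.internal.vm.Continuation class. The sketch below is illustrative only, not a supported API for application code, and as an assumption about that internal interface it needs the flag --add-exports java.base/jdk.internal.vm=ALL-UNNAMED to compile and run:

```java
import jdk.internal.vm.Continuation;
import jdk.internal.vm.ContinuationScope;

// Illustration only: internal JDK API, used here to show pause/resume semantics.
public class ContinuationExample {

    public static void main(String[] args) {
        ContinuationScope scope = new ContinuationScope("demo");

        Continuation cont = new Continuation(scope, () -> {
            System.out.println("step 1");
            Continuation.yield(scope);    // pause here, saving the current stack
            System.out.println("step 2"); // resumes exactly at this point
        });

        cont.run();                                        // prints "step 1", then pauses
        System.out.println("paused, doing other work..."); // the caller continues meanwhile
        cont.run();                                        // resumes after the yield, prints "step 2"
    }
}
```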

Structured concurrency (JEP 453) aims to provide a synchronous-style syntax for working with asynchronous tasks. This approach simplifies writing basic concurrent tasks, making them easier to understand and express for Java developers.

Structured concurrency simplifies managing concurrent tasks by treating groups of related tasks across different threads as a single unit. This approach makes error handling, cancellation, reliability, and observability all easier to manage.
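
A sketch of the Java 21 preview API (run with --enable-preview; fetchUser and fetchOrder are hypothetical blocking calls), closely following the pattern shown in JEP 453:

```java
import java.util.concurrent.StructuredTaskScope;

// Two related subtasks treated as one unit: both succeed, or both are cancelled.
public class StructuredConcurrencyExample {

    record Response(String user, String order) { }

    Response handle() throws Exception {
        try (var scope = new StructuredTaskScope.ShutdownOnFailure()) {
            var user = scope.fork(this::fetchUser);   // each subtask runs on its own virtual thread
            var order = scope.fork(this::fetchOrder);

            scope.join()             // wait for both subtasks
                 .throwIfFailed();   // if one failed, cancel the other and propagate the error

            return new Response(user.get(), order.get());
        } // leaving the scope guarantees both subtasks are done
    }

    // Hypothetical blocking calls, e.g. database or HTTP requests.
    String fetchUser()  { return "user-42"; }
    String fetchOrder() { return "order-7"; }
}
```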

Project Loom’s innovations hold promise for a wide range of applications. Vastly improved thread efficiency and reduced resource needs when handling many concurrent tasks translate to significantly higher throughput for servers, better response times, and improved performance for existing and future Java applications alike.
