Beyond Traditional Threads: Exploring the Unlimited Potential of Java Loom Coroutines (Fibers/Virtual Threads)

1. The development history of Java coroutine

The development process of Java coroutines can be summarized into the following stages:

  • In 1963, the concept of coroutines was formally proposed, and its birth was even earlier than threads.
  • In 2007, the Kilim project was released, which is a Java coroutine framework based on bytecode enhancement technology.
  • In 2014, the Quasar project was released, which is a Java coroutine framework based on Agent technology.
  • In 2016, the Project Loom project was launched, which is a project aimed at providing native coroutine support for Java.
  • In 2018, Kotlin 1.3 was released, making coroutines a stable feature in a JVM language that interoperates seamlessly with Java.
  • In 2022, Java 19 was released with the first preview of Project Loom's virtual threads (JEP 425), alongside an incubating structured concurrency API.
  • In 2023, Java 21 made virtual threads a standard feature (JEP 444), with structured concurrency and scoped values available as previews.
  1. Early Attempts: In early versions of Java, concurrency was achieved primarily through threads and synchronization mechanisms. Although Java provides multi-threading support, the high cost of creating and switching threads makes it hard to handle high-concurrency scenarios efficiently.
  2. Quasar: Quasar is a Java coroutine library developed by Parallel Universe, based on bytecode instrumentation. It brings a coroutine-like concept to Java by rewriting bytecode to implement lightweight tasks and task switching. Quasar allows many coroutines to run on a single thread, thereby avoiding the overhead of thread switching.
  3. Kotlin Coroutines (2017): Kotlin is a programming language that runs on the JVM, developed by JetBrains. Kotlin Coroutines is an asynchronous programming framework for Kotlin that allows asynchronous code to be written in a coroutine manner. Although Kotlin is a standalone programming language, it integrates seamlessly with Java, so you can use Kotlin Coroutines in your Java projects.
  4. Project Loom: Project Loom is an OpenJDK project dedicated to introducing lightweight threads (known as virtual threads) to Java. Its goal is to add fiber capabilities to Java without requiring changes to existing programs, providing an efficient and easy-to-use coroutine and lightweight threading model that addresses the challenges of Java concurrent programming.

For a long time, the Java standard library had no native support for coroutines. The projects and libraries above show the Java community's demand for efficient concurrent programming and its exploration of coroutine-style APIs; with Project Loom, the platform itself now provides a lighter-weight concurrency mechanism, and future releases may refine it further to give developers a more elegant and efficient concurrent programming experience.

2. Java Project Loom

Java Project Loom is a major effort in the Java ecosystem that aims to improve the execution model of the Java Virtual Machine (JVM) to support lightweight threads, thereby improving Java's performance and scalability for concurrent and parallel programming. This article describes the background and goals of Java Project Loom, its main features, and its impact on Java developers and applications.

1. Background

In Java development, the thread (Thread) is the most common concurrency mechanism: it allows a program to run multiple independent execution paths simultaneously. However, the traditional Java threading model has some problems. Each thread is mapped to a native operating system thread, which makes creating and destroying threads expensive. Moreover, since each thread occupies a certain amount of memory, creating a large number of threads under high concurrency can lead to excessive memory consumption and may even bring the system down.

In order to solve these problems, the Java Project Loom project came into being.

The main goal of Java Project Loom is to introduce a lightweight thread implementation, originally called "Fibers", to optimize Java's thread management and execution model. Fibers are user-mode threads managed by the Java Virtual Machine (JVM) and its runtime; they no longer need to be mapped one-to-one onto native operating system threads. As a result, the cost of creating and destroying fibers is greatly reduced, and a large number of fibers can be multiplexed over a small pool of operating system (carrier) threads, reducing memory consumption and improving performance.

2. Main Features

Java Project Loom brings many important features, the most notable of which are:

2.1. Fibers

Fibers are the core idea of Java Project Loom. They are a lightweight, user-mode thread implementation that can be created, suspended, resumed, and cancelled. Compared with traditional threads, fibers are far cheaper to create and destroy and reuse carrier-thread resources efficiently, so an application can run thousands or even millions of concurrently executing fibers without significant memory overhead. (In the final design, the standalone Fiber API was dropped and this capability is exposed as virtual threads on java.lang.Thread.)

2.2. Continuations

To support fibers, Java Project Loom introduces the concept of continuations. A continuation captures the execution state of a fiber when it is suspended and restores that state when the fiber is resumed. This provides an efficient mechanism for suspending and resuming fibers while avoiding the overhead of traditional thread context switching.
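For illustration only, the mechanism can be seen through the JDK-internal Continuation class that virtual threads are built on. This is not a public API, may change at any time, and requires exporting the jdk.internal.vm package; a rough sketch:

// Illustrative only: jdk.internal.vm.Continuation is an internal, unsupported API.
// Compile and run with: --add-exports java.base/jdk.internal.vm=ALL-UNNAMED
import jdk.internal.vm.Continuation;
import jdk.internal.vm.ContinuationScope;

public class ContinuationDemo {
    public static void main(String[] args) {
        ContinuationScope scope = new ContinuationScope("demo");
        Continuation cont = new Continuation(scope, () -> {
            System.out.println("part 1");
            Continuation.yield(scope);   // suspend: the execution state is saved here
            System.out.println("part 2");
        });
        cont.run();                      // prints "part 1", then yields
        System.out.println("suspended");
        cont.run();                      // resumes where it yielded, prints "part 2"
    }
}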

2.3. Virtual Threads

Java Project Loom also introduces the concept of virtual threads, which expose fibers through the familiar Thread API. Virtual threads can be created and managed dynamically according to the needs of the application, so developers can handle large-scale concurrency with a simple programming model and without worrying about low-level thread management.

2.4. Scoped Threads

Scoped threads are another important idea in Java Project Loom: they allow fibers to run within a bounded scope. A fiber is automatically cleaned up when its scope is exited, avoiding resource leaks and reducing the complexity of thread management.
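In current JDKs this idea surfaces as structured concurrency (a preview API, JEP 453): tasks forked inside a StructuredTaskScope cannot outlive the scope that created them. A minimal sketch, assuming Java 21 with --enable-preview and an import of java.util.concurrent.StructuredTaskScope:

// Sketch: structured concurrency (preview API); forked subtasks cannot outlive the scope
try (var scope = new StructuredTaskScope.ShutdownOnFailure()) {
    StructuredTaskScope.Subtask<String> user  = scope.fork(() -> "user-42");   // runs on a virtual thread
    StructuredTaskScope.Subtask<String> order = scope.fork(() -> "order-7");

    scope.join()             // wait for both subtasks to complete
         .throwIfFailed();   // propagate the first failure, if any

    System.out.println(user.get() + " / " + order.get());
} catch (Exception e) {
    e.printStackTrace();
}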

3. The impact of Project Loom

The introduction of Java Project Loom will have a profound impact on Java developers and applications:

3.1. Higher concurrent performance

By introducing lightweight Fibers, Java Project Loom will enable Java applications to handle a large number of concurrent tasks more efficiently, thereby providing higher concurrent performance and better scalability.

3.2. Lower memory consumption

Since fibers no longer need to be mapped one-to-one onto native operating system threads, the memory consumption of Java applications is significantly reduced in high-concurrency scenarios, which is especially important for resource-constrained environments and cloud computing platforms.

3.3. More concise code

The introduction of Virtual Threads and Scoped Threads will simplify the code logic of concurrent programming, allowing developers to focus more on business logic without paying too much attention to underlying thread management.

3.4. Better responsiveness

The improvements brought by Java Project Loom will make Java applications more responsive: even under high load and high concurrency, an application can continue to respond quickly to user requests.

Overall, Java Project Loom is an important step towards higher concurrency and better performance for the Java language. By introducing Fibers and related features, it will bring more powerful tools to Java developers, making it easier to develop efficient and highly concurrent Java applications. With the continuous development of the Java ecosystem, Java Project Loom will surely become an indispensable and important part of Java development.

4. Samples

Many applications do not use the Thread API directly, but instead use the java.util.concurrent.ExecutorService and Executors APIs. The Executors API has been updated with a new factory method that returns an ExecutorService which starts a new thread for each task. Virtual threads are cheap enough that a new virtual thread can be created for every task, so virtual threads never need to be pooled.

The following starts a virtual thread to print a message, then calls the join method to wait for the thread to terminate.

Thread thread = Thread.ofVirtual().start(() -> System.out.println("Hello"));
thread.join();

Here is an example that starts a virtual thread which sleeps and then enqueues an element. The main thread blocks on the queue, waiting for the element.

var queue = new SynchronousQueue<String>();
Thread.ofVirtual().start(() -> {
    try {
        Thread.sleep(Duration.ofSeconds(2));
        queue.put("done");
    } catch (InterruptedException e) { }
});

String msg = queue.take();

The Thread.Builder API can also be used to create a ThreadFactory. The factory created by the following snippet creates virtual threads named "worker-0", "worker-1", "worker-2", and so on.

ThreadFactory factory = Thread.ofVirtual().name("worker-", 0).factory();
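For illustration, such a factory can be passed to Executors.newThreadPerTaskExecutor (available in JDKs with virtual-thread support) so that every submitted task runs on a freshly created, named virtual thread; a minimal sketch:

// Sketch: every task submitted to this executor runs on a new virtual thread named worker-0, worker-1, ...
try (ExecutorService executor = Executors.newThreadPerTaskExecutor(factory)) {
    executor.submit(() -> System.out.println(Thread.currentThread().getName()));
}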

The following example uses the Executors API to create an ExecutorService that starts a new virtual thread for each task. The example uses a try-with-resources construct to ensure the ExecutorService has terminated before continuing.

ExecutorService defines submit methods to execute tasks. The submit method does not block; it returns a Future object that can be used to wait for the result or to obtain an exception. In early-access Loom builds, an additional submit overload accepted a collection of tasks and returned a Stream lazily populated with completed Future objects; note that this overload, and the Future.isCompletedNormally and Future.join helpers used below, did not make it into the final API.

The example also uses the invokeAll and invokeAny combinator methods to perform multiple tasks and wait for them to complete.

try (ExecutorService executor = Executors.newVirtualThreadExecutor()) {
    // Submits a value-returning task and waits for the result
    Future<String> future = executor.submit(() -> "foo");
    String result = future.join();

    // Submits two value-returning tasks to get a Stream that is lazily populated
    // with completed Future objects as the tasks complete
    Stream<Future<String>> stream = executor.submit(List.of(() -> "foo", () -> "bar"));
    stream.filter(Future::isCompletedNormally)
          .map(Future::join)
          .forEach(System.out::println);

    // Executes two value-returning tasks, waiting for both to complete
    List<Future<String>> results1 = executor.invokeAll(List.of(() -> "foo", () -> "bar"));

    // Executes two value-returning tasks, waiting for both to complete. If one of the
    // tasks completes with an exception, the other is canceled.
    List<Future<String>> results2 = executor.invokeAll(List.of(() -> "foo", () -> "bar"), /*waitAll*/ false);

    // Executes two value-returning tasks, returning the result of the first to
    // complete, canceling the other.
    String first = executor.invokeAny(List.of(() -> "foo", () -> "bar"));
}
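For comparison, in released JDKs (Java 21 and later) the factory method is Executors.newVirtualThreadPerTaskExecutor(), and results are collected through the standard Future, invokeAll, and invokeAny methods. A minimal sketch against the final API:

// Sketch against the final API (Java 21+): one new virtual thread per submitted task
List<Callable<String>> tasks = List.of(() -> "foo", () -> "bar");
try (ExecutorService executor = Executors.newVirtualThreadPerTaskExecutor()) {
    // Submit a value-returning task and wait for the result
    Future<String> future = executor.submit(() -> "foo");
    String result = future.get();

    // Execute both tasks and wait for both to complete
    for (Future<String> f : executor.invokeAll(tasks)) {
        System.out.println(f.get());
    }

    // Execute both tasks, returning the result of the first to complete and cancelling the other
    String first = executor.invokeAny(tasks);
    System.out.println(result + " " + first);
} catch (Exception e) {
    e.printStackTrace();
}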

3. How Spring Boot uses coroutines

1. Project Loom

Java Project Loom aims to improve the execution model of the Java Virtual Machine; its core concept is fibers, also known as lightweight threads, now exposed as virtual threads. Virtual threads provide a lightweight threading model that can efficiently create and manage a large number of concurrent tasks without consuming large amounts of system resources the way traditional threads do. You can use virtual threads in Spring Boot applications to get coroutine-like behavior. To do so, you need Java 19 or newer with preview features enabled, or Java 21 or newer, where virtual threads are a standard feature; Loom ships as part of the JDK rather than as a separate dependency.

Here is a simple example that runs a reactive-style publisher on a virtual-thread executor:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.SubmissionPublisher;
import java.util.concurrent.TimeUnit;

public class CoroutineExample {

    public static void main(String[] args) throws InterruptedException {
        // Each task submitted to this executor runs on its own virtual thread
        ExecutorService executorService = Executors.newVirtualThreadPerTaskExecutor();

        SubmissionPublisher<String> publisher = new SubmissionPublisher<>(executorService, 1);

        // Subscribe to the publisher (SimpleSubscriber is a user-defined Flow.Subscriber implementation)
        publisher.subscribe(new SimpleSubscriber());

        // Publish some data
        for (int i = 0; i < 5; i++) {
            publisher.submit("Data " + i);
        }

        // Close the publisher, then shut down the executor and wait for delivery to finish
        publisher.close();
        executorService.shutdown();
        executorService.awaitTermination(1, TimeUnit.SECONDS);
    }
}
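Beyond application code, recent Spring Boot versions can also serve web requests on virtual threads. With Spring Boot 3.2+ on Java 21 this is a single property (spring.threads.virtual.enabled=true); for embedded Tomcat it can also be done with a small customizer bean. A minimal sketch, assuming a Spring Boot application with embedded Tomcat:

// Sketch: run embedded Tomcat request processing on virtual threads (Java 21+)
@Bean
TomcatProtocolHandlerCustomizer<?> virtualThreadExecutorCustomizer() {
    return protocolHandler ->
            protocolHandler.setExecutor(Executors.newVirtualThreadPerTaskExecutor());
}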

2. Quasar Framework

Quasar is a Java-based coroutine library that provides implementation and management of coroutines. Using Quasar, you can create coroutines in Spring Boot applications to handle concurrent tasks.

To use Quasar, you need to add it as a dependency to your Spring Boot project. Then, you can use the API provided by Quasar to create, suspend and resume coroutines.

Here is a simple example showing how to use Quasar to implement coroutines in a Spring Boot application:

@SpringBootApplication
public class CoroutineExample {

    public static void main(String[] args) throws Exception {
        // Run the loop inside a Quasar fiber; Fiber.sleep suspends the fiber, not its carrier thread
        new Fiber<Void>(() -> {
            for (int i = 0; i < 5; i++) {
                System.out.println("Data " + i);
                try {
                    Fiber.sleep(1000);
                } catch (SuspendExecution | InterruptedException e) {
                    e.printStackTrace();
                }
            }
        }).start().join();
    }
}
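Note that Quasar relies on bytecode instrumentation: fibers suspend correctly only when the relevant classes have been instrumented, which typically means starting the JVM with the Quasar agent (for example -javaagent:path/to/quasar-core.jar, where the exact artifact name and version depend on your setup) or instrumenting the classes ahead of time during the build.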

4. Third-party integration

4.1.Vert.x

Vert.x is an event-driven, reactive toolkit and framework for building responsive, performant, and scalable applications on the JVM. It allows developers to write high-performance asynchronous applications in Java or other JVM languages, using an asynchronous, non-blocking programming model built around an event loop. Vert.x has begun integrating Java virtual threads: there is a virtual-thread incubator project that includes an async/await implementation, allowing developers to write asynchronous code in a style similar to JavaScript or C#, without callbacks or Futures.

Key Features and Benefits:

  1. Reactive and non-blocking: Vert.x employs an event loop and an asynchronous programming model that allows applications to process requests and events in a non-blocking manner, resulting in high throughput and low latency.
  2. Multilingual support: Although Vert.x is built in Java, it also supports other languages such as Kotlin, Groovy, and JavaScript. This allows developers to write applications in their favorite language.
  3. Built-in cluster support: Vert.x has built-in cluster support to run application instances on multiple nodes, enabling horizontal scaling and high availability.
  4. Rich components: Vert.x provides rich components and libraries, including HTTP server, WebSocket, message bus, database client, etc., enabling developers to quickly build various types of applications.
  5. Lightweight: Vert.x is a lightweight framework, not as bloated as some larger frameworks, and can run in environments with limited resources.
  6. Active community: Vert.x has an active open source community with continuous development and updates, keeping it at the cutting edge of technology, and has many contributors to support and expand it.

Vert.x is suitable for building various types of applications, especially those requiring high performance, high concurrency and real-time performance. It can be used to build web applications, API services, real-time communication applications, IoT applications, etc. If you’re interested in reactive programming and high-performance application development, Vert.x is worth checking out.

4.2. Jetty

Jetty is a lightweight Java web server and servlet container. Jetty also already supports Java virtual threads.

Virtual threads, introduced as a preview in Java 19, are supported in Jetty 10 and Jetty 11 starting with releases 10.0.12 and 11.0.12 respectively, as well as in Jetty 12.

When the JVM supports virtual threads and they are enabled in Jetty (see the Embedded usage and Standalone usage documentation), applications are invoked on virtual threads, which lets them use simple blocking APIs while still gaining the scalability benefits of virtual threads.
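As an illustration, the Jetty documentation describes enabling virtual threads for embedded servers by giving the thread pool a virtual-thread executor; a minimal sketch, assuming a Jetty release with virtual-thread support (the exact API may vary between versions):

// Sketch: dispatch application code to virtual threads in embedded Jetty (Java 21+)
QueuedThreadPool threadPool = new QueuedThreadPool();
threadPool.setVirtualThreadsExecutor(Executors.newVirtualThreadPerTaskExecutor());
Server server = new Server(threadPool);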

4.3. Tomcat

Tomcat is a widely used Java web server and servlet container. Recent Tomcat releases also support Java virtual threads, and the release notes and configuration documentation describe how to enable them.

4.4. Helidon

Helidon is a microservices framework that provides two programming models: Helidon SE and Helidon MP. Helidon SE is a lightweight framework based on a functional style that supports Reactive Streams and non-blocking IO. Helidon MP is a standards-based, annotation-driven framework that supports the MicroProfile APIs. Helidon has also integrated Java virtual threads and provides sample code showing how to use them.

4.5. Quarkus

Quarkus is a full-stack framework for cloud-native applications that offers high performance, a low memory footprint, fast startup, and hot reloading. Quarkus also supports Java virtual threads and provides documentation and guides on how to use them.
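As an illustration, Quarkus exposes virtual threads to REST endpoints through the @RunOnVirtualThread annotation; a minimal sketch, assuming a Quarkus version and extensions with virtual-thread support (see the Quarkus virtual threads guide for the exact setup):

// Sketch: a blocking-style JAX-RS endpoint that Quarkus dispatches to a virtual thread
import io.smallrye.common.annotation.RunOnVirtualThread;
import jakarta.ws.rs.GET;
import jakarta.ws.rs.Path;

@Path("/hello")
public class HelloResource {

    @GET
    @RunOnVirtualThread
    public String hello() {
        // Blocking work here parks only the virtual thread, not a platform (carrier) thread
        return "Hello from " + Thread.currentThread();
    }
}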