Reactive Programming with Spring Boot

πŸ”Ή What is Reactive Programming?

  • Reactive Programming is an asynchronous, non-blocking programming paradigm built around the idea of data streams and event-driven systems.
  • Instead of executing tasks sequentially and blocking threads (traditional imperative style), reactive systems:
    • React to events/data as they arrive.
    • Use publish-subscribe models.
    • Emphasize backpressure (consumers signal how much data producers may send, so fast producers don’t overwhelm slow consumers).

πŸ‘‰ In short: It’s about building resilient, scalable applications that can handle high concurrency with fewer resources.
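
A minimal Project Reactor sketch of backpressure (illustrative values only, not from any real application): the subscriber pulls data in batches instead of letting the producer push as fast as it can.

import org.reactivestreams.Subscription;

import reactor.core.publisher.BaseSubscriber;
import reactor.core.publisher.Flux;

public class BackpressureDemo {
    public static void main(String[] args) {
        Flux<Integer> fastProducer = Flux.range(1, 100);

        // The subscriber controls demand: it requests 10 items at a time
        // instead of receiving everything the producer can emit (backpressure).
        fastProducer.subscribe(new BaseSubscriber<Integer>() {
            private int received = 0;

            @Override
            protected void hookOnSubscribe(Subscription subscription) {
                request(10); // initial demand
            }

            @Override
            protected void hookOnNext(Integer value) {
                System.out.println("Got " + value);
                if (++received % 10 == 0) {
                    request(10); // ask for the next batch only when ready
                }
            }
        });
    }
}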


πŸ”Ή Why Reactive in Spring Boot?

Traditional Spring MVC (Servlet-based):

  • Each request blocks a thread until the response is ready.
  • Good for I/O-light apps, but struggles at scale when waiting on DB calls, API calls, etc.

Reactive Spring (WebFlux-based):

  • Built on Project Reactor (an implementation of Reactive Streams).
  • Uses an event-loop model instead of one-thread-per-request.
  • Non-blocking I/O β†’ better resource utilization.
  • Great when your app makes lots of remote calls (DB, APIs, message brokers), as shown in the sketch below.
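
As a rough sketch of the difference (assuming a hypothetical downstream endpoint at http://localhost:8081/orders), compare a blocking RestTemplate call with a non-blocking WebClient call:

import org.springframework.web.client.RestTemplate;
import org.springframework.web.reactive.function.client.WebClient;
import reactor.core.publisher.Mono;

public class BlockingVsReactiveClient {

    public static void main(String[] args) {
        // Blocking (Spring MVC style): the calling thread waits for the response.
        RestTemplate restTemplate = new RestTemplate();
        String blocking = restTemplate.getForObject("http://localhost:8081/orders", String.class);
        System.out.println("Blocking result: " + blocking);

        // Non-blocking (WebFlux style): returns a Mono immediately; the HTTP exchange
        // only happens when someone subscribes, without tying up the calling thread.
        WebClient webClient = WebClient.create("http://localhost:8081");
        Mono<String> nonBlocking = webClient.get()
                .uri("/orders")
                .retrieve()
                .bodyToMono(String.class);

        // In a real WebFlux app the Mono is composed into the reactive pipeline
        // rather than subscribed manually like this.
        nonBlocking.subscribe(body -> System.out.println("Reactive result: " + body));
    }
}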

πŸ”Ή Key Components in Spring Reactive

  1. Spring WebFlux
    1. Alternative to Spring MVC for reactive applications.
    2. Uses Netty (by default) instead of Tomcat/Jetty.
  2. Project Reactor (core reactive library in Spring)
    1. Mono β†’ 0 or 1 result (like Optional/CompletableFuture).
    2. Flux β†’ 0…N results (like Stream); see the sketch after this list.
  3. Reactive Streams API
    1. Interfaces: Publisher, Subscriber, Subscription, Processor.
    2. Defines how publishers and subscribers communicate asynchronously.
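
To make Mono and Flux concrete, here is a small self-contained sketch (the values are illustrative):

import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

import java.time.Duration;

public class MonoFluxBasics {
    public static void main(String[] args) throws InterruptedException {
        // Mono: at most one value, plus a completion or error signal.
        Mono<String> greeting = Mono.just("hello")
                .map(String::toUpperCase)
                .defaultIfEmpty("EMPTY");

        // Flux: 0..N values, lazily evaluated and composable with operators.
        Flux<String> names = Flux.just("Alice", "Bob", "Charlie")
                .filter(name -> name.length() > 3)
                .delayElements(Duration.ofMillis(100));

        greeting.subscribe(value -> System.out.println("Mono emitted: " + value));
        names.subscribe(name -> System.out.println("Flux emitted: " + name));

        // delayElements runs on a background scheduler, so keep the demo JVM alive briefly.
        Thread.sleep(500);
    }
}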

πŸ”Ή Example – Reactive REST API with Spring Boot

1. Add Dependency (pom.xml)

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-webflux</artifactId>
</dependency>

2. Reactive Controller

import org.springframework.http.MediaType;
import org.springframework.web.bind.annotation.*;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

import java.time.Duration;

@RestController
@RequestMapping("/users")
public class UserController {

    // Return a single user (Mono)
    @GetMapping("/{id}")
    public Mono<String> getUser(@PathVariable String id) {
        return Mono.just("User: " + id);
    }

    // Return multiple users (Flux)
    @GetMapping(produces = MediaType.TEXT_EVENT_STREAM_VALUE) // stream as Server-Sent Events
    public Flux<String> getAllUsers() {
        return Flux.just("Alice", "Bob", "Charlie")
                   .delayElements(Duration.ofSeconds(1)); // simulate streaming
    }
}

πŸ‘‰ /users β†’ streams users one per second as Server-Sent Events (non-blocking).
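
On the client side the same stream can be consumed without blocking; a minimal WebClient sketch, assuming the service runs on localhost:8080:

import org.springframework.http.MediaType;
import org.springframework.web.reactive.function.client.WebClient;
import reactor.core.publisher.Flux;

public class UserStreamClient {
    public static void main(String[] args) throws InterruptedException {
        WebClient client = WebClient.create("http://localhost:8080");

        // Each user is printed as it arrives (one per second), not after the whole response.
        Flux<String> users = client.get()
                .uri("/users")
                .accept(MediaType.TEXT_EVENT_STREAM)
                .retrieve()
                .bodyToFlux(String.class);

        users.subscribe(user -> System.out.println("Received: " + user));

        Thread.sleep(5000); // keep the demo JVM alive while events stream in
    }
}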


πŸ”Ή Reactive Database Example

Using Spring Data R2DBC (reactive, non-blocking DB access; requires the spring-boot-starter-data-r2dbc dependency plus a reactive driver such as r2dbc-h2 or r2dbc-postgresql):

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.annotation.Id;
import org.springframework.data.relational.core.mapping.Table;
import org.springframework.data.repository.reactive.ReactiveCrudRepository;
import org.springframework.stereotype.Service;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

@Table("users")
class User {
    @Id
    private Long id;
    private String name;
    // getters, setters, and constructors omitted for brevity
}

// Reactive repository
interface UserRepository extends ReactiveCrudRepository<User, Long> {}

// Service
@Service
public class UserService {
    @Autowired private UserRepository repo;

    public Mono<User> getUser(Long id) {
        return repo.findById(id);
    }

    public Flux<User> getAllUsers() {
        return repo.findAll();
    }
}
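
To complete the picture, a hedged sketch of a controller exposing this service (UserRestController and the /api/users path are illustrative, and the entity is assumed to have getters/setters for JSON serialization):

import org.springframework.web.bind.annotation.*;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

@RestController
@RequestMapping("/api/users")
public class UserRestController {

    private final UserService userService;

    public UserRestController(UserService userService) {
        this.userService = userService;
    }

    // Non-blocking end to end: no thread waits on the database call.
    @GetMapping("/{id}")
    public Mono<User> getUser(@PathVariable Long id) {
        return userService.getUser(id);
    }

    @GetMapping
    public Flux<User> getAllUsers() {
        return userService.getAllUsers();
    }
}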

πŸ”Ή When to Use Reactive Programming?

βœ… Use it when:

  • High concurrency apps (chat, streaming, IoT).
  • Services making many I/O calls (DB, REST APIs, messaging).
  • You need real-time streaming (Kafka, WebSockets).

❌ Avoid if:

  • App is mostly CPU-bound.
  • Simpler blocking style is sufficient.
  • Your team isn’t comfortable with functional-reactive style (steep learning curve).

πŸ”Ή Benefits

  • High throughput, fewer threads needed.
  • Better resource utilization under load.
  • Built-in backpressure handling.
  • Works well with Kafka, MongoDB (Reactive driver), Cassandra, R2DBC.

βœ… In short:
Reactive programming with Spring Boot (WebFlux + Project Reactor) lets you build non-blocking, event-driven, highly scalable microservices, perfect for modern cloud-native systems.



πŸ”Ή Why WebFlux + Kafka?

  • WebFlux β†’ non-blocking, reactive REST API (Project Reactor: Mono/Flux).
  • Kafka β†’ high-throughput, distributed event streaming platform.
  • Together: build services where API requests publish events to Kafka, and Kafka topics are exposed reactively to clients.

πŸ”Ή Key Tools in Spring Boot

  1. Spring WebFlux β†’ reactive REST APIs.
  2. Spring Kafka or Reactor Kafka β†’ Kafka producer/consumer in reactive style.
    1. spring-kafka = wrapper on Kafka client (not fully reactive).
    2. reactor-kafka = fully reactive integration with Project Reactor.

πŸ”Ή Example: Reactive Producer API (send to Kafka)

1. Dependencies (pom.xml)

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-webflux</artifactId>
</dependency>

<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
</dependency>

<dependency>
    <groupId>io.projectreactor.kafka</groupId>
    <artifactId>reactor-kafka</artifactId>
    <version>1.3.23</version>
</dependency>

2. Reactive Kafka Producer Config (Reactor Kafka)

import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import reactor.kafka.sender.KafkaSender;
import reactor.kafka.sender.SenderOptions;

import java.util.Map;

@Configuration
public class KafkaProducerConfig {

    @Bean
    public KafkaSender<String, String> reactiveKafkaSender() {
        Map<String, Object> props = Map.of(
            "bootstrap.servers", "localhost:9092",
            "key.serializer", StringSerializer.class,
            "value.serializer", StringSerializer.class,
            "acks", "all"
        );
        SenderOptions<String, String> senderOptions = SenderOptions.create(props);
        return KafkaSender.create(senderOptions);
    }
}

3. Reactive Controller Producing Events

import org.springframework.web.bind.annotation.*;
import reactor.core.publisher.Mono;
import reactor.kafka.sender.KafkaSender;
import reactor.kafka.sender.SenderRecord;

@RestController
@RequestMapping("/messages")
public class MessageController {

    private final KafkaSender<String, String> kafkaSender;

    public MessageController(KafkaSender<String, String> kafkaSender) {
        this.kafkaSender = kafkaSender;
    }

    @PostMapping
    public Mono<String> sendMessage(@RequestBody String message) {
        // SenderRecord.create(topic, partition, timestamp, key, value, correlationMetadata)
        return kafkaSender.send(Mono.just(
                SenderRecord.create("test-topic", null, null, null, message, null)))
            .next()
            .map(result -> "Message sent: " + message);
    }
}

πŸ‘‰ Calling POST /messages with a payload pushes the message into Kafka reactively.


πŸ”Ή Example: Reactive Kafka Consumer (WebFlux + reactor-kafka)

1. Consumer Config

import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import reactor.kafka.receiver.KafkaReceiver;
import reactor.kafka.receiver.ReceiverOptions;

import java.util.List;
import java.util.Map;

@Configuration
public class KafkaConsumerConfig {

    @Bean
    public KafkaReceiver<String, String> reactiveKafkaReceiver() {
        Map<String, Object> props = Map.of(
            "bootstrap.servers", "localhost:9092",
            "group.id", "webflux-consumer",
            "key.deserializer", StringDeserializer.class,
            "value.deserializer", StringDeserializer.class,
            "auto.offset.reset", "earliest"
        );

        ReceiverOptions<String, String> options =
            ReceiverOptions.<String, String>create(props)
                .subscription(List.of("test-topic"));

        return KafkaReceiver.create(options);
    }
}

2. Streaming Messages as a Flux Endpoint

import org.springframework.web.bind.annotation.*;
import reactor.core.publisher.Flux;
import reactor.kafka.receiver.KafkaReceiver;
import reactor.kafka.receiver.ReceiverRecord;

@RestController
@RequestMapping("/stream")
public class KafkaStreamController {

    private final KafkaReceiver<String, String> receiver;

    public KafkaStreamController(KafkaReceiver<String, String> receiver) {
        this.receiver = receiver;
    }

    @GetMapping(produces = "text/event-stream")
    public Flux<String> streamMessages() {
        return receiver.receive()
                .doOnNext(record -> record.receiverOffset().acknowledge()) // mark offset for commit
                .map(ReceiverRecord::value)   // extract message payload
                .doOnNext(val -> System.out.println("Consumed: " + val));
    }
}

πŸ‘‰ Hitting /stream gives you a Server-Sent Events (SSE) endpoint that streams Kafka messages reactively to the browser or client.


πŸ”Ή Flow Summary

  1. Client sends REST request β†’ WebFlux Controller β†’ publish to Kafka.
  2. Kafka consumer (reactor-kafka KafkaReceiver) consumes β†’ exposed as a Flux stream over SSE/WebSocket (see the WebSocket sketch below).
  3. Entire flow is non-blocking, backpressure-aware, and scalable.
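
The /stream controller above covers the SSE side; for the WebSocket variant mentioned in step 2, a hedged sketch could look like the following (KafkaWebSocketConfig and the /ws/messages path are illustrative names; it reuses the KafkaReceiver bean defined earlier):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.web.reactive.HandlerMapping;
import org.springframework.web.reactive.handler.SimpleUrlHandlerMapping;
import org.springframework.web.reactive.socket.WebSocketHandler;
import reactor.kafka.receiver.KafkaReceiver;

import java.util.Map;

@Configuration
public class KafkaWebSocketConfig {

    // Maps ws://host/ws/messages to a handler that forwards Kafka records as text frames.
    // Note: a KafkaReceiver flux supports one active subscription; if the SSE endpoint is
    // used at the same time, share() the flux or configure a dedicated receiver.
    @Bean
    public HandlerMapping kafkaWebSocketMapping(KafkaReceiver<String, String> receiver) {
        WebSocketHandler handler = session ->
                session.send(receiver.receive()
                        .map(record -> session.textMessage(record.value())));

        // Order -1 so this mapping is consulted before annotated controllers.
        // On recent Spring versions the required WebSocketHandlerAdapter is auto-registered;
        // on older ones, declare it as an additional bean.
        return new SimpleUrlHandlerMapping(Map.of("/ws/messages", handler), -1);
    }
}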

πŸ”Ή When to Use WebFlux + Kafka

βœ… Real-time dashboards (stock prices, IoT sensors, live monitoring).
βœ… Chat systems / notifications.
βœ… Event-driven microservices (billing, order tracking, fraud detection).
βœ… Streaming APIs to clients (web/mobile).


βœ… In short:
Spring WebFlux + Kafka (reactor-kafka) lets you build reactive, event-driven microservices where APIs can produce to Kafka and stream from Kafka in real time, with non-blocking backpressure support.

