Reactive Programming with Spring Boot
🔹 What is Reactive Programming?
- Reactive Programming is an asynchronous, non-blocking programming paradigm built around the idea of data streams and event-driven systems.
- Instead of executing tasks sequentially and blocking threads (traditional imperative style), reactive systems:
- React to events/data as they arrive.
- Use publish-subscribe models.
- Emphasize backpressure (control over fast producers and slow consumers).
👉 In short: It's about building resilient, scalable applications that can handle high concurrency with fewer resources.
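To make these ideas concrete, here is a minimal Project Reactor sketch (names and timings are illustrative, not part of any Spring API): a publisher emits events over time, the subscriber reacts to each one as it arrives, and limitRate applies backpressure by bounding how many items are requested at once.

import reactor.core.publisher.Flux;
import java.time.Duration;

public class ReactiveBasics {
    public static void main(String[] args) throws InterruptedException {
        // Publisher: emits an event every 100 ms without blocking the main thread.
        Flux.interval(Duration.ofMillis(100))
            .limitRate(10)                   // backpressure: request at most 10 items at a time upstream
            .map(i -> "event-" + i)          // react to each item as it arrives
            .subscribe(System.out::println); // subscribing starts the flow

        Thread.sleep(1000); // keep the demo JVM alive long enough to see some events
    }
}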
🔹 Why Reactive in Spring Boot?
Traditional Spring MVC (Servlet-based):
- Each request blocks a thread until the response is ready.
- Good for I/O-light apps, but struggles at scale when waiting on DB, API calls, etc.
Reactive Spring (WebFlux-based):
- Built on Project Reactor (implementation of Reactive Streams).
- Uses event-loop model instead of one-thread-per-request.
- Non-blocking I/O → better resource utilization.
- Great when your app makes lots of remote calls (DB, APIs, message brokers).
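As a rough sketch of what a non-blocking remote call looks like in practice, a WebClient call hands back a Mono immediately instead of parking the request thread. The base URL, path, and class names below are hypothetical; WebClient ships with the WebFlux starter shown later.

import org.springframework.web.reactive.function.client.WebClient;
import reactor.core.publisher.Mono;

public class RemoteCallExample {

    // Hypothetical downstream service address.
    private final WebClient webClient = WebClient.create("http://localhost:8081");

    public Mono<String> fetchProfile(String id) {
        // Returns immediately; the HTTP exchange runs on the event loop,
        // so no thread sits blocked while waiting for the remote response.
        return webClient.get()
                .uri("/profiles/{id}", id)
                .retrieve()
                .bodyToMono(String.class);
    }
}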
🔹 Key Components in Spring Reactive
- Spring WebFlux
  - Alternative to Spring MVC for reactive applications.
  - Uses Netty (by default) instead of Tomcat/Jetty.
- Project Reactor (core reactive library in Spring)
  - Mono → 0 or 1 result (like Optional/CompletableFuture).
  - Flux → 0…N results (like Stream).
- Reactive Streams API
  - Interfaces: Publisher, Subscriber, Subscription, Processor.
  - Defines how publishers and subscribers communicate asynchronously.
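A tiny sketch of the two Reactor types side by side (the values are arbitrary):

import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

public class MonoFluxDemo {
    public static void main(String[] args) {
        // Mono: at most one value.
        Mono.just("hello")
            .map(String::toUpperCase)
            .subscribe(System.out::println); // prints HELLO

        // Flux: zero or more values, processed as they are emitted.
        Flux.just(1, 2, 3)
            .filter(n -> n % 2 == 1)
            .map(n -> n * 10)
            .subscribe(System.out::println); // prints 10, then 30
    }
}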
🔹 Example: Reactive REST API with Spring Boot
1. Add Dependency (pom.xml)
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-webflux</artifactId>
</dependency>
2. Reactive Controller
import org.springframework.http.MediaType;
import org.springframework.web.bind.annotation.*;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import java.time.Duration;

@RestController
@RequestMapping("/users")
public class UserController {

    // Return a single user (Mono)
    @GetMapping("/{id}")
    public Mono<String> getUser(@PathVariable String id) {
        return Mono.just("User: " + id);
    }

    // Return multiple users (Flux), streamed as Server-Sent Events
    @GetMapping(produces = MediaType.TEXT_EVENT_STREAM_VALUE)
    public Flux<String> getAllUsers() {
        return Flux.just("Alice", "Bob", "Charlie")
                .delayElements(Duration.ofSeconds(1)); // simulate streaming, one element per second
    }
}
👉 /users → streams users one per second (non-blocking).
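A quick way to verify the Mono endpoint is a WebFluxTest slice with WebTestClient. This is a hedged sketch: it assumes the standard spring-boot-starter-test is on the test classpath, and the test class and method names are illustrative.

import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.web.reactive.WebFluxTest;
import org.springframework.test.web.reactive.server.WebTestClient;

@WebFluxTest(UserController.class)
class UserControllerTest {

    @Autowired
    private WebTestClient webTestClient; // non-blocking test client for WebFlux endpoints

    @Test
    void returnsSingleUser() {
        webTestClient.get().uri("/users/42")
                .exchange()
                .expectStatus().isOk()
                .expectBody(String.class)
                .isEqualTo("User: 42");
    }
}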
🔹 Reactive Database Example
Using Spring Data R2DBC (Reactive DB access):
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.annotation.Id;
import org.springframework.data.relational.core.mapping.Table;
import org.springframework.data.repository.reactive.ReactiveCrudRepository;
import org.springframework.stereotype.Service;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

@Table("users")
class User {
    @Id
    private Long id;
    private String name;
    // getters/setters omitted for brevity
}

// Reactive repository: every query returns Mono/Flux instead of blocking results
interface UserRepository extends ReactiveCrudRepository<User, Long> {}

// Service
@Service
public class UserService {

    @Autowired
    private UserRepository repo;

    public Mono<User> getUser(Long id) {
        return repo.findById(id);
    }

    public Flux<User> getAllUsers() {
        return repo.findAll();
    }
}
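This assumes spring-boot-starter-data-r2dbc and a reactive driver (for example H2 or PostgreSQL) are on the classpath. A hypothetical controller wiring the service into WebFlux endpoints might look like this (class name and paths are illustrative):

import org.springframework.web.bind.annotation.*;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

@RestController
@RequestMapping("/api/users")
class UserApiController {

    private final UserService service;

    UserApiController(UserService service) {
        this.service = service;
    }

    @GetMapping("/{id}")
    Mono<User> byId(@PathVariable Long id) {
        return service.getUser(id); // the row is emitted asynchronously when the DB responds
    }

    @GetMapping
    Flux<User> all() {
        return service.getAllUsers();
    }
}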
🔹 When to Use Reactive Programming?
✅ Use it when:
- High concurrency apps (chat, streaming, IoT).
- Services making many I/O calls (DB, REST APIs, messaging).
- You need real-time streaming (Kafka, WebSockets).
❌ Avoid if:
- App is mostly CPU-bound.
- Simpler blocking style is sufficient.
- Your team isn't comfortable with functional-reactive style (steep learning curve).
🔹 Benefits
- High throughput, fewer threads needed.
- Better resource utilization under load.
- Built-in backpressure handling.
- Works well with Kafka, MongoDB (Reactive driver), Cassandra, R2DBC.
✅ In short:
Reactive programming with Spring Boot (WebFlux + Project Reactor) lets you build non-blocking, event-driven, highly scalable microservices, perfect for modern cloud-native systems.
🔹 Why WebFlux + Kafka?
- WebFlux → non-blocking, reactive REST API (Project Reactor: Mono/Flux).
- Kafka → high-throughput, distributed event streaming platform.
- Together: build services where API requests trigger events to Kafka and Kafka streams are exposed reactively to clients.
🔹 Key Tools in Spring Boot
- Spring WebFlux → reactive REST APIs.
- Spring Kafka or Reactor Kafka → Kafka producer/consumer in reactive style (contrasted in the sketch below).
  - spring-kafka = wrapper over the Kafka client (not fully reactive).
  - reactor-kafka = fully reactive integration with Project Reactor.
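To make that distinction concrete, here is a rough side-by-side: spring-kafka's KafkaTemplate hands back a future, while reactor-kafka's KafkaSender returns a reactive pipeline. The topic name is arbitrary and bean wiring is omitted; treat it as a sketch, not a complete configuration.

import org.springframework.kafka.core.KafkaTemplate;
import reactor.core.publisher.Mono;
import reactor.kafka.sender.KafkaSender;
import reactor.kafka.sender.SenderRecord;

public class SendComparison {

    // spring-kafka: a thin wrapper over the plain Kafka producer; the result is a future, not a Publisher.
    void sendWithSpringKafka(KafkaTemplate<String, String> template) {
        template.send("demo-topic", "hello"); // fire-and-forget, or inspect the returned future
    }

    // reactor-kafka: sending is itself a reactive, backpressure-aware pipeline.
    void sendWithReactorKafka(KafkaSender<String, String> sender) {
        sender.send(Mono.just(SenderRecord.create("demo-topic", null, null, null, "hello", null)))
              .subscribe(); // nothing is sent until the pipeline is subscribed
    }
}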
🔹 Example: Reactive Producer API (send to Kafka)
1. Dependencies (pom.xml)
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-webflux</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.kafka</groupId>
<artifactId>spring-kafka</artifactId>
</dependency>
<dependency>
<groupId>io.projectreactor.kafka</groupId>
<artifactId>reactor-kafka</artifactId>
<version>1.3.23</version>
</dependency>
2. Reactive Kafka Producer Config (Reactor Kafka)
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import reactor.kafka.sender.KafkaSender;
import reactor.kafka.sender.SenderOptions;
import java.util.Map;

@Configuration
public class KafkaProducerConfig {

    @Bean
    public KafkaSender<String, String> reactiveKafkaSender() {
        Map<String, Object> props = Map.of(
                "bootstrap.servers", "localhost:9092",
                "key.serializer", StringSerializer.class,
                "value.serializer", StringSerializer.class,
                "acks", "all"
        );
        SenderOptions<String, String> senderOptions = SenderOptions.create(props);
        return KafkaSender.create(senderOptions);
    }
}
3. Reactive Controller Producing Events
import org.springframework.web.bind.annotation.*;
import reactor.core.publisher.Mono;
import reactor.kafka.sender.KafkaSender;
import reactor.kafka.sender.SenderRecord;

@RestController
@RequestMapping("/messages")
public class MessageController {

    private final KafkaSender<String, String> kafkaSender;

    public MessageController(KafkaSender<String, String> kafkaSender) {
        this.kafkaSender = kafkaSender;
    }

    @PostMapping
    public Mono<String> sendMessage(@RequestBody String message) {
        // SenderRecord.create(topic, partition, timestamp, key, value, correlationMetadata)
        return kafkaSender.send(Mono.just(
                        SenderRecord.create("test-topic", null, null, null, message, null)))
                .next() // send() returns a Flux of results; we expect exactly one here
                .map(result -> "Message sent: " + message);
    }
}
👉 Calling POST /messages with a payload pushes the message into Kafka reactively.
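One way to exercise the endpoint from another reactive service is a WebClient call; the host and port below are assumptions for a locally running instance.

import org.springframework.web.reactive.function.client.WebClient;
import reactor.core.publisher.Mono;

public class MessageClient {
    public static void main(String[] args) {
        WebClient client = WebClient.create("http://localhost:8080"); // assumes the producer API runs locally

        Mono<String> response = client.post()
                .uri("/messages")
                .bodyValue("hello-kafka")
                .retrieve()
                .bodyToMono(String.class);

        // block() is only for this standalone demo; inside a reactive pipeline you would compose instead.
        System.out.println(response.block());
    }
}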
🔹 Example: Reactive Kafka Consumer (WebFlux + Kafka Stream)
1. Consumer Config
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import reactor.kafka.receiver.KafkaReceiver;
import reactor.kafka.receiver.ReceiverOptions;
import java.util.List;
import java.util.Map;

@Configuration
public class KafkaConsumerConfig {

    @Bean
    public KafkaReceiver<String, String> reactiveKafkaReceiver() {
        Map<String, Object> props = Map.of(
                "bootstrap.servers", "localhost:9092",
                "group.id", "webflux-consumer",
                "key.deserializer", StringDeserializer.class,
                "value.deserializer", StringDeserializer.class,
                "auto.offset.reset", "earliest"
        );
        ReceiverOptions<String, String> options =
                ReceiverOptions.<String, String>create(props)
                        .subscription(List.of("test-topic"));
        return KafkaReceiver.create(options);
    }
}
2. Streaming Messages as a Flux Endpoint
import org.springframework.web.bind.annotation.*;
import reactor.core.publisher.Flux;
import reactor.kafka.receiver.KafkaReceiver;
import reactor.kafka.receiver.ReceiverRecord;

@RestController
@RequestMapping("/stream")
public class KafkaStreamController {

    private final KafkaReceiver<String, String> receiver;

    public KafkaStreamController(KafkaReceiver<String, String> receiver) {
        this.receiver = receiver;
    }

    @GetMapping(produces = "text/event-stream")
    public Flux<String> streamMessages() {
        return receiver.receive()
                .doOnNext(record -> record.receiverOffset().acknowledge()) // mark the offset so it can be committed
                .map(ReceiverRecord::value)                                // extract message payload
                .doOnNext(val -> System.out.println("Consumed: " + val));
    }
}
👉 Hitting /stream gives you a Server-Sent Events (SSE) endpoint that streams Kafka messages reactively to the browser or client.
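On the consuming side, a reactive client can subscribe to this SSE stream with WebClient; again, the host and port are assumptions for a local run.

import org.springframework.http.MediaType;
import org.springframework.web.reactive.function.client.WebClient;
import reactor.core.publisher.Flux;

public class StreamClient {
    public static void main(String[] args) throws InterruptedException {
        WebClient client = WebClient.create("http://localhost:8080"); // assumes the streaming API runs locally

        Flux<String> events = client.get()
                .uri("/stream")
                .accept(MediaType.TEXT_EVENT_STREAM)
                .retrieve()
                .bodyToFlux(String.class); // each SSE data line arrives as one Flux element

        events.subscribe(msg -> System.out.println("Received: " + msg));

        Thread.sleep(10_000); // keep the demo alive long enough to receive events
    }
}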
🔹 Flow Summary
- Client sends REST request → WebFlux Controller → publish to Kafka.
- Kafka consumer (reactor-kafka) consumes → exposed as a Flux stream over SSE/WebSocket.
- Entire flow is non-blocking, backpressure-aware, and scalable.
🔹 When to Use WebFlux + Kafka
✅ Real-time dashboards (stock prices, IoT sensors, live monitoring).
✅ Chat systems / notifications.
✅ Event-driven microservices (billing, order tracking, fraud detection).
✅ Streaming APIs to clients (web/mobile).
✅ In short:
Spring WebFlux + Kafka (reactor-kafka) lets you build reactive, event-driven microservices where APIs can produce to Kafka and stream from Kafka in real time, with non-blocking backpressure support.