How do you test microservices in a Spring Boot architecture?

1) Test strategy (pyramid + contracts)

  • Unit tests (broad base of the pyramid): pure functions/classes. Fast, deterministic.
  • Component/Service tests: service + real framework (Spring), fake neighbors.
  • Contract tests: consumers & providers agree on HTTP/gRPC/message schemas/behaviors.
  • Integration tests (selective): service + real infra via Testcontainers.
  • End‑to‑End (few): user journeys across multiple services; smoke + happy paths only.
  • Non‑functionals: performance, resilience/chaos, security, data quality, migrations.

Rule of thumb: Lots of unit + component, contracts for edges, a handful of E2E.


2) What to test at each layer (Spring Boot focus)

A) Unit (JUnit 5, Mockito, AssertJ)

  • Domain logic, mappers, validators, policies.

@ExtendWith(MockitoExtension.class)
class PriceServiceTest {
  @Test void appliesDiscount() { /* pure logic, no Spring */ }
}

B) Web/API slice (MockMvc or WebTestClient)

  • Controllers, filters, error handling; no network.

@WebMvcTest(OrdersController.class)
class OrdersControllerTest {
  @Autowired MockMvc mvc;
  @Test void returns201() throws Exception {
    mvc.perform(post("/orders")
           .contentType(MediaType.APPLICATION_JSON)
           .content("{...}"))
       .andExpect(status().isCreated());
  }
}
}

C) Contract tests (Pact or Spring Cloud Contract)

  • Consumer‑driven: prevent breaking changes on APIs/events.
  • Produce JSON contracts; provider verifies them in CI.
// Pact consumer snippet (Pact JVM DSL; `builder` is the PactDslWithProvider)
builder.uponReceiving("create order")
       .path("/orders").method("POST")
       .willRespondWith()
       .status(201)
       .headers(Map.of("Content-Type", "application/json"));
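
On the provider side, verification is a small JUnit 5 test. A sketch, assuming Pact JVM's JUnit 5 support and a hypothetical provider name "orders-service":

// imports from au.com.dius.pact.provider.junit5 and ...junitsupport
@Provider("orders-service")   // must match the name the consumer pact targets
@PactFolder("pacts")          // or @PactBroker(...) in CI
class OrdersProviderPactTest {
  @BeforeEach
  void setTarget(PactVerificationContext context) {
    context.setTarget(new HttpTestTarget("localhost", 8080)); // port is an assumption
  }

  @TestTemplate
  @ExtendWith(PactVerificationInvocationContextProvider.class)
  void verify(PactVerificationContext context) {
    context.verifyInteraction();
  }
}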

D) Integration with real dependencies (Testcontainers)

  • Spin up Postgres/Kafka/Redis per test; verify repositories, message handlers, migrations.

@Testcontainers class OrderRepoIT {
  @Container static PostgreSQLContainer<?> db = new PostgreSQLContainer<>("postgres:16");
  /* @SpringBootTest; datasource wired from db.getJdbcUrl() — see below */
}
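
One way to hand the container's coordinates to Spring is @DynamicPropertySource — a sketch assuming the default datasource properties:

@DynamicPropertySource
static void datasourceProps(DynamicPropertyRegistry registry) {
  registry.add("spring.datasource.url", db::getJdbcUrl);
  registry.add("spring.datasource.username", db::getUsername);
  registry.add("spring.datasource.password", db::getPassword);
}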

E) Messaging & Sagas

  • Publish/consume with Testcontainers Kafka/RabbitMQ.
  • Assert idempotency, outbox emission, compensations, timeouts.
// Use Awaitility to wait for async effects
await().atMost(5, SECONDS).until(() -> repo.find(orderId).status() == CONFIRMED);
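
For idempotency, one sketch is to deliver the same event twice and expect exactly one side effect (kafkaTemplate and countByOrderId are placeholders for your own producer and repository):

kafkaTemplate.send("orders", orderId, payload);
kafkaTemplate.send("orders", orderId, payload); // simulated duplicate delivery
await().atMost(5, SECONDS)
       .until(() -> repo.countByOrderId(orderId) == 1); // consumer deduplicated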

F) End‑to‑End (few tests, prod‑like env)

  • Deploy a small slice (API Gateway + 2–3 services + DBs).
  • Run smoke: create order → pay → ship; assert telemetry & traces exist.
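
A smoke test can simply chain the journey's HTTP calls. A sketch with RestAssured, assuming a staging base URI and these endpoint shapes:

// static imports from io.restassured.RestAssured; stagingUrl/orderJson defined elsewhere
String orderId = given().baseUri(stagingUrl).contentType(ContentType.JSON)
    .body(orderJson)
    .post("/orders")
    .then().statusCode(201)
    .extract().path("id");

given().baseUri(stagingUrl)
    .post("/orders/{id}/pay", orderId)
    .then().statusCode(200);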

3) Non‑functional testing

Performance (k6/Gatling/JMeter)

  • Define SLIs/SLOs (p95 latency, error rate).
  • Test steady‑state + spike + soak; include DB and cache warmup.
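
A steady-state sketch using Gatling's Java DSL (base URL, payload, and the 50 users/s target are illustrative assumptions):

import static io.gatling.javaapi.core.CoreDsl.*;
import static io.gatling.javaapi.http.HttpDsl.*;
import io.gatling.javaapi.core.*;
import io.gatling.javaapi.http.*;
import java.time.Duration;

public class OrdersSimulation extends Simulation {
  HttpProtocolBuilder protocol = http.baseUrl("https://staging.example.com");

  ScenarioBuilder scn = scenario("create-order")
      .exec(http("POST /orders").post("/orders")
          .body(StringBody("{\"sku\":\"demo\"}"))
          .check(status().is(201)));

  public OrdersSimulation() {
    setUp(scn.injectOpen(constantUsersPerSec(50).during(Duration.ofMinutes(5))))
        .protocols(protocol);
  }
}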

Resilience/Chaos (Toxiproxy, Chaos Mesh, Gremlin)

  • Inject latency, packet loss, dependency outages.
  • Verify timeouts, retries, circuit breakers, and graceful degradation.
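
For example, latency injection through Toxiproxy started from Testcontainers (image tag, network, and port numbers are assumptions):

// ToxiproxyContainer from Testcontainers; ToxiproxyClient/Proxy from toxiproxy-java
ToxiproxyContainer toxiproxy =
    new ToxiproxyContainer("ghcr.io/shopify/toxiproxy:2.5.0")
        .withNetwork(network); // network: a shared Testcontainers Network
toxiproxy.start();

ToxiproxyClient client = new ToxiproxyClient(toxiproxy.getHost(), toxiproxy.getControlPort());
Proxy proxy = client.createProxy("postgres", "0.0.0.0:8666", "postgres:5432");
proxy.toxics().latency("latency", ToxicDirection.DOWNSTREAM, 1500); // 1.5 s downstream
// now assert timeouts, retries, and circuit breakers actually fire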

Security (OWASP ZAP, dependency/container scans)

  • DAST against staging; assert headers/CORS/auth flows.
  • SAST (SpotBugs), SCA (OWASP Dependency-Check/Snyk), image scans (Trivy).

Data & migrations

  • Flyway/Liquibase migration tests with real DB, rollback paths.
  • Backward/forward compatibility: expand‑contract schema, blue/green DB.
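
A migration test sketch — run Flyway against a throwaway Postgres and assert the history applies cleanly:

@Testcontainers
class MigrationIT {
  @Container static PostgreSQLContainer<?> db = new PostgreSQLContainer<>("postgres:16");

  @Test void migratesOnCleanSchema() {
    var result = Flyway.configure()
        .dataSource(db.getJdbcUrl(), db.getUsername(), db.getPassword())
        .load()
        .migrate();
    assertThat(result.success).isTrue();
  }
}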

Observability tests

  • Assert logs include traceId, metrics exposed, health checks correct.
  • Synthetic checks after deploy verifying key business metrics/events.
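
A sketch of a health/metrics assertion, assuming Spring Boot Actuator is on the classpath:

@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
class ObservabilityIT {
  @Autowired TestRestTemplate rest;

  @Test void healthAndMetricsAreExposed() {
    assertThat(rest.getForObject("/actuator/health", String.class)).contains("UP");
    assertThat(rest.getForEntity("/actuator/metrics", String.class)
        .getStatusCode().is2xxSuccessful()).isTrue();
  }
}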

4) Test data & determinism

  • Use builders/factories; keep fixtures minimal and explicit.
  • Clock abstraction (freeze time; see the sketch after this list) and UUID providers for stable assertions.
  • Seed data via SQL scripts or API seeds per test run; isolate with schemas.
  • Clean per test using transactions or container re‑creation.
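
The clock abstraction above is just java.time.Clock injected instead of called statically (the Clock-taking OrderService constructor is hypothetical):

Clock fixed = Clock.fixed(Instant.parse("2024-01-01T00:00:00Z"), ZoneOffset.UTC);
OrderService service = new OrderService(fixed); // hypothetical constructor taking Clock

// every Instant.now(clock) inside the service now returns the frozen instant
assertThat(service.placeOrder().placedAt())
    .isEqualTo(Instant.parse("2024-01-01T00:00:00Z"));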

5) CI/CD wiring

  • On PR: unit + component + contract (consumer) + fast integrations.
  • Nightly: fuller integration/perf smoke.
  • Pre‑prod: provider contract verification, E2E smoke, security scan.
  • Post‑deploy: synthetic checks + canary analysis; auto‑rollback on SLO breach.

6) Tooling cheat‑sheet (Java/Spring)

  • Unit/Component: JUnit 5, Mockito, AssertJ, Spring Boot Test, ArchUnit (architectural rules; example after this list), PIT (mutation testing).
  • Contracts: Pact JVM / Spring Cloud Contract.
  • Integration: Testcontainers (Postgres, Kafka, Redis, LocalStack).
  • API tests: RestAssured / WebTestClient.
  • Perf: Gatling/k6.
  • Chaos: Toxiproxy, Chaos Mesh (K8s).
  • Security: OWASP ZAP, Trivy, Snyk.
  • Observability: OpenTelemetry SDK + assertions on emitted spans/metrics.
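
As an example of the ArchUnit rules above — keep controllers off repositories (package names are assumptions):

// static import com.tngtech.archunit.lang.syntax.ArchRuleDefinition.noClasses
@AnalyzeClasses(packages = "com.example.orders")
class ArchitectureTest {
  @ArchTest
  static final ArchRule controllersDontTouchRepositories =
      noClasses().that().resideInAPackage("..controller..")
          .should().dependOnClassesThat().resideInAPackage("..repository..");
}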

7) Example: minimal pipeline stages

  1. build → unit/component/contract (consumer).
  2. verify‑provider → run provider against stored contracts.
  3. integration → Testcontainers suite (DB + Kafka).
  4. package & scan → jar/image scan.
  5. deploy‑staging → run E2E smoke + perf smoke + ZAP baseline.
  6. prod canary → 10% traffic + SLO guardrails → full rollout.

Anti‑patterns (avoid)

  • 100% reliance on brittle E2E tests.
  • Shared test env with mutable data across pipelines.
  • No contract tests → frequent breaking changes.
  • Tests that hit real third‑party APIs (use WireMock; see the stub sketch below).
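
A minimal WireMock stub so tests never leave the process boundary (endpoint and payload are assumptions):

// static imports from the WireMock client DSL and WireMockConfiguration.options
WireMockServer wm = new WireMockServer(options().dynamicPort());
wm.start();
wm.stubFor(post(urlEqualTo("/payments"))
    .willReturn(aResponse()
        .withStatus(201)
        .withHeader("Content-Type", "application/json")
        .withBody("{\"status\":\"AUTHORIZED\"}")));
// point the service's payment-provider base URL at wm.baseUrl()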