Introduction

Welcome to Chapter 14! In this practical project, we’re going to roll up our sleeves and apply everything we’ve learned about Testcontainers to a real-world scenario: building and testing a Java Spring Boot microservice.

Microservices often rely on external dependencies like databases, message brokers, and other services. Testing these interactions is crucial but can be challenging. We want our tests to be realistic, fast, and isolated. This is precisely where Testcontainers shines!

By the end of this chapter, you’ll have a solid understanding of how to:

  • Set up a basic Spring Boot application with database and message broker dependencies.
  • Integrate Testcontainers with JUnit 5 to spin up disposable PostgreSQL and Kafka instances for your tests.
  • Write robust integration tests that ensure your microservice interacts correctly with its external systems.
  • Gain confidence in your microservice’s behavior by testing it in an environment that closely mirrors production.

Ready to build something awesome? Let’s dive in!

Core Concepts: Microservice Integration Testing with Testcontainers

When developing microservices, we aim for loose coupling and independent deployability. However, this doesn’t mean they exist in a vacuum. Each microservice often depends on other components – a database for persistence, a message broker for asynchronous communication, or even other microservices via API calls.

How do you effectively test these interactions?

  • Unit Tests: Great for isolated logic, but don’t verify integration.
  • Mocks/Fakes: Useful for isolating dependencies in unit tests, but they simulate behavior, which might not reflect real-world issues. Did you mock the database behavior exactly right? What if the real database has a subtle difference?
  • In-Memory Databases (e.g., H2): Faster than real databases, but can hide dialect-specific SQL issues or feature differences (e.g., specific JSON functions in PostgreSQL).
  • Testcontainers: This is where Testcontainers becomes invaluable for integration tests. Instead of faking or simulating, it allows you to spin up real instances of your dependencies in Docker containers. This provides:
    • Realism: You’re testing against the exact same database or message broker that will run in production.
    • Isolation: Each test run (or even each test method) can get its own fresh, clean container instance, preventing test pollution and ensuring repeatable results.
    • Speed: Containers are lightweight and start quickly, especially compared to provisioning full VMs.
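Conceptually, each container Testcontainers manages behaves like an AutoCloseable resource: started before your test uses it, reliably stopped afterwards. Here is a toy stdlib sketch of that lifecycle; FakeContainer is a made-up class for illustration only, not a Testcontainers API:

```java
// FakeContainer is a made-up stand-in illustrating the start/stop lifecycle
// that Testcontainers automates around real Docker containers.
class FakeContainer implements AutoCloseable {
    private boolean running = false;

    public void start() { running = true; }              // analogous to: docker run ...
    @Override public void close() { running = false; }   // analogous to: docker rm -f ...
    public boolean isRunning() { return running; }

    public static void main(String[] args) {
        // try-with-resources guarantees cleanup, much like @Testcontainers
        // guarantees containers are stopped when the test class finishes.
        try (FakeContainer db = new FakeContainer()) {
            db.start();
            System.out.println("running inside test: " + db.isRunning());
        }
    }
}
```

Real Testcontainers classes such as GenericContainer follow this same pattern, which is why a forgotten container never outlives your test run.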

The Microservice Architecture We’ll Test

For this project, we’ll imagine a simple Product microservice that:

  1. Stores product information in a PostgreSQL database.
  2. Publishes product events (e.g., “product created”) to a Kafka message broker.

Our integration tests will verify that the microservice correctly interacts with both PostgreSQL and Kafka.

Testcontainers for Spring Boot Integration

Spring Boot’s testing framework, especially with JUnit 5, plays very nicely with Testcontainers. You’ll see how we use annotations like @Testcontainers and @Container to manage container lifecycles, and how Spring Boot’s test context can automatically pick up dynamically provided connection details from our running containers.

Let’s get our hands dirty and start building!

Step-by-Step Implementation

We’ll start by setting up a basic Spring Boot project and then incrementally add Testcontainers for our integration tests.

Step 1: Project Setup (Spring Boot)

First, let’s create our Spring Boot project. We’ll use Maven for dependency management.

  1. Generate the Project: Go to the Spring Initializr website at start.spring.io and fill in the following:

    • Project: Maven Project
    • Language: Java
    • Spring Boot: Choose the latest stable version (e.g., 3.2.3).
    • Group: com.example
    • Artifact: product-service
    • Name: product-service
    • Package Name: com.example.productservice
    • Java: 17 or higher (latest LTS)
    • Dependencies: Add Spring Web, Spring Data JPA, PostgreSQL Driver, Spring for Apache Kafka.

    Click “Generate” and download the ZIP file. Unzip it into a directory of your choice.

  2. Inspect pom.xml: Open the pom.xml file. You should see entries for the dependencies we selected. Notice the spring-boot-starter-test dependency, which includes JUnit 5 and other testing utilities.

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- product-service/pom.xml (snippet) -->
    <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
        <modelVersion>4.0.0</modelVersion>
        <parent>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-parent</artifactId>
            <version>3.2.3</version> <!-- Adjust to your generated version -->
            <relativePath/> <!-- lookup parent from repository -->
        </parent>
        <groupId>com.example</groupId>
        <artifactId>product-service</artifactId>
        <version>0.0.1-SNAPSHOT</version>
        <name>product-service</name>
        <description>Demo project for Spring Boot and Testcontainers</description>
        <properties>
            <java.version>17</java.version>
        </properties>
        <dependencies>
            <dependency>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-starter-web</artifactId>
            </dependency>
            <dependency>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-starter-data-jpa</artifactId>
            </dependency>
            <dependency>
                <groupId>org.postgresql</groupId>
                <artifactId>postgresql</artifactId>
                <scope>runtime</scope>
            </dependency>
            <dependency>
                <groupId>org.springframework.kafka</groupId>
                <artifactId>spring-kafka</artifactId>
            </dependency>
    
            <dependency>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-starter-test</artifactId>
                <scope>test</scope>
            </dependency>
            <dependency>
                <groupId>org.springframework.kafka</groupId>
                <artifactId>spring-kafka-test</artifactId>
                <scope>test</scope>
            </dependency>
        </dependencies>
    
        <build>
            <plugins>
                <plugin>
                    <groupId>org.springframework.boot</groupId>
                    <artifactId>spring-boot-maven-plugin</artifactId>
                </plugin>
            </plugins>
        </build>
    
    </project>
    

    This pom.xml sets up our basic Spring Boot application. We have dependencies for web capabilities, JPA for database interaction, PostgreSQL for the actual database driver, and Spring Kafka for message handling. Crucially, spring-boot-starter-test brings in JUnit 5, Mockito, and other testing tools, and spring-kafka-test adds utilities for testing Kafka interactions.

  3. Create Domain and Repository: Let’s define a simple Product entity and a JPA repository to interact with it.

    Create src/main/java/com/example/productservice/domain/Product.java:

    package com.example.productservice.domain;
    
    import jakarta.persistence.Entity;
    import jakarta.persistence.GeneratedValue;
    import jakarta.persistence.GenerationType;
    import jakarta.persistence.Id;
    import java.math.BigDecimal;
    import java.util.Objects;
    
    @Entity
    public class Product {
    
        @Id
        @GeneratedValue(strategy = GenerationType.IDENTITY)
        private Long id;
        private String name;
        private String description;
        private BigDecimal price;
    
        // Default constructor for JPA
        protected Product() {
        }
    
        public Product(String name, String description, BigDecimal price) {
            this.name = name;
            this.description = description;
            this.price = price;
        }
    
        // Getters and Setters
        public Long getId() {
            return id;
        }
    
        public void setId(Long id) {
            this.id = id;
        }
    
        public String getName() {
            return name;
        }
    
        public void setName(String name) {
            this.name = name;
        }
    
        public String getDescription() {
            return description;
        }
    
        public void setDescription(String description) {
            this.description = description;
        }
    
        public BigDecimal getPrice() {
            return price;
        }
    
        public void setPrice(BigDecimal price) {
            this.price = price;
        }
    
        @Override
        public boolean equals(Object o) {
            if (this == o) return true;
            if (o == null || getClass() != o.getClass()) return false;
            Product product = (Product) o;
            return Objects.equals(id, product.id);
        }
    
        @Override
        public int hashCode() {
            return Objects.hash(id);
        }
    
        @Override
        public String toString() {
            return "Product{" +
                   "id=" + id +
                   ", name='" + name + '\'' +
                   ", description='" + description + '\'' +
                   ", price=" + price +
                   '}';
        }
    }
    

    This is a standard JPA entity. Notice the @Entity and @Id annotations.
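    One design choice worth calling out: equals and hashCode are based only on id, so two Product instances represent the same row whenever their ids match, regardless of other field values. A stripped-down stdlib illustration (Identified is a made-up class, not the entity itself):

```java
import java.util.Objects;

// Made-up minimal class mirroring the entity's id-only equality.
class Identified {
    final Long id;
    final String name;

    Identified(Long id, String name) { this.id = id; this.name = name; }

    @Override public boolean equals(Object o) {
        // Identity is the database id, not the other field values
        return o instanceof Identified other && Objects.equals(id, other.id);
    }

    @Override public int hashCode() { return Objects.hash(id); }

    public static void main(String[] args) {
        // Same id, different names: still "the same" entity
        System.out.println(new Identified(1L, "A").equals(new Identified(1L, "B")));
    }
}
```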

    Next, create src/main/java/com/example/productservice/repository/ProductRepository.java:

    package com.example.productservice.repository;
    
    import com.example.productservice.domain.Product;
    import org.springframework.data.jpa.repository.JpaRepository;
    import org.springframework.stereotype.Repository;
    
    @Repository
    public interface ProductRepository extends JpaRepository<Product, Long> {
    }
    

    A simple Spring Data JPA repository. Spring will automatically provide implementations for common CRUD operations.

    Now, let’s configure JPA. Open src/main/resources/application.properties and add the following. The H2 settings are left commented out as an option for quick local development (you would also need the H2 dependency on the classpath to use them); our tests will get a real PostgreSQL database from Testcontainers.

    # application.properties
    spring.jpa.hibernate.ddl-auto=update
    spring.jpa.show-sql=true
    spring.jpa.properties.hibernate.dialect=org.hibernate.dialect.PostgreSQLDialect
    
    # H2 In-memory database settings (for quick local dev, not for tests with Testcontainers)
    # spring.datasource.url=jdbc:h2:mem:testdb
    # spring.datasource.driver-class-name=org.h2.Driver
    # spring.datasource.username=sa
    # spring.datasource.password=
    

    We’ve set ddl-auto to update for development, which creates and updates tables automatically. The explicit PostgreSQLDialect ensures Hibernate generates PostgreSQL-flavored SQL (recent Hibernate 6 versions can also detect the dialect from the connection). The H2 details stay commented out because our tests will use a real PostgreSQL database via Testcontainers.

Step 2: Adding Testcontainers Dependencies

Now let’s add the Testcontainers library to our project.

Edit pom.xml and add the following dependencies within the <dependencies> section, typically grouped with spring-boot-starter-test:

        <!-- product-service/pom.xml (snippet) -->
        <dependency>
            <groupId>org.testcontainers</groupId>
            <artifactId>junit-jupiter</artifactId>
            <version>1.19.4</version> <!-- As of 2026-02-14, this is a recent stable version -->
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.testcontainers</groupId>
            <artifactId>postgresql</artifactId>
            <version>1.19.4</version> <!-- Must match junit-jupiter version -->
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.testcontainers</groupId>
            <artifactId>kafka</artifactId>
            <version>1.19.4</version> <!-- Must match junit-jupiter version -->
            <scope>test</scope>
        </dependency>
        <!-- Spring Boot 3.1+ provides @ServiceConnection for easier Testcontainers integration -->
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-testcontainers</artifactId>
            <scope>test</scope>
        </dependency>

Explanation:

  • junit-jupiter: This is the core Testcontainers integration for JUnit 5. It provides annotations and utilities to manage container lifecycles within your tests.
  • postgresql: This module provides a specialized PostgreSQLContainer class, which simplifies setting up a PostgreSQL instance.
  • kafka: Similarly, this module provides KafkaContainer for a Kafka message broker.
  • spring-boot-testcontainers: New in Spring Boot 3.1, this dependency makes connecting your Spring Boot application to Testcontainers even easier: annotate a container field with @ServiceConnection and Spring Boot derives the connection properties from the running container automatically. We’ll leverage this modern approach.

The version 1.19.4 for Testcontainers is a recent stable version as of 2026-02-14. Always use the latest stable version for your projects. You can check the official Testcontainers Java releases for the most up-to-date information.
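Since the three Testcontainers modules must share a version, you can optionally import the Testcontainers BOM in your <dependencyManagement> section and drop the per-dependency <version> tags. A sketch (substitute whatever version you standardized on):

```xml
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.testcontainers</groupId>
            <artifactId>testcontainers-bom</artifactId>
            <version>1.19.4</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>
```

With the BOM in place, a version bump happens in exactly one spot.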

After adding these dependencies, run mvn clean install (or ./gradlew build if using Gradle) to ensure they are downloaded and the project compiles.

Step 3: Database Integration Test (PostgreSQL)

Let’s write our first integration test for the ProductRepository, making sure it can connect to and interact with a real PostgreSQL database.

Create src/test/java/com/example/productservice/repository/ProductRepositoryIntegrationTest.java:

package com.example.productservice.repository;

import com.example.productservice.domain.Product;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.jdbc.AutoConfigureTestDatabase;
import org.springframework.boot.test.autoconfigure.orm.jpa.DataJpaTest;
import org.springframework.boot.testcontainers.service.connection.ServiceConnection; // Spring Boot 3.1+
import org.springframework.test.context.DynamicPropertySource;
import org.springframework.test.context.DynamicPropertyRegistry;
import org.testcontainers.containers.PostgreSQLContainer;
import org.testcontainers.junit.jupiter.Container;
import org.testcontainers.junit.jupiter.Testcontainers;

import java.math.BigDecimal;
import java.util.Optional;

import static org.assertj.core.api.Assertions.assertThat;

@Testcontainers // 1. Activates Testcontainers for this test class
@DataJpaTest // 2. Configures slice test for JPA components
@AutoConfigureTestDatabase(replace = AutoConfigureTestDatabase.Replace.NONE) // 3. Prevents Spring from replacing our DB
class ProductRepositoryIntegrationTest {

    @Container // 4. Declares a Testcontainers container
    @ServiceConnection // 5. Spring Boot 3.1+ specific annotation for auto-configuration
    static PostgreSQLContainer<?> postgres = new PostgreSQLContainer<>("postgres:16.2"); // 6. Uses PostgreSQL image

    @Autowired
    private ProductRepository productRepository;

    // Optional: For Spring Boot versions < 3.1, or if @ServiceConnection doesn't cover your needs.
    // @DynamicPropertySource // 7. Alternative: Programmatically set Spring properties
    // static void configureProperties(DynamicPropertyRegistry registry) {
    //     registry.add("spring.datasource.url", postgres::getJdbcUrl);
    //     registry.add("spring.datasource.username", postgres::getUsername);
    //     registry.add("spring.datasource.password", postgres::getPassword);
    // }

    @Test
    void shouldSaveAndFindProduct() {
        // Given
        Product newProduct = new Product("Laptop", "Powerful new laptop", BigDecimal.valueOf(1200.00));

        // When
        Product savedProduct = productRepository.save(newProduct);

        // Then
        assertThat(savedProduct).isNotNull();
        assertThat(savedProduct.getId()).isNotNull();
        assertThat(savedProduct.getName()).isEqualTo("Laptop");

        Optional<Product> foundProduct = productRepository.findById(savedProduct.getId());
        assertThat(foundProduct).isPresent();
        assertThat(foundProduct.get().getName()).isEqualTo("Laptop");
    }

    @Test
    void shouldDeleteProduct() {
        // Given
        Product productToDelete = productRepository.save(new Product("Mouse", "Ergonomic mouse", BigDecimal.valueOf(50.00)));
        assertThat(productToDelete.getId()).isNotNull();

        // When
        productRepository.deleteById(productToDelete.getId());

        // Then
        Optional<Product> foundProduct = productRepository.findById(productToDelete.getId());
        assertThat(foundProduct).isNotPresent();
    }
}

Let’s break down this powerful test step-by-step:

  1. @Testcontainers: This annotation from Testcontainers JUnit Jupiter module tells JUnit to look for and manage @Container fields. It ensures that any containers declared will be started before tests run and stopped after all tests in the class complete.
  2. @DataJpaTest: This Spring Boot annotation is fantastic for testing JPA components. It auto-configures an in-memory database by default and scans for JPA entities and repositories.
  3. @AutoConfigureTestDatabase(replace = AutoConfigureTestDatabase.Replace.NONE): This is crucial! By default, @DataJpaTest tries to replace any configured data source with an in-memory H2 database. We explicitly tell it not to do this, so we can use our Testcontainers-managed PostgreSQL instance instead.
  4. @Container: This Testcontainers annotation marks the postgres field as a container that needs to be managed. Because it’s a static field, Testcontainers will start this container once for all tests in this class and reuse it. This is a good practice for performance when the container setup is costly.
  5. @ServiceConnection: This is a powerful feature introduced in Spring Boot 3.1. When applied to a static Testcontainers container field (like our PostgreSQLContainer), Spring Boot automatically detects the container and its connection properties (like JDBC URL, username, password) and injects them into the Spring application context. This means you don’t need to manually configure spring.datasource.url, etc., making the setup much cleaner!
  6. static PostgreSQLContainer<?> postgres = new PostgreSQLContainer<>("postgres:16.2");: We declare a static field of type PostgreSQLContainer, a specialized container type provided by the Testcontainers PostgreSQL module, and pass "postgres:16.2" as the Docker image to use. The image is pulled if it’s not already cached on your machine; pinning a specific version like 16.2 is a best practice for reproducible tests. The <?> wildcard accounts for the container’s self-referencing generic type (which powers its fluent API) without falling back to a raw type.
  7. @DynamicPropertySource (Commented Out Alternative): Before Spring Boot 3.1 and @ServiceConnection, you would use @DynamicPropertySource to programmatically provide the container’s connection details to Spring’s Environment. While @ServiceConnection is preferred for modern Spring Boot applications, @DynamicPropertySource remains a powerful and flexible option for scenarios where @ServiceConnection might not apply or for older Spring Boot versions.

When you run this test, Testcontainers will:

  • Start a PostgreSQL 16.2 Docker container.
  • Spring Boot, thanks to @ServiceConnection, will configure the application’s data source to connect to this running container.
  • Your ProductRepository will then interact with this real PostgreSQL instance.
  • After the tests complete, the PostgreSQL container will be stopped and removed.

Try running mvn test from your project root, or execute the test class from your IDE. You should see Docker pulling the postgres:16.2 image if it’s not local, and then your tests should pass!

Step 4: Message Broker Integration Test (Kafka)

Now, let’s add Kafka to the mix. We’ll create a simple Kafka producer and consumer within our application and then test their interaction using a Testcontainers-managed Kafka broker.

  1. Create Kafka Configuration: First, we need some configuration for our Kafka client in Spring Boot. Create src/main/java/com/example/productservice/config/KafkaConfig.java:

    package com.example.productservice.config;
    
    import org.apache.kafka.clients.admin.NewTopic;
    import org.springframework.beans.factory.annotation.Value;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.kafka.core.KafkaTemplate;
    import org.springframework.kafka.core.ProducerFactory;
    import org.springframework.kafka.annotation.EnableKafka;
    import org.springframework.kafka.config.TopicBuilder;
    
    @Configuration
    @EnableKafka // Enables Kafka listener infrastructure
    public class KafkaConfig {
    
        @Value("${app.kafka.product-topic}")
        private String productTopic;
    
        // Bean to create the Kafka topic if it doesn't exist
        @Bean
        public NewTopic productTopic() {
            // Topic with 1 partition and 1 replica
            return TopicBuilder.name(productTopic)
                    .partitions(1)
                    .replicas(1)
                    .build();
        }
    
        // We don't need to explicitly configure ProducerFactory/KafkaTemplate here,
        // as Spring Boot auto-configures them if spring.kafka.bootstrap-servers is set.
        // But for demonstration, if you needed custom configuration:
        // @Bean
        // public KafkaTemplate<String, String> kafkaTemplate(ProducerFactory<String, String> producerFactory) {
        //     return new KafkaTemplate<>(producerFactory);
        // }
    }
    

    And add the topic name to application.properties:

    # application.properties
    # ... other properties ...
    app.kafka.product-topic=product-events
    
  2. Create a Kafka Producer Service: This service will send messages to our Kafka topic. Create src/main/java/com/example/productservice/service/ProductEventProducer.java:

    package com.example.productservice.service;
    
    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;
    import org.springframework.beans.factory.annotation.Value;
    import org.springframework.kafka.core.KafkaTemplate;
    import org.springframework.stereotype.Service;
    
    @Service
    public class ProductEventProducer {
    
        private static final Logger log = LoggerFactory.getLogger(ProductEventProducer.class);
        private final KafkaTemplate<String, String> kafkaTemplate;
    
        @Value("${app.kafka.product-topic}")
        private String productTopic;
    
        public ProductEventProducer(KafkaTemplate<String, String> kafkaTemplate) {
            this.kafkaTemplate = kafkaTemplate;
        }
    
        public void sendProductCreatedEvent(String productId, String productName) {
            String message = String.format("Product Created: ID=%s, Name=%s", productId, productName);
            log.info("Sending message to topic {}: {}", productTopic, message);
            kafkaTemplate.send(productTopic, productId, message);
        }
    }
    

    This simple service injects KafkaTemplate and sends a string message to our product-events topic.
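    Since the payload is just a formatted string, you can see exactly what lands on the topic with plain String.format; this small sketch mirrors the producer's formatting logic (EventFormatDemo is illustrative, not part of the application):

```java
// Mirrors the payload format used in ProductEventProducer.sendProductCreatedEvent.
class EventFormatDemo {
    static String productCreatedMessage(String productId, String productName) {
        return String.format("Product Created: ID=%s, Name=%s", productId, productName);
    }

    public static void main(String[] args) {
        // The Kafka record key is the product id; this is the record value.
        System.out.println(productCreatedMessage("42", "Laptop"));
        // Product Created: ID=42, Name=Laptop
    }
}
```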

  3. Create a Kafka Consumer (for testing verification): For our tests, we’ll need a way to verify that messages were actually sent to Kafka. A simple test consumer will help. This consumer won’t be part of the main application flow, but rather a utility for tests.

    Create src/test/java/com/example/productservice/kafka/TestKafkaConsumer.java:

    package com.example.productservice.kafka;
    
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.springframework.kafka.annotation.KafkaListener;
    import org.springframework.stereotype.Component;
    
    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.List;
    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;
    import java.util.concurrent.TimeUnit;
    
    @Component
    public class TestKafkaConsumer {
    
        // A thread-safe queue to store received messages for inspection in tests
        private final BlockingQueue<ConsumerRecord<String, String>> records = new LinkedBlockingQueue<>();
    
        @KafkaListener(topics = "${app.kafka.product-topic}", groupId = "test-group", autoStartup = "false") // autoStartup=false so we can manually control in tests
        public void listen(ConsumerRecord<String, String> record) {
            records.add(record);
            System.out.println("Received by TestKafkaConsumer: " + record.value());
        }
    
        public void clear() {
            records.clear();
        }
    
        public List<ConsumerRecord<String, String>> getRecords(int count, long timeoutSeconds) throws InterruptedException {
            List<ConsumerRecord<String, String>> receivedRecords = new ArrayList<>();
            for (int i = 0; i < count; i++) {
                ConsumerRecord<String, String> record = records.poll(timeoutSeconds, TimeUnit.SECONDS);
                if (record != null) {
                    receivedRecords.add(record);
                } else {
                    System.out.println("Timeout waiting for Kafka record " + (i+1));
                    break;
                }
            }
            return Collections.unmodifiableList(receivedRecords);
        }
    }
    

    Important: Notice autoStartup = "false" on the @KafkaListener. This prevents the listener from starting automatically with the application context. We will manually start it in our tests when needed. This is a good practice for test-specific consumers, allowing you to control when they begin consuming.
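    The getRecords helper above is built on BlockingQueue.poll with a timeout. Here is that waiting pattern in isolation, stdlib only, with plain strings standing in for ConsumerRecords:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

class PollDemo {
    // Waits up to timeoutSeconds for each of `count` items, stopping early on timeout.
    static List<String> drain(BlockingQueue<String> queue, int count, long timeoutSeconds)
            throws InterruptedException {
        List<String> received = new ArrayList<>();
        for (int i = 0; i < count; i++) {
            String item = queue.poll(timeoutSeconds, TimeUnit.SECONDS);
            if (item == null) break; // timed out waiting for the next item
            received.add(item);
        }
        return received;
    }

    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> queue = new LinkedBlockingQueue<>();
        queue.add("event-1");
        queue.add("event-2");
        // Asks for 3 items but only 2 ever arrive; returns both after the timeout.
        System.out.println(drain(queue, 3, 1));
    }
}
```

The same trade-off applies in the real consumer: a generous timeout makes tests resilient to slow container startup, at the cost of slower failures when a message genuinely never arrives.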

  4. Kafka Integration Test: Now, let’s write a test that uses KafkaContainer and our producer/consumer.

    Create src/test/java/com/example/productservice/service/ProductEventProducerIntegrationTest.java:

    package com.example.productservice.service;
    
    import com.example.productservice.kafka.TestKafkaConsumer;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.junit.jupiter.api.BeforeEach;
    import org.junit.jupiter.api.Test;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.boot.test.context.SpringBootTest;
    import org.springframework.boot.testcontainers.service.connection.ServiceConnection;
    import org.springframework.kafka.config.KafkaListenerEndpointRegistry;
    import org.springframework.test.context.ActiveProfiles;
    import org.springframework.test.context.DynamicPropertyRegistry;
    import org.springframework.test.context.DynamicPropertySource;
    import org.testcontainers.containers.KafkaContainer;
    import org.testcontainers.junit.jupiter.Container;
    import org.testcontainers.junit.jupiter.Testcontainers;
    import org.testcontainers.utility.DockerImageName;
    
    import java.util.List;
    
    import static org.assertj.core.api.Assertions.assertThat;
    import static org.awaitility.Awaitility.await; // optional: for condition-based waiting
    
    // We use @SpringBootTest to bring up the full application context including Kafka components
    @SpringBootTest
    @Testcontainers
    @ActiveProfiles("test") // Ensures test-specific properties can be loaded
    class ProductEventProducerIntegrationTest {
    
        @Container
        @ServiceConnection // Automatically connects Spring Boot to this Kafka container
        static KafkaContainer kafka = new KafkaContainer(DockerImageName.parse("confluentinc/cp-kafka:7.5.3")); // Use a specific version
    
        @Autowired
        private ProductEventProducer productEventProducer;
    
        @Autowired
        private TestKafkaConsumer testKafkaConsumer; // Our test utility consumer
    
        @Autowired
        private KafkaListenerEndpointRegistry listenerRegistry;
    
        @BeforeEach
        void setUp() {
            testKafkaConsumer.clear(); // Clear messages before each test
            // TestKafkaConsumer's listener is declared with autoStartup = "false",
            // so we start its container explicitly before each test.
            listenerRegistry.getListenerContainers().stream()
                    .filter(container -> !container.isRunning())
                    .forEach(container -> container.start());
        }
    
        // We can add DynamicPropertySource as a fallback or for custom properties
        @DynamicPropertySource
        static void kafkaProperties(DynamicPropertyRegistry registry) {
            // Spring Boot's @ServiceConnection usually handles the common Kafka properties,
            // but custom or additional properties still go through DynamicPropertySource.
            registry.add("spring.kafka.bootstrap-servers", kafka::getBootstrapServers);
            registry.add("spring.kafka.consumer.group-id", () -> "test-group"); // Consumer group for our TestKafkaConsumer
            // Read from the earliest offset so the listener cannot miss a message
            // sent before its partition assignment completes.
            registry.add("spring.kafka.consumer.auto-offset-reset", () -> "earliest");
        }
    
        @Test
        void shouldSendProductCreatedEventToKafka() throws InterruptedException {
            // Given
            String productId = "PROD-001";
            String productName = "Test Product";
    
            // When
            productEventProducer.sendProductCreatedEvent(productId, productName);
    
            // Then
            // Wait for the consumer to receive the message
            List<ConsumerRecord<String, String>> receivedRecords = testKafkaConsumer.getRecords(1, 10); // Wait up to 10 seconds
    
            assertThat(receivedRecords).hasSize(1);
            ConsumerRecord<String, String> record = receivedRecords.get(0);
            assertThat(record.key()).isEqualTo(productId);
            assertThat(record.value()).contains("Product Created");
            assertThat(record.value()).contains(productName);
        }
    }
    

Breaking down the Kafka test:

  • @SpringBootTest: We use this to load the full Spring application context, including our Kafka producer and listener configurations.
  • @Container static KafkaContainer kafka = new KafkaContainer(...): Similar to PostgreSQL, we declare a static Kafka container. We use DockerImageName.parse("confluentinc/cp-kafka:7.5.3") to specify the Kafka Docker image. Confluent provides robust Kafka images, and 7.5.3 is chosen as a stable version as of 2026.
  • @ServiceConnection: Again, this Spring Boot 3.1+ annotation automatically configures spring.kafka.bootstrap-servers and other necessary properties based on the running KafkaContainer.
  • @DynamicPropertySource: We manually add spring.kafka.consumer.group-id to ensure our TestKafkaConsumer is part of a known group. While @ServiceConnection handles bootstrap-servers, custom properties often still require DynamicPropertySource.
  • @Autowired ProductEventProducer: We inject our producer service to trigger message sending.
  • @Autowired TestKafkaConsumer: We inject our custom test consumer to inspect received messages.
  • @BeforeEach void setUp(): We clear the consumer’s message queue before each test to ensure test isolation.
  • testKafkaConsumer.getRecords(1, 10): This utility method from our TestKafkaConsumer is crucial. It waits up to 10 seconds for 1 record to arrive. Message consumption in Kafka is asynchronous, so we need to wait for messages to be processed. For more advanced waiting, you could use Awaitility (as imported in the example).

Run this test, and you’ll see Testcontainers spin up a Kafka container, Spring Boot connect to it, and your producer successfully send a message that your test consumer picks up!
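Regarding the Awaitility import mentioned above: rather than asking getRecords for a fixed count, you can poll any boolean condition until it holds or a timeout expires, which is the pattern Awaitility packages up. A hand-rolled stdlib equivalent looks like this (awaitTrue is a hypothetical helper name, not Awaitility's API):

```java
import java.util.function.BooleanSupplier;

class AwaitDemo {
    // Polls `condition` every 100 ms until it is true or timeoutMillis elapses.
    static boolean awaitTrue(BooleanSupplier condition, long timeoutMillis)
            throws InterruptedException {
        long deadline = System.currentTimeMillis() + timeoutMillis;
        while (System.currentTimeMillis() < deadline) {
            if (condition.getAsBoolean()) return true;
            Thread.sleep(100);
        }
        return condition.getAsBoolean(); // one last check at the deadline
    }

    public static void main(String[] args) throws InterruptedException {
        long start = System.currentTimeMillis();
        // Condition becomes true after roughly 300 ms; we allow up to 2 s.
        boolean ok = awaitTrue(() -> System.currentTimeMillis() - start > 300, 2000);
        System.out.println("condition met: " + ok);
    }
}
```

Awaitility adds niceties on top of this (fluent timeouts, poll intervals, assertion integration), which is why it is the usual choice for asynchronous assertions in integration tests.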

Step 5: Full Microservice Integration Test (Controller/Service with DB & Kafka)

Finally, let’s create a full integration test that covers an end-to-end flow: an HTTP request comes in, the service interacts with the database, and publishes a message to Kafka.

  1. Create Service and Controller: First, let’s flesh out our application logic.

    Create src/main/java/com/example/productservice/service/ProductService.java:

    package com.example.productservice.service;
    
    import com.example.productservice.domain.Product;
    import com.example.productservice.kafka.ProductEventProducer; // adjust if your producer lives in a different package
    import com.example.productservice.repository.ProductRepository;
    import org.springframework.stereotype.Service;
    import org.springframework.transaction.annotation.Transactional;
    
    import java.math.BigDecimal;
    import java.util.List;
    import java.util.Optional;
    
    @Service
    public class ProductService {
    
        private final ProductRepository productRepository;
        private final ProductEventProducer productEventProducer;
    
        public ProductService(ProductRepository productRepository, ProductEventProducer productEventProducer) {
            this.productRepository = productRepository;
            this.productEventProducer = productEventProducer;
        }
    
        @Transactional
        public Product createProduct(String name, String description, BigDecimal price) {
            Product product = new Product(name, description, price);
            Product savedProduct = productRepository.save(product);
            // After saving, publish an event to Kafka
            productEventProducer.sendProductCreatedEvent(String.valueOf(savedProduct.getId()), savedProduct.getName());
            return savedProduct;
        }
    
        @Transactional(readOnly = true)
        public Optional<Product> getProductById(Long id) {
            return productRepository.findById(id);
        }
    
        @Transactional(readOnly = true)
        public List<Product> getAllProducts() {
            return productRepository.findAll();
        }
    }
    

    This service now uses both ProductRepository and ProductEventProducer.
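
    One design point worth noting: because sendProductCreatedEvent is called inside the @Transactional method, the Kafka event is sent before the database transaction commits — if the commit subsequently fails, consumers could see an event for a product that was never persisted. One possible mitigation, sketched here with Spring's real TransactionSynchronization API (the surrounding method body is illustrative, not the chapter's final code), is to defer the publish until after commit:

```java
// Sketch: publish the Kafka event only after the database transaction commits.
// Requires org.springframework.transaction.support.TransactionSynchronization
// and TransactionSynchronizationManager imports.
@Transactional
public Product createProduct(String name, String description, BigDecimal price) {
    Product savedProduct = productRepository.save(new Product(name, description, price));
    TransactionSynchronizationManager.registerSynchronization(new TransactionSynchronization() {
        @Override
        public void afterCommit() {
            // Runs only if the transaction committed successfully
            productEventProducer.sendProductCreatedEvent(
                    String.valueOf(savedProduct.getId()), savedProduct.getName());
        }
    });
    return savedProduct;
}
```

    For this chapter's purposes the simpler publish-inside-transaction version is fine; just be aware of the trade-off.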

    Now, create a REST controller to expose these operations. Create src/main/java/com/example/productservice/web/ProductController.java:

    package com.example.productservice.web;
    
    import com.example.productservice.domain.Product;
    import com.example.productservice.service.ProductService;
    import org.springframework.http.HttpStatus;
    import org.springframework.http.ResponseEntity;
    import org.springframework.web.bind.annotation.*;
    
    import java.math.BigDecimal;
    import java.util.List;
    import java.util.Map;
    
    @RestController
    @RequestMapping("/api/products")
    public class ProductController {
    
        private final ProductService productService;
    
        public ProductController(ProductService productService) {
            this.productService = productService;
        }
    
        @PostMapping
        public ResponseEntity<Product> createProduct(@RequestBody Map<String, Object> productRequest) {
            String name = (String) productRequest.get("name");
            String description = (String) productRequest.get("description");
            BigDecimal price = new BigDecimal(productRequest.get("price").toString());
    
            Product product = productService.createProduct(name, description, price);
            return new ResponseEntity<>(product, HttpStatus.CREATED);
        }
    
        @GetMapping("/{id}")
        public ResponseEntity<Product> getProductById(@PathVariable Long id) {
            return productService.getProductById(id)
                    .map(product -> new ResponseEntity<>(product, HttpStatus.OK))
                    .orElse(new ResponseEntity<>(HttpStatus.NOT_FOUND));
        }
    
        @GetMapping
        public ResponseEntity<List<Product>> getAllProducts() {
            List<Product> products = productService.getAllProducts();
            return new ResponseEntity<>(products, HttpStatus.OK);
        }
    }
    

    A simple REST controller with endpoints for creating, retrieving by ID, and listing products.
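
    A quick aside on the price conversion: the controller calls new BigDecimal(productRequest.get("price").toString()). Going through toString() matters, because JSON numbers deserialize to Double when the target type is Object, and constructing a BigDecimal directly from a double exposes binary floating-point error. A standalone illustration (plain Java, independent of the chapter's code):

```java
import java.math.BigDecimal;

public class BigDecimalDemo {
    public static void main(String[] args) {
        // JSON deserializes 199.99 into a Double when the declared type is Object
        Object price = Double.valueOf(199.99);

        // Constructing from the double directly captures binary floating-point error
        BigDecimal fromDouble = new BigDecimal(199.99);
        // Going through toString() yields the decimal value the client actually sent
        BigDecimal fromString = new BigDecimal(price.toString());

        System.out.println(fromString);                     // 199.99
        System.out.println(fromDouble.equals(fromString));  // false
    }
}
```
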

  2. Full Application Integration Test: Now, let’s write an end-to-end test using both PostgreSQL and Kafka containers.

    Create src/test/java/com/example/productservice/web/ProductControllerIntegrationTest.java:

    package com.example.productservice.web;
    
    import com.example.productservice.domain.Product;
    import com.example.productservice.kafka.TestKafkaConsumer;
    import com.fasterxml.jackson.databind.ObjectMapper;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.junit.jupiter.api.BeforeEach;
    import org.junit.jupiter.api.Test;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.boot.test.context.SpringBootTest;
    import org.springframework.boot.test.web.client.TestRestTemplate;
    import org.springframework.boot.test.web.server.LocalServerPort;
    import org.springframework.boot.testcontainers.service.connection.ServiceConnection;
    import org.springframework.http.HttpEntity;
    import org.springframework.http.HttpHeaders;
    import org.springframework.http.MediaType;
    import org.testcontainers.containers.KafkaContainer;
    import org.testcontainers.containers.PostgreSQLContainer;
    import org.testcontainers.junit.jupiter.Container;
    import org.testcontainers.junit.jupiter.Testcontainers;
    import org.testcontainers.utility.DockerImageName;
    
    import java.math.BigDecimal;
    import java.util.List;
    import java.util.Map;
    import java.util.concurrent.TimeUnit;
    
    import static org.assertj.core.api.Assertions.assertThat;
    import static org.awaitility.Awaitility.await;
    
    @SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT) // 1. Run on a random port
    @Testcontainers
    class ProductControllerIntegrationTest {
    
        // 2. Declare and manage both containers
        @Container
        @ServiceConnection
        static PostgreSQLContainer<?> postgres = new PostgreSQLContainer<>("postgres:16.2");
    
        @Container
        @ServiceConnection
        static KafkaContainer kafka = new KafkaContainer(DockerImageName.parse("confluentinc/cp-kafka:7.5.3"));
    
        @LocalServerPort // 3. Inject the random port for our HTTP client
        private int port;
    
        @Autowired
        private TestRestTemplate restTemplate; // 4. Spring Boot's convenient HTTP client for tests
    
        @Autowired
        private TestKafkaConsumer testKafkaConsumer; // Our utility Kafka consumer
    
        @Autowired
        private ObjectMapper objectMapper; // For converting JSON responses
    
        @BeforeEach
        void setUp() {
            testKafkaConsumer.clear(); // Clear Kafka messages before each test
        }
    
        @Test
        void shouldCreateProductAndPublishEvent() throws Exception {
            // Given
            String baseUrl = "http://localhost:" + port + "/api/products";
            HttpHeaders headers = new HttpHeaders();
            headers.setContentType(MediaType.APPLICATION_JSON);
    
            Map<String, Object> productRequest = Map.of(
                    "name", "Wireless Headset",
                    "description", "Noise-cancelling, comfortable headset",
                    "price", BigDecimal.valueOf(199.99)
            );
            HttpEntity<Map<String, Object>> request = new HttpEntity<>(productRequest, headers);
    
            // When
            ResponseEntity<Product> response = restTemplate.postForEntity(baseUrl, request, Product.class);
    
            // Then - Verify HTTP response
            assertThat(response.getStatusCode().is2xxSuccessful()).isTrue();
            Product createdProduct = response.getBody();
            assertThat(createdProduct).isNotNull();
            assertThat(createdProduct.getId()).isNotNull();
            assertThat(createdProduct.getName()).isEqualTo("Wireless Headset");
            assertThat(createdProduct.getPrice()).isEqualByComparingTo(BigDecimal.valueOf(199.99));
    
            // Then - Verify database interaction (optional, but good for full coverage)
            // You might query the database directly or via another API call
            ResponseEntity<Product> getResponse = restTemplate.getForEntity(baseUrl + "/" + createdProduct.getId(), Product.class);
            assertThat(getResponse.getStatusCode().is2xxSuccessful()).isTrue();
            assertThat(getResponse.getBody()).isEqualTo(createdProduct); // Compares by ID due to equals()
    
            // Then - Verify Kafka message
            // Use Awaitility for more robust asynchronous waiting
            await().atMost(10, TimeUnit.SECONDS).untilAsserted(() -> {
                List<ConsumerRecord<String, String>> receivedRecords = testKafkaConsumer.getRecords(1, 1); // Poll for 1 record, 1 sec timeout each try
                assertThat(receivedRecords).hasSize(1);
                ConsumerRecord<String, String> record = receivedRecords.get(0);
                assertThat(record.key()).isEqualTo(String.valueOf(createdProduct.getId()));
                assertThat(record.value()).contains("Product Created");
                assertThat(record.value()).contains(createdProduct.getName());
            });
        }
    }
    

Breakdown of the Full Integration Test:

  1. @SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT): This annotation tells Spring Boot to load the entire application context and start a real web server on a random available port. This allows us to make actual HTTP requests to our controller.
  2. @Container for both PostgreSQL and Kafka: Both containers are declared static and annotated with @ServiceConnection. This means Testcontainers will start both, and Spring Boot will automatically configure the application’s data source and Kafka client to point to these dynamically running containers.
  3. @LocalServerPort: Spring injects the actual random port the test server started on, which we use to construct our request URLs.
  4. @Autowired TestRestTemplate: Spring Boot provides this convenient HTTP client specifically for integration tests, making it easy to send requests and assert responses.
  5. Test Flow:
    • We construct an HTTP POST request to create a new product.
    • We assert the HTTP response (status code, body).
    • We optionally make another HTTP GET request to verify the product was saved to the database (and retrieved correctly). This implicitly tests the ProductRepository and its connection to the PostgreSQL container.
    • Crucially, we then use Awaitility and our TestKafkaConsumer to wait for and verify that a “Product Created” event was indeed published to the Kafka topic. This confirms the ProductEventProducer correctly interacted with the Kafka container.

This single test now validates the entire vertical slice of our microservice: HTTP layer -> Service layer -> JPA/Database interaction -> Kafka interaction. All powered by real external dependencies managed by Testcontainers.
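
The wait-and-assert pattern that Awaitility provides in the test above boils down to a poll-until-deadline loop. Here is a framework-free sketch of the idea (illustrative plain Java; this is not the chapter's TestKafkaConsumer implementation):

```java
import java.util.concurrent.TimeUnit;
import java.util.function.BooleanSupplier;

public class PollUntil {
    // Polls the condition until it holds or the timeout elapses.
    static boolean awaitCondition(BooleanSupplier condition, long timeout, TimeUnit unit)
            throws InterruptedException {
        long deadline = System.nanoTime() + unit.toNanos(timeout);
        while (System.nanoTime() < deadline) {
            if (condition.getAsBoolean()) {
                return true;
            }
            Thread.sleep(100); // back off briefly between polls
        }
        return condition.getAsBoolean(); // one final check at the deadline
    }

    public static void main(String[] args) throws InterruptedException {
        long start = System.currentTimeMillis();
        // Condition becomes true ~300 ms after start, mimicking an async Kafka delivery
        boolean arrived = awaitCondition(
                () -> System.currentTimeMillis() - start > 300,
                2, TimeUnit.SECONDS);
        System.out.println("arrived=" + arrived); // arrived=true
    }
}
```

Awaitility adds nicer failure messages and retryable assertions on top of this basic loop, which is why the test uses untilAsserted rather than a hand-rolled wait.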

Mini-Challenge

You’ve built a robust integration test! Now, for a small challenge to solidify your understanding:

Challenge: Extend the ProductControllerIntegrationTest to include a scenario where you list all products.

  1. Before making the GET request, create a couple of products using the POST endpoint (as shown in the shouldCreateProductAndPublishEvent test).
  2. Then, make a GET request to /api/products (without an ID).
  3. Assert that the response contains the two products you created.

Hint:

  • You’ll need to modify the shouldCreateProductAndPublishEvent test or create a new test method.
  • The restTemplate.getForEntity(url, List.class) method can be tricky with generic types. Consider using restTemplate.exchange(url, HttpMethod.GET, null, new ParameterizedTypeReference<List<Product>>() {}) for better type safety, or simplify by just checking the size and a few properties of the returned list.
  • Remember that with @SpringBootTest and a real Testcontainers database, Spring does not roll back transactions between tests, so data created in one test remains visible to the next. To keep tests independent you could annotate test methods with @Transactional, or inject ProductRepository and call productRepository.deleteAll() in an @AfterEach method. For this challenge, it’s fine to simply assert on the list size and tolerate products left over from other tests.
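
The exchange(...) call from the hint looks like this in context. A sketch only — the assertions are illustrative and assume the imports org.springframework.http.HttpMethod and org.springframework.core.ParameterizedTypeReference:

```java
// Sketch: retrieving a typed List<Product> with TestRestTemplate
ResponseEntity<List<Product>> listResponse = restTemplate.exchange(
        "http://localhost:" + port + "/api/products",
        HttpMethod.GET,
        null,
        new ParameterizedTypeReference<List<Product>>() {});

assertThat(listResponse.getStatusCode().is2xxSuccessful()).isTrue();
assertThat(listResponse.getBody()).isNotNull();
// Assert on size and properties according to the products you created
assertThat(listResponse.getBody().size()).isGreaterThanOrEqualTo(2);
```
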

What to Observe/Learn:

  • How to chain multiple API calls in an integration test.
  • Handling collections in HTTP responses from Spring’s TestRestTemplate.
  • Reinforcing the end-to-end flow.

Common Pitfalls & Troubleshooting

Even with powerful tools like Testcontainers, you might encounter issues. Here are some common ones and how to approach them:

  1. Docker Daemon Not Running:

    • Symptom: You’ll see an error message like “Could not connect to Docker daemon” or “Cannot connect to the Docker daemon at unix:///var/run/docker.sock”.
    • Fix: Ensure Docker Desktop (on Windows/macOS) or the Docker service (on Linux) is running before you execute your tests.
  2. Image Pull Failures:

    • Symptom: “Error pulling image: postgres:16.2” or similar, often due to network issues or incorrect image names.
    • Fix: Check your internet connection. Verify the image name and tag are correct (e.g., postgres:16.2 vs. postgresql:latest). You can try pulling the image manually using docker pull postgres:16.2 to see the exact error.
  3. Container Startup Timeouts:

    • Symptom: “Container startup failed: GenericContainer was not healthy within 60 seconds.” (or similar). Testcontainers waits for a container to become “ready” (e.g., a database port is open, a specific log message appears).
    • Fix:
      • Increase Timeout: If your machine is slow or the container is genuinely heavy, you can increase the startup timeout: new PostgreSQLContainer<>("postgres:16.2").withStartupTimeout(Duration.ofSeconds(120)).
      • Check Logs: Look at the container logs (Testcontainers usually prints them on failure) to understand why the container isn’t starting. Is there a configuration error inside it?
      • Resource Limits: Ensure your Docker daemon has enough CPU and memory allocated, especially if running multiple containers.
  4. Spring Boot Context Not Connecting to Containers:

    • Symptom: “Datasource connection refused,” “Kafka broker unreachable.” This means Spring Boot isn’t picking up the dynamic connection details from Testcontainers.
    • Fix:
      • Spring Boot 3.1+ (@ServiceConnection): Double-check that @ServiceConnection is correctly applied to your static @Container fields. Ensure you have the spring-boot-testcontainers dependency.
      • Older Spring Boot / Custom Properties (@DynamicPropertySource): Verify that your @DynamicPropertySource method correctly adds the properties (e.g., spring.datasource.url, spring.kafka.bootstrap-servers) using the container’s methods (e.g., postgres::getJdbcUrl).
      • Property Overrides: Ensure no other application-test.properties or similar files are inadvertently overriding the Testcontainers-provided properties.
  5. Kafka Message Not Received in Test:

    • Symptom: Your assertion for Kafka messages fails, or the TestKafkaConsumer times out.
    • Fix:
      • Asynchronous Nature: Remember Kafka is asynchronous. You must implement a wait strategy (like testKafkaConsumer.getRecords() or Awaitility). Don’t just assert immediately after sending.
      • Topic Name/Consumer Group: Verify your producer is sending to the correct topic name and your consumer is listening to the same topic. Ensure the consumer has a unique group-id for your tests.
      • Listener Startup: If you disabled autoStartup for your test consumer, ensure you manually start it or manage its lifecycle correctly within your test setup.

Summary

Fantastic work! You’ve just built a fully functional integration test suite for a Spring Boot microservice, leveraging the power of Testcontainers to provide real, disposable dependencies.

Here are the key takeaways from this chapter:

  • Realism is Key: Testcontainers allows you to test your microservices against actual instances of databases (PostgreSQL) and message brokers (Kafka), ensuring higher fidelity tests than mocks or in-memory fakes.
  • Seamless Spring Boot Integration: With Spring Boot 3.1+ and the spring-boot-testcontainers module, annotations like @ServiceConnection make integrating Testcontainers with your application context incredibly simple, dynamically configuring connection properties.
  • Container Lifecycle Management: @Testcontainers and @Container annotations provide robust control over starting and stopping containers for your test classes.
  • End-to-End Validation: You can write comprehensive tests that cover the entire flow of your microservice, from incoming HTTP requests to database persistence and outgoing message events.
  • Isolation and Repeatability: Each test run benefits from a fresh, clean container environment, eliminating test pollution and ensuring consistent results.

You now have a powerful pattern for building integration tests for Java Spring Boot microservices that will significantly boost your confidence in your application’s behavior.

What’s Next?

In the upcoming chapters, we’ll explore even more advanced Testcontainers topics, including:

  • Integrating Testcontainers into your CI/CD pipelines (GitHub Actions, GitLab CI).
  • Performance tuning and advanced container reuse strategies.
  • Testing complex multi-container application stacks.

Keep practicing, and happy coding!
