Unit testing is an essential part of software development, ensuring that individual components of an application behave as expected. Writing those tests by hand, however, is time-consuming and tedious. In this article, we will explore how to run a Large Language Model (LLM) locally with Ollama and integrate it with Spring Boot to generate unit tests automatically for Java applications.

This guide will cover:

  • Setting up Ollama LLM locally
  • Integrating it with a Spring Boot application
  • Generating unit tests for Java classes using Ollama
  • Running and validating the generated tests

By the end of this article, you will have a working implementation of AI-generated unit tests for your Java applications.

1. Setting Up Ollama LLM Locally

Ollama is an open-source tool for running large language models locally, giving you AI capabilities without relying on cloud-based services. Before integrating it with Spring Boot, we need to set it up.

Step 1: Install Ollama

Download and install Ollama from its official website:

https://ollama.ai/download

Follow the installation instructions based on your operating system (Windows, macOS, or Linux).

Step 2: Verify the Installation

After installation, verify that Ollama is running by executing the following command in your terminal:

ollama run llama2

On the first run this downloads the llama2 model and then starts an interactive session with it. The Ollama server also exposes a local HTTP API (on port 11434 by default), which is what our Spring Boot application will interact with.
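
You can also confirm that the HTTP API is reachable. Assuming the default port and that the llama2 model has already been pulled, a quick request looks like this:

curl http://localhost:11434/api/generate \
     -d '{"model": "llama2", "prompt": "Say hello", "stream": false}'

The reply is a JSON object whose response field contains the model's output.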

2. Creating a Spring Boot Application

Now that we have Ollama running locally, let’s create a Spring Boot application to interact with it.

Step 1: Generate a Spring Boot Project

You can generate a Spring Boot project using Spring Initializr with the following dependencies:

  • Spring Web
  • Spring Boot DevTools
  • Lombok

Extract the project and open it in your favorite IDE (IntelliJ IDEA, Eclipse, or VS Code).
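
After opening the project, you should have a standard Spring Boot entry point similar to the one below (the class name is just a placeholder that Spring Initializr derives from whatever artifact name you chose):

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class OllamaTestGeneratorApplication {

    public static void main(String[] args) {
        SpringApplication.run(OllamaTestGeneratorApplication.class, args);
    }
}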

Step 2: Define a Controller to Communicate with Ollama

Create a REST controller that will send requests to the locally running Ollama instance:

import java.util.Map;

import org.springframework.boot.web.client.RestTemplateBuilder;
import org.springframework.http.HttpEntity;
import org.springframework.http.HttpHeaders;
import org.springframework.http.MediaType;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.client.RestTemplate;

@RestController
@RequestMapping("/api/ollama")
public class OllamaController {

    private final RestTemplate restTemplate;

    public OllamaController(RestTemplateBuilder restTemplateBuilder) {
        this.restTemplate = restTemplateBuilder.build();
    }

    @PostMapping("/generate-test")
    public ResponseEntity<String> generateTest(@RequestBody String javaCode) {
        // Ollama's REST API listens on port 11434 by default
        String ollamaEndpoint = "http://localhost:11434/api/generate";

        HttpHeaders headers = new HttpHeaders();
        headers.setContentType(MediaType.APPLICATION_JSON);

        String prompt = "Generate JUnit 5 tests for the following Java class:\n" + javaCode;

        // /api/generate expects a JSON body with the model name and prompt;
        // stream=false returns a single JSON response instead of a stream of chunks.
        Map<String, Object> payload = Map.of(
                "model", "llama2",
                "prompt", prompt,
                "stream", false
        );

        HttpEntity<Map<String, Object>> request = new HttpEntity<>(payload, headers);
        return restTemplate.postForEntity(ollamaEndpoint, request, String.class);
    }
}

Here, we wrap the submitted Java class in a prompt and post it to Ollama's /api/generate endpoint as a JSON payload, asking the model to generate JUnit test cases for that class.
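
In practice, you will usually want to return only the generated code rather than Ollama's raw JSON. Below is a minimal extraction sketch using Jackson (already on the classpath with Spring Web); it assumes the non-streaming response format, where the generated text sits in the response field, and the helper class name is my own:

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class OllamaResponseParser {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Pulls the generated test code out of Ollama's JSON reply.
    // With "stream": false the reply is a single JSON object whose
    // "response" field carries the model output.
    public static String extractGeneratedCode(String ollamaJson) throws Exception {
        JsonNode root = MAPPER.readTree(ollamaJson);
        return root.path("response").asText();
    }
}

The controller could call this helper before returning, so callers receive plain Java source instead of the surrounding JSON envelope.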

3. Generating Unit Tests Using Ollama

Let’s define a sample Java class for which we want to generate unit tests.

Sample Java Class: Calculator

public class Calculator {
    public int add(int a, int b) {
        return a + b;
    }
    
    public int subtract(int a, int b) {
        return a - b;
    }
}

Now, let’s send this class to Ollama through our API endpoint.

Request Example (cURL Command)

curl -X POST "http://localhost:8080/api/ollama/generate-test" \
     -H "Content-Type: application/json" \
     -d "public class Calculator { public int add(int a, int b) { return a + b; } public int subtract(int a, int b) { return a - b; } }"

Expected Response

Ollama replies with a JSON object whose response field carries the generated text. Once extracted, the test class should look something like this:

import static org.junit.jupiter.api.Assertions.*;
import org.junit.jupiter.api.Test;

public class CalculatorTest {

    private final Calculator calculator = new Calculator();

    @Test
    void testAdd() {
        assertEquals(5, calculator.add(2, 3));
    }

    @Test
    void testSubtract() {
        assertEquals(1, calculator.subtract(3, 2));
    }
}

4. Running and Validating the Generated Tests

Once we receive the generated test cases, we can add them to our test directory and run them using JUnit.
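
If you want to automate that copy step as well, a small helper can write the generated source straight into the test tree. This is only a sketch; the path and file-name handling assume a single top-level test class in the default package:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class GeneratedTestWriter {

    // Writes generated test source under the conventional Maven/Gradle test root.
    public static void save(String testSource, String className) throws IOException {
        Path testRoot = Path.of("src", "test", "java");
        Files.createDirectories(testRoot);
        Files.writeString(testRoot.resolve(className + ".java"), testSource);
    }
}

Review the generated file before committing it; model output is not guaranteed to compile on the first attempt.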

Running JUnit Tests

If you’re using Maven, execute the following command:

mvn test

For Gradle, use the wrapper:

./gradlew test

If everything is correctly configured, the test results should confirm that our Calculator methods work as expected.

Conclusion

In this article, we demonstrated how to leverage Ollama LLM locally to generate JUnit tests for Java applications using Spring Boot. By setting up an API endpoint, we automated the process of sending Java classes to Ollama and receiving test cases in return. This approach can significantly reduce the time spent on writing unit tests, allowing developers to focus on core business logic.

The benefits of this approach include:

  • Time efficiency: automated generation removes much of the repetitive work of writing tests by hand.
  • Improved code coverage: generated tests make it easier to ensure that important functionality is exercised.
  • Easy integration: the solution fits into existing Java projects, requiring only a locally running Ollama instance.

However, it’s essential to review AI-generated test cases for accuracy and completeness. As LLMs continue to evolve, their ability to understand complex logic and edge cases will improve, making automated testing even more effective. By following this guide, you can integrate Ollama into your development workflow and enhance your software quality with minimal effort.