Artificial Intelligence is transforming how we build modern software applications. Whether it’s intelligent chatbots, document summarizers, or contextual assistants, integrating AI capabilities directly into Java-based backends is now easier than ever. Traditionally, integrating AI into Java required complex external API calls or Python bridges, but modern frameworks like Quarkus and LangChain4j have simplified this process significantly.
In this article, we will explore how to build AI-infused Java applications using Quarkus and LangChain4j, step by step. We will set up a Quarkus project, add LangChain4j, integrate a language model (like OpenAI or Ollama), and implement a few AI-driven features—all with clean, idiomatic Java code.
Understanding Quarkus and LangChain4j
Before diving into the code, let’s briefly understand what these two powerful tools bring to the table.
Quarkus is a modern, cloud-native Java framework designed for GraalVM and OpenJDK HotSpot. It’s known for its lightning-fast startup times, low memory footprint, and seamless developer productivity features. It’s an excellent choice for building microservices or serverless applications.
LangChain4j, on the other hand, is a Java library that brings LangChain-style AI orchestration to the JVM ecosystem. Inspired by the Python-based LangChain framework, LangChain4j allows developers to easily:
- Connect to LLMs (Large Language Models)
- Build prompt templates
- Manage chat memory
- Create agents and tools
- Process documents using embeddings and vector stores
When used together, Quarkus and LangChain4j enable developers to build robust, scalable, AI-powered Java microservices that handle complex NLP (Natural Language Processing) tasks natively.
Setting Up Your Quarkus Project
To begin, we’ll set up a new Quarkus project. You can generate one quickly using the Quarkus CLI or Maven archetype.
Once generated, navigate into your project folder:
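As a sketch (the group and artifact IDs below are placeholders; substitute your own):

```shell
# Generate a new project with the Quarkus CLI
quarkus create app com.example:quarkus-ai-demo

# Or, without the CLI, use the Quarkus Maven plugin
mvn io.quarkus.platform:quarkus-maven-plugin:create \
    -DprojectGroupId=com.example \
    -DprojectArtifactId=quarkus-ai-demo

# Navigate into the project folder
cd quarkus-ai-demo
```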
Open the project in your favorite IDE (IntelliJ IDEA, VS Code, or Eclipse).
Adding LangChain4j Dependencies
Next, we need to include the LangChain4j library and a compatible LLM provider. You can choose between multiple providers—OpenAI, Azure OpenAI, or Ollama (for local models like Llama or Mistral).
Add the following dependencies to your `pom.xml`:
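A minimal sketch using the Quarkiverse LangChain4j extensions; substitute the latest released version for the `quarkus-langchain4j.version` property, and keep only the provider you plan to use:

```xml
<!-- Quarkus LangChain4j extension for OpenAI -->
<dependency>
    <groupId>io.quarkiverse.langchain4j</groupId>
    <artifactId>quarkus-langchain4j-openai</artifactId>
    <version>${quarkus-langchain4j.version}</version>
</dependency>

<!-- Alternatively, for local models served by Ollama -->
<dependency>
    <groupId>io.quarkiverse.langchain4j</groupId>
    <artifactId>quarkus-langchain4j-ollama</artifactId>
    <version>${quarkus-langchain4j.version}</version>
</dependency>
```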
After adding these dependencies, reload your Maven project to download and integrate them.
Configuring Your LLM Provider
Next, we’ll configure LangChain4j to connect to a language model. You can either use OpenAI’s API or run Ollama locally.
For example, if you’re using OpenAI, add your API key in `application.properties`:
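A minimal sketch that reads the key from an environment variable rather than hard-coding it (the model name is just an example):

```properties
quarkus.langchain4j.openai.api-key=${OPENAI_API_KEY}

# Optional: pick a specific chat model
quarkus.langchain4j.openai.chat-model.model-name=gpt-4o-mini
```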
If you’re using Ollama (a local LLM runtime), simply ensure you have Ollama installed and a model like `llama3` available:
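For instance (assuming Ollama is installed and listening on its default port, 11434):

```shell
# Download the model once; subsequent runs reuse the local copy
ollama pull llama3
```

You can then point the extension at it with a property such as `quarkus.langchain4j.ollama.chat-model.model-id=llama3`.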
Creating an AI-Powered Service in Quarkus
Now, let’s create a simple service that communicates with the LLM.
Inside `src/main/java/com/example/`, create a new file called `AiService.java`.
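A minimal sketch of what the interface might look like (the prompt text is illustrative):

```java
package com.example;

import dev.langchain4j.service.SystemMessage;
import dev.langchain4j.service.UserMessage;
import io.quarkiverse.langchain4j.RegisterAiService;

// Quarkus generates the implementation of this interface at build time
@RegisterAiService
public interface AiService {

    @SystemMessage("You are a helpful, concise assistant for a Java developer audience.")
    String chat(@UserMessage String question);
}
```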
This interface uses the LangChain4j service annotation model, which automatically creates a proxy implementation that interacts with the underlying LLM. The `@SystemMessage` annotation defines the AI’s behavior and tone, ensuring consistent responses.
Exposing an AI REST Endpoint
Next, let’s expose our AI service as a REST endpoint using Quarkus’s RESTEasy Reactive.
Edit the existing `AiResource.java` file (created during project generation):
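One possible shape for the resource, assuming an `AiService` interface as sketched earlier (the `/ai/chat` path and the `ChatRequest` record are illustrative choices):

```java
package com.example;

import jakarta.inject.Inject;
import jakarta.ws.rs.Consumes;
import jakarta.ws.rs.POST;
import jakarta.ws.rs.Path;
import jakarta.ws.rs.Produces;
import jakarta.ws.rs.core.MediaType;

@Path("/ai")
public class AiResource {

    @Inject
    AiService aiService;

    // Simple JSON payload: {"question": "..."}
    public record ChatRequest(String question) {}

    @POST
    @Path("/chat")
    @Consumes(MediaType.APPLICATION_JSON)
    @Produces(MediaType.TEXT_PLAIN)
    public String chat(ChatRequest request) {
        return aiService.chat(request.question());
    }
}
```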
Now you can send a `POST` request to your endpoint using a JSON payload:
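For example, with `curl` (assuming the endpoint is mounted at `/ai/chat` and the app runs on Quarkus’s default port, 8080):

```shell
curl -X POST http://localhost:8080/ai/chat \
     -H "Content-Type: application/json" \
     -d '{"question": "What is Quarkus in one sentence?"}'
```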
And you should get an intelligent AI-generated response!
Adding Context Memory to Your Chat
LangChain4j supports conversational memory, enabling the AI to remember previous exchanges. This is essential for creating chatbots or virtual assistants that maintain context.
Let’s extend our service to include memory.
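One way to sketch this: a `MemoryAiService` interface plus a CDI-produced `ChatMemoryProvider`, which the Quarkus LangChain4j extension picks up automatically (the names here are illustrative):

```java
package com.example;

import dev.langchain4j.memory.chat.ChatMemoryProvider;
import dev.langchain4j.memory.chat.MessageWindowChatMemory;
import dev.langchain4j.service.SystemMessage;
import dev.langchain4j.service.UserMessage;
import io.quarkiverse.langchain4j.RegisterAiService;
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.enterprise.inject.Produces;

@RegisterAiService
public interface MemoryAiService {

    @SystemMessage("You are a helpful assistant that remembers earlier messages in the conversation.")
    String chat(@UserMessage String message);
}

@ApplicationScoped
class ChatMemoryConfig {

    // Each conversation keeps a sliding window of its most recent messages
    @Produces
    ChatMemoryProvider chatMemoryProvider() {
        return memoryId -> MessageWindowChatMemory.withMaxMessages(10);
    }
}
```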
This setup maintains the last 10 exchanges, allowing continuity in conversation. Replace the `AiService` reference in your REST resource with this new `MemoryAiService` for persistent dialogue behavior.
Building a Document Summarizer
Beyond chat, let’s demonstrate a more specialized AI use case — summarizing documents. LangChain4j includes utilities for text chunking and summarization.
Create a new file, `DocumentSummarizer.java`:
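A minimal sketch (the prompt wording is illustrative; `{text}` binds to the method parameter by name):

```java
package com.example;

import dev.langchain4j.service.SystemMessage;
import dev.langchain4j.service.UserMessage;
import io.quarkiverse.langchain4j.RegisterAiService;

@RegisterAiService
public interface DocumentSummarizer {

    @SystemMessage("You are an expert editor who writes faithful, concise summaries.")
    @UserMessage("Summarize the following text in at most five sentences:\n\n{text}")
    String summarize(String text);
}
```

For documents larger than the model’s context window, you would first split the text into segments, for example with LangChain4j’s `DocumentSplitters.recursive(...)`, and summarize segment by segment.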
You can then expose this summarizer as a REST endpoint:
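For example (the `/summarize` path and resource name are illustrative, and the endpoint accepts the raw document as plain text):

```java
package com.example;

import jakarta.inject.Inject;
import jakarta.ws.rs.Consumes;
import jakarta.ws.rs.POST;
import jakarta.ws.rs.Path;
import jakarta.ws.rs.Produces;
import jakarta.ws.rs.core.MediaType;

@Path("/summarize")
public class SummaryResource {

    @Inject
    DocumentSummarizer summarizer;

    @POST
    @Consumes(MediaType.TEXT_PLAIN)
    @Produces(MediaType.TEXT_PLAIN)
    public String summarize(String document) {
        return summarizer.summarize(document);
    }
}
```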
This endpoint can be used for summarizing articles, user inputs, or even large internal documents.
Deploying and Optimizing Your Application
Quarkus provides multiple ways to deploy your app:
- JVM Mode for standard deployments
- Native Mode for instant startup and minimal memory usage
You can build a native image using GraalVM with a single command:
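For example:

```shell
# Build a native executable (requires a local GraalVM installation)
./mvnw package -Dnative

# Or build inside a container if GraalVM is not installed locally
./mvnw package -Dnative -Dquarkus.native.container-build=true
```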
The resulting binary starts in milliseconds and is ideal for cloud environments like Kubernetes or serverless platforms.
To further optimize, you can:
- Use Quarkus Dev Mode for hot reload: `mvn quarkus:dev`
- Configure AI caching (e.g., store embeddings or responses in Redis)
- Integrate vector stores for retrieval-augmented generation (RAG)
- Secure endpoints with Quarkus Security extensions
Testing the AI Endpoints
You can write integration tests using Quarkus’s built-in test framework:
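A sketch of such a test, assuming a chat endpoint at `/ai/chat` as in the earlier resource example. Note that this calls the live model; since LLM output is nondeterministic, the assertion only checks that a non-empty response comes back (for fully reproducible tests you would stub the provider, e.g. with WireMock):

```java
package com.example;

import static io.restassured.RestAssured.given;
import static org.hamcrest.Matchers.emptyString;
import static org.hamcrest.Matchers.not;

import io.quarkus.test.junit.QuarkusTest;
import org.junit.jupiter.api.Test;

@QuarkusTest
class AiResourceTest {

    @Test
    void chatEndpointReturnsAResponse() {
        given()
            .contentType("application/json")
            .body("{\"question\": \"Say hello\"}")
        .when()
            .post("/ai/chat")
        .then()
            .statusCode(200)
            .body(not(emptyString()));
    }
}
```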
Run your tests using:
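Assuming the Maven wrapper that Quarkus generates alongside the project:

```shell
./mvnw test
```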
Conclusion
Integrating AI into Java applications no longer requires complex inter-language bridges or external orchestration. With Quarkus and LangChain4j, Java developers can natively build intelligent, responsive, and context-aware systems using familiar programming patterns.
Quarkus brings performance, scalability, and developer productivity, while LangChain4j provides the AI brain—abstracting complex interactions with LLMs into elegant Java interfaces. Together, they enable:
- Rapid prototyping of AI-enhanced microservices
- Real-time conversational applications with memory
- Context-driven summarization and document processing
- On-prem or cloud-based model flexibility (via OpenAI, Ollama, etc.)
By adopting this stack, you can confidently build AI-infused enterprise-grade applications that maintain Java’s reliability and ecosystem maturity, while embracing the intelligence and flexibility of modern language models.
Whether you’re creating customer support assistants, intelligent report generators, or adaptive knowledge systems, Quarkus + LangChain4j forms a future-ready foundation for innovation.