How to use AI Endpoints and LangChain4j


Have a look at our previous blog post Enhance your applications with AI Endpoints

In the world of generative AI with LLMs, LangChain is one of the most famous frameworks used to simplify working with LLMs through API calls.

LangChain’s tools and APIs simplify the process of building LLM-driven applications like chatbots and virtual agents.

LangChain is designed to be used with the Python language.

A similar Java project exists: LangChain4j.

⚠️ Despite the name, the LangChain4j project is not provided by the LangChain team. ⚠️

To simplify the use of LangChain4j, we’ll use it with Quarkus, and especially the quarkus-langchain4j extension.

And, of course, we’ll use our AI Endpoints product to access various LLMs 🤩.

ℹ️ All the source code used in this blog post is available on our GitHub repository: public-cloud-examples ℹ️

Quarkus project creation

First of all, you need to create a Quarkus application through the Quarkus CLI.

$ quarkus create app com.ovhcloud.examples.aiendpoints:quarkus-langchain4j \
                   --extension='quarkus-langchain4j-mistral-ai'

Here is the tree structure after running the previous command:

.
├── .dockerignore
├── .gitignore
├── .mvn
│   └── wrapper
│       ├── .gitignore
│       ├── MavenWrapperDownloader.java
│       └── maven-wrapper.properties
├── README.md
├── mvnw
├── mvnw.cmd
├── pom.xml
└── src
    └── main
        ├── docker
        │   ├── Dockerfile.jvm
        │   ├── Dockerfile.legacy-jar
        │   ├── Dockerfile.native
        │   └── Dockerfile.native-micro
        ├── java
        └── resources
            └── application.properties

Have a look at your pom.xml to see the quarkus-langchain4j extension:

<!-- ... -->
 
<dependency>
  <groupId>io.quarkiverse.langchain4j</groupId>
  <artifactId>quarkus-langchain4j-mistral-ai</artifactId>
  <version>0.10.4</version>
</dependency>
 
<!-- ... -->

AI Service creation

The LLM is used remotely via a LangChain4j AI Service.

Let’s code our service to create a chat bot.

import io.quarkiverse.langchain4j.RegisterAiService;
 
@RegisterAiService
public interface ChatBotService {
   
}

You are ready to customize your prompt!

import dev.langchain4j.service.SystemMessage;
import dev.langchain4j.service.UserMessage;
import io.quarkiverse.langchain4j.RegisterAiService;
 
@RegisterAiService
public interface ChatBotService {
  // Scope / context passed to the LLM
  @SystemMessage("You are a virtual AI assistant.")
  // Prompt (with detailed instructions and variable section) passed to the LLM
  @UserMessage("Answer the following question as best as possible: {question}. The answer must be in the style of a virtual assistant and add some emojis to make the answer more fun.")
  String askAQuestion(String question);
}
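Before the call reaches the model, LangChain4j renders the @UserMessage template by filling the {question} placeholder with the method argument. The snippet below is a minimal, hypothetical sketch of that kind of templating, not the library’s actual renderer:

```java
import java.util.Map;

public class PromptTemplate {
  // Minimal placeholder substitution, mimicking how {question} is filled in a @UserMessage template
  static String render(String template, Map<String, String> variables) {
    String result = template;
    for (Map.Entry<String, String> entry : variables.entrySet()) {
      result = result.replace("{" + entry.getKey() + "}", entry.getValue());
    }
    return result;
  }

  public static void main(String[] args) {
    String template = "Answer the question: {question}";
    System.out.println(render(template, Map.of("question", "What is OVHcloud?")));
    // → Answer the question: What is OVHcloud?
  }
}
```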

If you want more information about the purpose of SystemMessage and UserMessage, you can read the documentation of the Quarkus extension.

Then, you have to configure the quarkus-langchain4j extension to use AI Endpoints.

### Global configurations
# Base URL for Mistral AI endpoints
quarkus.langchain4j.mistralai.base-url=https://mistral-7b-instruct-v02.endpoints.kepler.ai.cloud.ovh.net/api/openai_compat/v1
# Log the requests or not
quarkus.langchain4j.mistralai.log-requests=true
# Log the responses or not
quarkus.langchain4j.mistralai.log-responses=true
# Delay before raising a timeout exception
quarkus.langchain4j.mistralai.timeout=60s
# No key is needed
quarkus.langchain4j.mistralai.api-key=foo
 
# Enable or not the Mistral AI embedding model
quarkus.langchain4j.mistralai.embedding-model.enabled=false
 
### Chat model configurations
# Enable or not the Mistral AI chat model
quarkus.langchain4j.mistralai.chat-model.enabled=true
# Chat model name used
quarkus.langchain4j.mistralai.chat-model.model-name=Mistral-7B-Instruct-v0.2
# Maximum number of tokens to generate
quarkus.langchain4j.mistralai.chat-model.max-tokens=1024
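The chat model’s sampling behaviour can also be tuned. For example, if you want more deterministic answers, you can lower the temperature (property name assumed from the quarkus-langchain4j configuration reference, double-check it against the extension’s documentation):

```properties
### Optional sampling configuration
# Lower temperature means more deterministic answers
quarkus.langchain4j.mistralai.chat-model.temperature=0.3
```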

ℹ️  To know how to use the OVHcloud AI Endpoints product, please read the previous blog post: Enhance your applications with AI Endpoints ℹ️ 

AI chat bot API

Now it’s time to test our AI!

First, add the Quarkus REST extension.

$ quarkus ext add io.quarkus:quarkus-rest
 
Looking for the newly published extensions in registry.quarkus.io
[SUCCESS] ✅  Extension io.quarkus:quarkus-rest has been installed

Let’s develop a small API.

import com.ovhcloud.examples.aiendpoints.services.ChatBotService;
import jakarta.inject.Inject;
import jakarta.ws.rs.POST;
import jakarta.ws.rs.Path;
 
// Endpoint root path
@Path("ovhcloud-ai")    
public class AIEndpointsResource {
  // Inject the AI Service to use it later
  @Inject
  ChatBotService chatBotService;
 
  // Expose the ask resource through the POST method
  @Path("ask")
  @POST
  public String ask(String question) {
    // Call the Mistral AI chat model
    return chatBotService.askAQuestion(question);
  }
}
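Besides curl, you can also call the resource from plain Java. The sketch below uses the JDK’s built-in HttpClient and assumes the Quarkus application is running on localhost:8080 (the AskClient class name is ours, not part of the project):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class AskClient {
  // Build the POST request targeting the ask resource exposed above
  static HttpRequest buildAskRequest(String body) {
    return HttpRequest.newBuilder()
        .uri(URI.create("http://localhost:8080/ovhcloud-ai/ask"))
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build();
  }

  public static void main(String[] args) throws Exception {
    HttpRequest request = buildAskRequest("{\"question\": \"What is OVHcloud?\"}");
    try {
      // Send the request and print the chat bot answer
      HttpResponse<String> response = HttpClient.newHttpClient()
          .send(request, HttpResponse.BodyHandlers.ofString());
      System.out.println(response.body());
    } catch (java.io.IOException e) {
      // The Quarkus application is not running locally
      System.out.println("API not reachable: " + e.getMessage());
    }
  }
}
```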

And now it’s time to test the AI chat bot API!

To run your API, just use the Quarkus dev mode.

$ quarkus dev
 
 --/ __ \/ / / / _ | / _ \/ //_/ / / / __/
 -/ /_/ / /_/ / __ |/ , _/ ,< / /_/ /\ \  
--\___\_\____/_/ |_/_/|_/_/|_|\____/___/  
2024-04-12 08:51:33,515 INFO  [io.quarkus] (Quarkus Main Thread) quarkus-langchain4j 1.0.0-SNAPSHOT on JVM (powered by Quarkus 3.9.3) started in 3.163s. Listening on: http://localhost:8080
 
2024-04-12 08:51:33,517 INFO  [io.quarkus] (Quarkus Main Thread) Profile dev activated. Live Coding activated.
2024-04-12 08:51:33,518 INFO  [io.quarkus] (Quarkus Main Thread) Installed features: [awt, cdi, langchain4j, langchain4j-mistralai, poi, qute, rest, rest-client, rest-client-jackson, smallrye-context-propagation, vertx]
 
--
Tests paused
Press [e] to edit command line args (currently ''), [r] to resume testing, [o] Toggle test output, [:] for the terminal, [h] for more options>

Call the API with a curl command.

$ curl --header "Content-Type: application/json" \
  --request POST \
  --data '{"question": "What is OVHcloud?"}' \
  http://localhost:8080/ovhcloud-ai/ask
 
Answer: «OVHcloud is a global, integrated cloud hosting platform, offering Infrastructure as a Service (IaaS) and Platform as a Service (PaaS). »
 


And that’s it!

In just a few lines of code, you’ve developed your first chat bot using Quarkus, LangChain4j and AI Endpoints.

In the next blog posts, we’ll take a deep dive into the use of LLMs through AI Endpoints!

Don’t hesitate to test our new product, AI Endpoints, and give us your feedback.

You have a dedicated Discord channel (#ai-endpoints) on our Discord server (https://discord.gg/ovhcloud), see you there!


Once a developer, always a developer!
Java developer for many years, I have the joy of knowing JDK 1.1, JEE, Struts, ... and now Spring, Quarkus, (core, boot, batch), Angular, Groovy, Golang, ...
For more than ten years I was a Software Architect, a job that allowed me to face many problems inherent to the complex information systems in large groups.
I also had other lives, notably in automation and delivery with the implementation of CI/CD chains based on Jenkins pipelines.
I particularly like sharing and relationships with developers and I became a Developer Relation at OVHcloud.
This new adventure allows me to continue to use technologies that I like such as Kubernetes or AI for example but also to continue to learn and discover a lot of new things.
All the while keeping in mind one of my main motivations as a Developer Relation: making developers happy.
Always sharing, I am the co-creator of the TADx Meetup in Tours, allowing discovery and sharing around different tech topics.