How to serve LLMs with vLLM and OVHcloud AI Deploy
In this tutorial, we will learn how to serve Large Language Models (LLMs) using vLLM and the OVHcloud AI Products.
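As a sketch of the kind of setup the tutorial walks through, vLLM can expose any supported model behind an OpenAI-compatible HTTP server. The model name and port below are illustrative assumptions, not values from the post:

```shell
# Launch vLLM's OpenAI-compatible server
# (model name and port are illustrative assumptions)
python -m vllm.entrypoints.openai.api_server \
  --model mistralai/Mistral-7B-Instruct-v0.2 \
  --port 8000

# Query it with a standard chat completions request
curl http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "mistralai/Mistral-7B-Instruct-v0.2",
       "messages": [{"role": "user", "content": "Hello!"}]}'
```

Because the server speaks the OpenAI API, any OpenAI-compatible client can talk to it unchanged.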
How to use AI Endpoints and LangChain to create a chatbot

Have a look at our previous blog posts:
– Enhance your applications with AI Endpoints
– How to use AI Endpoints and LangChain4j
– LLMs streaming with AI Endpoints and LangChain4j

In the world of generative AI with LLMs, LangChain is one of the best-known frameworks used to simplify working with LLMs through API calls…
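The core pattern a LangChain-based chatbot automates can be sketched in plain Python: keep the full message history and send it with every turn, so the model sees the whole conversation. Here `call_llm` is a placeholder standing in for the real AI Endpoints API call, not an actual client:

```python
# Sketch of the chat loop a LangChain chatbot automates:
# accumulate the message history and pass it on every request.
# `call_llm` is a placeholder for a real AI Endpoints call.

def call_llm(messages):
    # Placeholder: a real implementation would POST `messages`
    # to an OpenAI-compatible chat completions endpoint.
    return f"(reply to: {messages[-1]['content']})"

class ChatBot:
    def __init__(self, system_prompt):
        # The system prompt anchors the conversation history.
        self.history = [{"role": "system", "content": system_prompt}]

    def ask(self, user_message):
        self.history.append({"role": "user", "content": user_message})
        answer = call_llm(self.history)
        self.history.append({"role": "assistant", "content": answer})
        return answer

bot = ChatBot("You are a helpful assistant.")
print(bot.ask("Hello!"))
```

Keeping the history in one place is what gives the bot conversational memory; frameworks like LangChain wrap exactly this bookkeeping.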
LLMs streaming with AI Endpoints and LangChain4j

Have a look at our previous blog posts:
– Enhance your applications with AI Endpoints
– How to use AI Endpoints and LangChain4j

After explaining how to use AI Endpoints and LangChain4j in the previous post, let's take a look at how to use streaming to create a real chatbot! Create the project…
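The idea behind streaming is that the client consumes the answer token by token as the model produces it, instead of waiting for the full completion. A minimal sketch of the consumer side, with a simulated generator standing in for the real streaming response:

```python
# Streaming pattern: display the answer chunk by chunk as it
# arrives, rather than blocking until the completion is done.
# `stream_tokens` simulates the token stream a real endpoint returns.

def stream_tokens(prompt):
    # Placeholder generator: a real client would yield chunks
    # read from the server's streaming (SSE) response.
    for token in ["Hello", ",", " ", "world", "!"]:
        yield token

def chat_stream(prompt):
    chunks = []
    for token in stream_tokens(prompt):
        print(token, end="", flush=True)  # show each token as it arrives
        chunks.append(token)
    print()
    return "".join(chunks)

answer = chat_stream("Say hello")
```

This is why streaming chatbots feel responsive: the first tokens reach the user long before the full answer exists.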
How to use AI Endpoints and LangChain4j

Have a look at our previous blog post: Enhance your applications with AI Endpoints

In the world of generative AI with LLMs, LangChain is one of the best-known frameworks used to simplify working with LLMs through API calls. LangChain's tools and APIs simplify the process of building LLM-driven applications like chatbots and virtual agents.
Enhance your applications with AI Endpoints

Recently, OVHcloud launched AI Endpoints in its alpha phase. What's AI Endpoints? AI Endpoints is a serverless solution providing AI APIs. The platform is designed to be simple and intuitive, enabling developers, even without AI expertise, to easily use pre-trained and optimized AI models. From today you can use LLMs as a Service! You choose…
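Calling a serverless AI API of this kind typically comes down to one authenticated HTTP request. The sketch below builds such a request with the standard library only; the URL, model name, and environment variable name are illustrative assumptions, not values from the post:

```python
# Minimal sketch of building a chat completion request to a
# serverless LLM API. The endpoint URL, model name, and token
# variable are illustrative assumptions; use the real values
# from your provider's dashboard.
import json
import os
import urllib.request

def build_request(prompt, model="Mixtral-8x7B-Instruct-v0.1"):
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        "https://example-endpoint.example.net/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ.get('AI_ENDPOINTS_TOKEN', '')}",
        },
    )

req = build_request("What is AI Endpoints?")
# urllib.request.urlopen(req) would send it (requires a valid token)
```

No AI expertise is needed on the client side: the request is the same shape whichever pre-trained model sits behind the endpoint.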