Create a code assistant with Continue and AI Endpoints

An AI that helps coders develop

If you want more information on AI Endpoints, please read the following blog post.
You can also have a look at our previous blog posts on how to use AI Endpoints.
Also, be sure to check out this blog post about deploying Coder on the OVHcloud Managed Kubernetes Service.

Many solutions already exist in the field of code assistants.

However, you may want to create your own assistant in order to master your configuration and keep better control over your data.

Continue will help you in this task.

In a nutshell, Continue is an IDE plugin to build your own code assistant.

At the time of writing this blog post, Continue is compatible with VSCode and JetBrains IDEs.

The great strength of Continue is that it lets you use custom LLM endpoints and, yes, AI Endpoints works with Continue 😎.
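Under the hood, Continue talks to AI Endpoints through its OpenAI-compatible API (the openai_compat routes you will see in the configuration below). As a quick sanity check before wiring up the IDE, you can call an endpoint directly. This is a minimal sketch using the openai Python client and the Llama 3.1 endpoint from the configuration below; the OVH_AI_ENDPOINTS_API_KEY environment variable name is an assumption, so adapt it to wherever you store your API key.

# Minimal sanity check: call an AI Endpoints model through its
# OpenAI-compatible API, the same way Continue will.
# Assumes the `openai` Python package (v1+) is installed and that the
# OVH_AI_ENDPOINTS_API_KEY environment variable holds your API key.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://llama-3-1-70b-instruct.endpoints.kepler.ai.cloud.ovh.net/api/openai_compat/v1",
    api_key=os.environ["OVH_AI_ENDPOINTS_API_KEY"],
)

response = client.chat.completions.create(
    model="Meta-Llama-3_1-70B-Instruct",
    messages=[{"role": "user", "content": "Write a hello world in Go."}],
)
print(response.choices[0].message.content)

If this call returns a completion, your API key and endpoint URL are good to go and you can plug the same values into Continue.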

Continue installation

Continue comes as a plugin, so it is very simple to install: just follow the official documentation.

Note that when you install Continue for one IDE, the settings are shared with the other IDEs.

Continue settings with AI Endpoints

Once Continue is installed, you can set it up with AI Endpoints.

Continue has two major configurations: one for the chatbot tool and one for the tab completion tool.

To add these configurations, open the JSON configuration file (config.json, typically located in the ~/.continue folder) and set it up as follows:

{
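  // Model used for the tab completion feature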
  "tabAutocompleteModel": {
    "title": "Mamba-Codestral-7b completion",
    "model": "mamba-codestral-7B-v0.1",
    "apiBase": "https://mamba-codestral-7b-v0-1.endpoints.kepler.ai.cloud.ovh.net/api/openai_compat/v1",
    "provider": "openai",
    "useLegacyCompletionsEndpoint": true,
    "apiKey": "<your API key>"
  },
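  // Models available in the chatbot tool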
  "models": [
    {
      "title": "LLaMa 3.1 70B",
      "model": "Meta-Llama-3_1-70B-Instruct",
      "apiBase": "https://llama-3-1-70b-instruct.endpoints.kepler.ai.cloud.ovh.net/api/openai_compat/v1",
      "provider": "openai",
      "useLegacyCompletionsEndpoint": false,
      "apiKey": "<your API key>"
    },
    {
      "title": "CodeLlama-13b",
      "model": "CodeLlama-13b-Instruct-hf",
      "apiBase": "https://codellama-13b-instruct-hf.endpoints.kepler.ai.cloud.ovh.net/api/openai_compat/v1",
      "provider": "openai",
      "useLegacyCompletionsEndpoint": false,
      "apiKey": "<your API key>"
    },
    {
      "title": "Codestral",
      "model": "mamba-codestral-7B-v0.1",
      "apiBase": "https://mamba-codestral-7b-v0-1.endpoints.kepler.ai.cloud.ovh.net/api/openai_compat/v1",
      "provider": "openai",
      "useLegacyCompletionsEndpoint": false,
      "apiKey": "<your API key>"
    }
  ],
  // ... 
}

As you can see, you can define only one model for the tab completion feature; you can choose either the CodeLlama or the Codestral model from AI Endpoints.

If you want to use CodeLlama, here is the configuration:

{
  "tabAutocompleteModel": {
    "title": "CodeLlama-13b completion",
    "model": "CodeLlama-13b-Instruct-hf",
    "apiBase": "https://codellama-13b-instruct-hf.endpoints.kepler.ai.cloud.ovh.net/api/openai_compat/v1",
    "provider": "openai",
    "useLegacyCompletionsEndpoint": true,
    "apiKey": "<your API key>"
  },
// ...
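Note that tab completion uses the legacy completions route (hence "useLegacyCompletionsEndpoint": true): Continue sends a raw code prompt rather than a chat conversation. Below is a minimal sketch of that kind of request with the openai Python client, assuming the endpoint accepts the legacy /completions route and reusing the hypothetical OVH_AI_ENDPOINTS_API_KEY environment variable from the earlier example.

# Sketch of the kind of request Continue sends for tab completion
# when useLegacyCompletionsEndpoint is true (legacy /completions route).
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://codellama-13b-instruct-hf.endpoints.kepler.ai.cloud.ovh.net/api/openai_compat/v1",
    api_key=os.environ["OVH_AI_ENDPOINTS_API_KEY"],
)

completion = client.completions.create(
    model="CodeLlama-13b-Instruct-hf",
    prompt="def fibonacci(n):",  # the code fragment to complete
    max_tokens=64,
)
print(completion.choices[0].text)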

For the chatbot tool you can set several models; try them and choose the one that fits your needs best 😊.

It’s time to test it!

The chatbot tool:

The tab completion tool:

Don’t hesitate to test our new product, AI Endpoints, and give us your feedback.

You have a dedicated Discord channel (#ai-endpoints) on our Discord server (https://discord.gg/ovhcloud), see you there!


Once a developer, always a developer!
Java developer for many years, I have the joy of knowing JDK 1.1, JEE, Struts, ... and now Spring (core, boot, batch), Quarkus, Angular, Groovy, Golang, ...
For more than ten years I was a Software Architect, a job that allowed me to face many problems inherent to the complex information systems in large groups.
I also had other lives, notably in automation and delivery with the implementation of CI/CD chains based on Jenkins pipelines.
I particularly like sharing and relationships with developers and I became a Developer Relation at OVHcloud.
This new adventure allows me to continue to use technologies that I like such as Kubernetes or AI for example but also to continue to learn and discover a lot of new things.
All the while keeping in mind one of my main motivations as a Developer Relation: making developers happy.
Always sharing, I am the co-creator of the TADx Meetup in Tours, allowing discovery and sharing around different tech topics.