Using Local LLMs with LM Studio

Introduction

BrainSoup supports the integration of local Large Language Models (LLMs) through LM Studio, a free desktop application. This feature is especially beneficial for users who prioritize data privacy and security, or who want to manage costs by running models on their own hardware.

What is LM Studio?

LM Studio is a free, versatile tool that allows you to download, manage, and run local LLMs on your machine. It provides a user-friendly interface for browsing, downloading, and activating models, as well as for running inference locally. By integrating LM Studio with BrainSoup, you can leverage the power of advanced language models while maintaining full control over your data and infrastructure.

Step 1: Downloading and Installing LM Studio

  1. Visit the official LM Studio website to download the latest version of the application for your operating system.
  2. Follow the installation instructions provided on the website to set up LM Studio on your machine.

Step 2: Downloading a Model with LM Studio

  1. From the Search tab in LM Studio, select the desired model from the list of available models and click on the Download button.
  2. Once the download is complete, the model will appear in the Models dropdown menu at the top of the screen.
  3. Select the downloaded model from the Models dropdown menu to activate it.

Tip: Some models are specifically optimized for certain domains or tasks, such as mathematics, programming, medical applications, role-playing, and more. By combining agents with different models, you can create your personalized team of experts.

Step 3: Enable the Local Inference Server in LM Studio

  1. In LM Studio, navigate to the Local Server tab.
  2. Choose a port number for the local server, or leave the default value (1234).
  3. Click on the Start Server button to activate the server.
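
Before moving on, you can check that the server is reachable. LM Studio's local server exposes an OpenAI-compatible API, so a GET request to /v1/models should return the model currently loaded. Here is a minimal Python sketch, assuming the default port 1234 (adjust PORT if you chose a different one):

    import json
    import urllib.request

    # Assumes LM Studio's local server is running on the default port.
    # GET /v1/models lists the model(s) currently loaded in LM Studio.
    PORT = 1234  # replace with the port you selected in LM Studio

    with urllib.request.urlopen(f"http://localhost:{PORT}/v1/models") as response:
        data = json.load(response)

    for model in data.get("data", []):
        print(model["id"])

If the server is running, this prints the identifier of the loaded model; a connection error means the server has not been started or the port is wrong.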

Step 4: Integrating LM Studio with BrainSoup

  1. Navigate to the BrainSoup Settings.
  2. Add "lm-studio" as a Provider in BrainSoup, and enter the following URL in the Server URL field: http://localhost:1234 (replace 1234 with the port number you selected in LM Studio). Leave the API Key field empty.
  3. Add a model for the "lm-studio" provider. You can name it "default", since LM Studio only serves one model at a time. Set a context window size that suits your needs and is supported by the model.
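
Optionally, before wiring this up to agents, you can confirm that inference works end to end by sending a single chat completion to the same server. Below is a minimal Python sketch under the same assumptions (default port, one model loaded); the "default" model name simply mirrors the model entry suggested above, and LM Studio versions that serve a single loaded model ignore it:

    import json
    import urllib.request

    # Assumes the LM Studio server is running on the default port with a
    # model loaded. POST /v1/chat/completions follows the standard
    # OpenAI-compatible chat format.
    PORT = 1234
    payload = {
        "model": "default",  # mirrors the BrainSoup model name chosen above
        "messages": [{"role": "user", "content": "Say hello in one sentence."}],
        "temperature": 0.7,
    }

    request = urllib.request.Request(
        f"http://localhost:{PORT}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        result = json.load(response)

    print(result["choices"][0]["message"]["content"])

If a reply prints here, BrainSoup should be able to reach the same endpoint with the settings above.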

Step 5: Getting Started with Local LLMs in BrainSoup

The local model managed by LM Studio is now accessible within BrainSoup, and you can select it for your agents in their respective settings. To do so, follow these steps:

  1. Open the agent settings by double-clicking on the agent's name in the left pane.
  2. In the AI settings section, select the model named lm-studio/default from the dropdown list.

Conclusion

Integrating local LLMs via LM Studio gives you full control over your data privacy and computational resources. With this setup, you can harness the capabilities of advanced language models while retaining complete ownership of your data and infrastructure.

Note: Most LM Studio models don't support function calling and are not multimodal, but your agents can still use tools, see images, and listen to audio thanks to BrainSoup's ability to delegate these tasks to a more capable LLM when needed. This multi-LLM cooperation is the cornerstone of BrainSoup, allowing you to leverage the strengths of different models without being limited by their individual capabilities.