Can BrainSoup use local LLMs?

Yes, there are two ways to use local Large Language Models (LLMs) in BrainSoup:

  • Ollama integration: BrainSoup supports local LLMs through Ollama, an open-source application. For this, you need an Ollama instance running on your local machine (or on another machine reachable from your local network). If the Ollama instance runs on the same machine as BrainSoup, BrainSoup detects it automatically. Otherwise, enter the IP address and port of the Ollama instance in the AI providers section of BrainSoup's Settings screen (see the first sketch after this list). Once connected, you can select the local LLM when creating or editing an agent.
  • Third-party LLM integration: BrainSoup can connect to any AI provider that exposes an OpenAI-compatible API. You can use this feature to connect to programs such as LM Studio or Jan, which let you download, manage, and run local LLMs on your machine (see the second sketch after this list).
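
Before pointing BrainSoup at a remote Ollama instance, it can help to confirm that the instance is reachable from your machine. The following is a minimal sketch that lists the models installed on an Ollama server via its REST API; the host address is an assumption (replace it with your own), and 11434 is Ollama's default port.

```python
# Minimal sketch: check that a remote Ollama instance is reachable and
# list its installed models via Ollama's REST API (GET /api/tags).
import json
import urllib.request

OLLAMA_HOST = "192.168.1.50"  # hypothetical LAN address of the Ollama machine
OLLAMA_PORT = 11434           # Ollama's default API port

url = f"http://{OLLAMA_HOST}:{OLLAMA_PORT}/api/tags"  # lists installed models

with urllib.request.urlopen(url, timeout=5) as response:
    data = json.load(response)

# Each entry in "models" describes one locally installed model.
for model in data.get("models", []):
    print(model["name"])
```

If this prints your model names, the same host and port should work in the AI providers section of BrainSoup's Settings screen.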
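
To see what "OpenAI-compatible API" means in practice, here is a minimal sketch that sends a chat request to a local server the same way any OpenAI-style client would. The base URL and model name are assumptions: LM Studio's built-in server listens on port 1234 by default, so adjust the URL for Jan or any other provider, and use the name of a model you have actually loaded.

```python
# Minimal sketch: query an OpenAI-compatible local server (e.g. LM Studio's
# built-in server) using the standard /v1/chat/completions endpoint.
import json
import urllib.request

BASE_URL = "http://localhost:1234/v1"   # hypothetical local server address

payload = {
    "model": "local-model",             # placeholder; use a model you loaded
    "messages": [{"role": "user", "content": "Say hello in five words."}],
}
request = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request, timeout=30) as response:
    reply = json.load(response)

# Standard OpenAI-style response shape: choices[0].message.content
print(reply["choices"][0]["message"]["content"])
```

Any server that answers this request correctly should also work as a third-party provider in BrainSoup.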

Note: Agents based on local LLMs do not consume BrainSoup credits, making them a secure and cost-effective option for your AI needs.