Integrating Third-Party LLMs into BrainSoup


In the rapidly evolving landscape of artificial intelligence, Large Language Models (LLMs) are at the forefront of enabling sophisticated natural language processing capabilities. BrainSoup, our versatile AI collaboration platform, extends its functionality by supporting integration with third-party LLMs. This feature empowers users to leverage the unique strengths of various AI models available in the market, ensuring that BrainSoup remains adaptable to your needs.

Why Integrate Third-Party LLMs?

Integrating third-party LLMs into BrainSoup allows users to:

  • Access a broader range of AI capabilities beyond the built-in models.
  • Customize the AI experience according to specific project requirements or preferences.
  • Take advantage of the latest advancements in AI without waiting for platform updates.

How to Add External LLMs to BrainSoup

Adding an external LLM to BrainSoup involves a few straightforward steps:

Step 1: Signing Up with an AI Provider

Choose an AI provider (e.g., Hugging Face, Groq, OpenRouter) that offers an API compatible with the OpenAI API standard. Create an account and obtain your API key.

Tip: Some providers may offer free tiers or trial periods for testing.

Step 2: Obtaining the API URL

Refer to your chosen provider's documentation to find the API endpoint URL. Some providers may offer multiple endpoints for different protocols, so ensure you select the OpenAI-compatible one.
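If you want to confirm that an endpoint is OpenAI-compatible before entering it into BrainSoup, most compatible providers expose a model-listing route at /models under the base URL. The sketch below, using only the Python standard library, builds such a request; the base URL and API key are placeholders, not a real provider:

```python
import json
import urllib.request

def models_url(base_url: str) -> str:
    """Build the OpenAI-style model-listing URL from an API base URL."""
    return base_url.rstrip("/") + "/models"

def build_models_request(base_url: str, api_key: str) -> urllib.request.Request:
    """Prepare (but do not send) an authenticated request to list models."""
    return urllib.request.Request(
        models_url(base_url),
        headers={"Authorization": f"Bearer {api_key}"},
    )

if __name__ == "__main__":
    # Placeholder values -- substitute your provider's base URL and your key.
    req = build_models_request("https://api.example.com/v1", "sk-...")
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    # An OpenAI-compatible endpoint returns {"object": "list", "data": [...]}.
    print([m["id"] for m in data.get("data", [])])
```

If the request succeeds and returns a list of model objects, the URL is the one to paste into BrainSoup.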

Step 3: Navigating to BrainSoup Settings

Navigate to the AI providers section within BrainSoup's Settings screen.

Step 4: Adding a New Provider

Click on the Add provider button (+). Enter the API URL and your API key in the respective fields.

Step 5: Registering an LLM

To add a model, click on the Add model button (+) next to the newly added provider. Input the model ID as recognized by your provider, then fill in details such as Maximum context window size and Maximum output size based on the provider's guidance. Refer to the provider's documentation for model-specific details.
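To see how these settings relate to the underlying API, here is a minimal sketch of an OpenAI-style chat-completion payload. The field names follow the OpenAI convention that compatible providers adopt; the model ID shown is a placeholder, and the max_tokens value stands in for the Maximum output size you configure in BrainSoup:

```python
def build_chat_request(model_id: str, prompt: str, max_tokens: int) -> dict:
    """Assemble an OpenAI-style chat-completion payload.

    `model_id` must match the ID your provider recognizes, and
    `max_tokens` should not exceed the model's maximum output size.
    """
    return {
        "model": model_id,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

# Placeholder model ID -- check your provider's documentation for real IDs.
payload = build_chat_request("provider/some-model", "Hello!", max_tokens=1024)
```

Entering the correct model ID and size limits in BrainSoup ensures the requests it sends on your behalf stay within the model's actual capabilities.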

Step 6: Adding More LLMs

If you wish to integrate more than one model from the provider, repeat step 5 for each model.

Step 7: Using the External LLM

Your external LLM is now integrated and available for selection under the Language model parameter within any agent's configuration settings.


By embracing third-party LLMs, BrainSoup users can significantly enhance their AI-driven projects with minimal effort. This flexibility not only ensures that BrainSoup adapts to changing technological landscapes but also empowers users with choice and customization options for their AI needs. Whether it's tapping into specialized models for unique tasks or leveraging cost-effective solutions, integrating external LLMs opens up new possibilities for innovation within BrainSoup.