Integrating Groq LLMs into BrainSoup

Introduction

Groq is a company that has developed a Language Processing Unit (LPU) Inference Engine capable of running large language models at exceptional speed. In this tutorial, we will guide you through integrating Groq's Large Language Models (LLMs) into BrainSoup, giving you access to some of the most advanced open-source models available, for free (with some limitations), and with among the fastest inference speeds in the industry.

Note: For more general information on integrating third-party LLMs into BrainSoup, please refer to our help article.

Step 1: Create Your Groq Account

First, navigate to Groq's Console and sign up for an account. This will be your gateway to accessing Groq's powerful LLMs. By default, Groq offers a free tier that allows you to use the models with some rate limitations.

Step 2: Generate Your API Key

Once your account is set up, go to the API Keys section of the console and click "Create API Key". Keep this key safe; it is your access pass for integrating Groq LLMs into BrainSoup.
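Because the key is simply a bearer token for Groq's OpenAI-compatible REST API, you can sanity-check it from the command line before configuring BrainSoup. The sketch below uses only the Python standard library; the helper names are ours, and it assumes your key is stored in a GROQ_API_KEY environment variable:

```python
import json
import os
import urllib.request

# Groq's OpenAI-compatible API base URL (same host as the endpoint used in Step 4).
GROQ_API_BASE = "https://api.groq.com/openai/v1"

def build_auth_headers(api_key: str) -> dict:
    """Bearer-token headers expected by the Groq API."""
    return {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }

def list_models(api_key: str) -> list:
    """Return the IDs of the models your key can access."""
    req = urllib.request.Request(
        f"{GROQ_API_BASE}/models",
        headers=build_auth_headers(api_key),
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return sorted(m["id"] for m in data["data"])

if __name__ == "__main__" and "GROQ_API_KEY" in os.environ:
    # Keep the key in an environment variable rather than in your code.
    print(list_models(os.environ["GROQ_API_KEY"]))
```

If the call returns a list of model IDs, your key is valid and ready to use in BrainSoup.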

Step 3: Navigate to the BrainSoup Settings

Go to the Settings screen in BrainSoup and locate the AI providers section.

Step 4: Add Groq as a Provider in BrainSoup

Click the Add provider button (+) and enter the following API endpoint: https://api.groq.com/openai. This step establishes the connection between BrainSoup and Groq's services. You can name the provider "groq" for easy identification.
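Behind this endpoint, requests follow the standard OpenAI chat-completions format. As a rough sketch of the payload BrainSoup sends once the provider is configured (the helper name is ours; the model ID is one example from Groq's catalog):

```python
import json

def build_chat_request(model: str, user_message: str) -> dict:
    """Build an OpenAI-compatible chat-completion payload.

    This JSON body is POSTed to <endpoint>/v1/chat/completions
    with the bearer-token headers from Step 2.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }

payload = build_chat_request("llama3-8b-8192", "Hello from BrainSoup!")
print(json.dumps(payload, indent=2))
```

Because the format is OpenAI-compatible, any tool that speaks the OpenAI API (BrainSoup included) can use Groq by swapping only the base URL and API key.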

Step 5: Select Your Preferred Models

With Groq, you have access to a variety of models tailored to different needs. Visit Groq's Model Documentation to explore available models.

As of 24th April 2024, the following models are available:

  • llama3-8b-8192: context window of 8192 tokens.
  • llama3-70b-8192: context window of 8192 tokens.
  • llama2-70b-4096: context window of 4096 tokens.
  • mixtral-8x7b-32768: context window of 32768 tokens.
  • gemma-7b-it: context window of 8192 tokens.
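The list above can be captured as a simple lookup, handy when filling in BrainSoup's "Max context window size" field in the next step (values as of 24th April 2024; check Groq's Model Documentation for the current list):

```python
# Groq model IDs mapped to their context windows, per the list above.
GROQ_MODELS = {
    "llama3-8b-8192": 8192,
    "llama3-70b-8192": 8192,
    "llama2-70b-4096": 4096,
    "mixtral-8x7b-32768": 32768,
    "gemma-7b-it": 8192,
}

def context_window(model_id: str) -> int:
    """Look up a model's context window, failing loudly on unknown IDs."""
    try:
        return GROQ_MODELS[model_id]
    except KeyError:
        raise ValueError(f"Unknown Groq model: {model_id}")
```

Note that the context window is encoded in each model ID's suffix, so the ID alone tells you what to enter in BrainSoup.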

Step 6: Integrate Selected Models into BrainSoup

Finally, add these models to BrainSoup by clicking the Add model button (+) next to the Groq provider. For each model, fill in its Model ID and Max context window size as listed above. The selected models then become available in the settings of your BrainSoup agents.

Conclusion

Integrating Groq LLMs into BrainSoup opens up a realm of possibilities for users seeking advanced AI capabilities without compromising on speed or efficiency. By following these straightforward steps, you can easily tap into the power of Groq’s LPU-driven models, making your AI tasks faster and more responsive than ever before. Embrace this integration to elevate your productivity levels and push the boundaries of what you can achieve with AI in BrainSoup.