
Other models

To configure the use of other models, these are the usual steps, with examples for Anthropic and Gemini:

  1. you need to create a Brevia project; see Setup for more details
  2. you need to install the LangChain integration package that provides the model you want to use; see the LangChain documentation for the Anthropic and Gemini integrations (install commands are sketched after this list)
    1. for Anthropic you can install the library with poetry add langchain-anthropic or pip install langchain-anthropic
    2. for Gemini you can install the library with poetry add langchain-google-genai or pip install langchain-google-genai
  3. you will probably need a token or API key from the service, made available as an environment variable, optionally through the BREVIA_ENV_SECRETS configuration (see the example after this list)
    1. for Anthropic you will need the ANTHROPIC_API_KEY environment variable, for Gemini the GOOGLE_API_KEY environment variable
  4. you need to configure the desired model through a JSON object with key-value pairs, including _type with the fully qualified name of the LangChain integration class (configuration examples are sketched after this list)
    1. for Anthropic you can use langchain_anthropic.chat_models.ChatAnthropic for chat models; Anthropic does not currently provide an embeddings model
    2. for Gemini you can use langchain_google_genai.chat_models.ChatGoogleGenerativeAI for chat models or langchain_google_genai.embeddings.GoogleGenerativeAIEmbeddings as embeddings engine; see the LangChain reference of each class for the additional configuration options
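
A minimal install sketch for step 2, using the commands mentioned above (pick the dependency manager your project already uses):

```bash
# Anthropic integration
poetry add langchain-anthropic        # or: pip install langchain-anthropic

# Gemini integration
poetry add langchain-google-genai     # or: pip install langchain-google-genai
```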
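
For step 3, a sketch of the environment variables, assuming they are set in the project's .env file; the values shown are placeholders:

```bash
# Anthropic API key, read by langchain-anthropic
ANTHROPIC_API_KEY=your-anthropic-api-key

# Gemini API key, read by langchain-google-genai
GOOGLE_API_KEY=your-google-api-key
```

The same variables can optionally be provided through the BREVIA_ENV_SECRETS configuration instead of plain environment variables.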
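
For step 4, a hedged sketch of the JSON model configuration: _type holds the fully qualified class name, and the remaining keys are assumed to be passed as constructor parameters of that class; the model names are illustrative, not the only valid values. An Anthropic chat model could look like this:

```json
{
    "_type": "langchain_anthropic.chat_models.ChatAnthropic",
    "model": "claude-3-5-sonnet-latest",
    "temperature": 0
}
```

And for Gemini, a chat model and an embeddings engine:

```json
{
    "_type": "langchain_google_genai.chat_models.ChatGoogleGenerativeAI",
    "model": "gemini-1.5-flash",
    "temperature": 0
}
```

```json
{
    "_type": "langchain_google_genai.embeddings.GoogleGenerativeAIEmbeddings",
    "model": "models/text-embedding-004"
}
```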