Other models

These are the usual steps to configure other models, with examples for Anthropic and Gemini:

  1. you need to create a Brevia project; see Setup for more details
  2. you need to install the LangChain library that provides the model you want to use; for Anthropic see here, for Gemini see here (the install commands are also collected in the first block after this list)
    1. for Anthropic you can install the library with poetry add langchain-anthropic or pip install langchain-anthropic
    2. for Gemini you can install the library with poetry add langchain-google-genai or pip install langchain-google-genai
  3. you will probably need a token or API key from the service, made available as an environment variable, optionally through the BREVIA_ENV_SECRETS configuration (see the environment example after this list)
    1. for Anthropic you will need the ANTHROPIC_API_KEY environment variable, for Gemini the GOOGLE_API_KEY environment variable
  4. you need to configure the desired model through a JSON object with key-value pairs, as described in the Model Configuration formats section (example configurations follow this list)
    1. for Anthropic you can use anthropic as model_provider; have a look at the ChatAnthropic class for other chat model parameters (Anthropic does not provide embeddings for now)
    2. for Gemini you can use google_genai as model_provider; have a look at the ChatGoogleGenerativeAI class for other chat model parameters. You can also use langchain_google_genai.embeddings.GoogleGenerativeAIEmbeddings as the embeddings engine; see the provided links for more details on the additional configuration options
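
The install commands from step 2, collected for convenience:

```bash
# Anthropic integration
poetry add langchain-anthropic        # or: pip install langchain-anthropic

# Gemini integration
poetry add langchain-google-genai     # or: pip install langchain-google-genai
```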
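
A minimal sketch of the environment setup from step 3, assuming a .env file; the BREVIA_ENV_SECRETS line uses an assumed JSON-map syntax, so check the Brevia configuration documentation for the exact format:

```bash
# API keys as plain environment variables
ANTHROPIC_API_KEY=<your-anthropic-key>
GOOGLE_API_KEY=<your-google-key>

# Alternative (assumed format): pass the same keys through BREVIA_ENV_SECRETS
BREVIA_ENV_SECRETS='{"ANTHROPIC_API_KEY": "<your-anthropic-key>", "GOOGLE_API_KEY": "<your-google-key>"}'
```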
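
A sketch of the JSON configuration from step 4 for an Anthropic chat model. The model_provider value comes from this page; model and temperature are ChatAnthropic constructor parameters, and the model name is only a placeholder example, so check the Model Configuration formats section for the exact set of accepted keys:

```json
{
  "model_provider": "anthropic",
  "model": "claude-3-5-haiku-latest",
  "temperature": 0
}
```

And an equivalent sketch for a Gemini chat model, using google_genai as model_provider and ChatGoogleGenerativeAI parameters (again, the model name is only an example):

```json
{
  "model_provider": "google_genai",
  "model": "gemini-2.0-flash",
  "temperature": 0
}
```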