Other models
To configure the use of other models, these are the usual steps, with examples for Anthropic and Gemini:
- you need to create a Brevia project, see Setup for more details
- you need to install the LangChain library that provides the model you want to use, for Anthropic see here, for Gemini see here
    - for Anthropic you can install the library with `poetry add langchain-anthropic` or `pip install langchain-anthropic`
    - for Gemini you can install the library with `poetry add langchain-google-genai` or `pip install langchain-google-genai`
- you will probably need a token or API key from the service, made available as an environment variable, optionally using the `BREVIA_ENV_SECRETS` configuration (see the sketch after this list)
    - for Anthropic you will need the `ANTHROPIC_API_KEY` environment variable, for Gemini the `GOOGLE_API_KEY` environment variable
- you need to configure the desired model through a JSON object with key-value pairs as described in the Model Configuration formats section (example configurations are sketched after this list)
    - for Anthropic you can use `anthropic` as `model_provider`, have a look at the `ChatAnthropic` class for other chat model parameters - Anthropic does not provide embeddings for now
    - for Gemini you can use `google_genai` as `model_provider`, have a look at the `ChatGoogleGenerativeAI` class for other chat model parameters; you can also use `langchain_google_genai.embeddings.GoogleGenerativeAIEmbeddings` for the embeddings engine
- look at the provided links for more details on the additional configuration options
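As a minimal sketch of how the API keys could be exposed, assuming `BREVIA_ENV_SECRETS` accepts a JSON object mapping variable names to values (check the Setup and configuration documentation for the exact format), the keys could be provided like this; the values shown are placeholders:

```json
{
    "ANTHROPIC_API_KEY": "sk-ant-your-key-here",
    "GOOGLE_API_KEY": "your-google-api-key-here"
}
```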
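Below is a hedged sketch of the model configuration JSON for the two providers. Only the `model_provider` values come from the steps above; the `model` key and the specific model names are illustrative assumptions, so check the Model Configuration formats section for the full set of supported keys. For Anthropic:

```json
{
    "model_provider": "anthropic",
    "model": "claude-3-5-sonnet-latest"
}
```

and for Gemini:

```json
{
    "model_provider": "google_genai",
    "model": "gemini-1.5-flash"
}
```

Additional key-value pairs (for example `temperature` or `max_tokens`) would follow the chat model parameters of the `ChatAnthropic` and `ChatGoogleGenerativeAI` classes mentioned above.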