Other models
To configure the use of other models, these are the usual steps, with examples for Google Gemini:

- you need to create a Brevia project, see Setup for more details
- you need to install the LangChain library that provides the model you want to use, for Gemini see here
    - for Gemini you can install the library with `poetry add langchain-google-genai` or `pip install langchain-google-genai`
- you will probably need a token or API key from the service, made available as an environment variable, optionally using the `BREVIA_ENV_SECRETS` configuration
    - for Gemini this is the `GOOGLE_API_KEY` environment variable
- you need to configure the desired model through a JSON object with key-value pairs as described in the Model Configuration formats section
    - for Gemini you can use `google_genai` as `model_provider`; have a look at the `ChatGoogleGenerativeAI` class for other chat model parameters, and you can use `langchain_google_genai.embeddings.GoogleGenerativeAIEmbeddings` for the embeddings engine - look at the provided links for more details on the additional configuration options
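
Putting the last step together, a model configuration for Gemini might look like the following JSON sketch. The `google_genai` provider value comes from the steps above; the `gemini-2.0-flash` model name and the exact key names are assumptions here, so check the Model Configuration formats section for the authoritative schema:

```json
{
    "model_provider": "google_genai",
    "model": "gemini-2.0-flash"
}
```

Any extra key-value pairs in this object are typically passed through as parameters of the underlying chat model class, which is why the `ChatGoogleGenerativeAI` reference is the place to look for the available options.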