Other models
To configure the use of other models, these are the usual steps, with examples for Google Gemini:
- you need to create a Brevia project, see Setup for more details
- you need to install the LangChain library that provides the model you want to use, for Gemini see here
    - for Gemini you can install the library with `poetry add langchain-google-genai` or `pip install langchain-google-genai`
- you will probably need a token or API key from the service, made available as an environment variable, optionally using the `BREVIA_ENV_SECRETS` configuration
    - for Gemini the `GOOGLE_API_KEY` environment variable
- you need to configure the desired model through a JSON object with key-value pairs as described in the Model Configuration formats section
    - for Gemini you can use `google_genai` as `model_provider`; have a look at the `ChatGoogleGenerativeAI` class for other chat model parameters; you can also use `langchain_google_genai.embeddings.GoogleGenerativeAIEmbeddings` for the embeddings engine - look at the provided links for more details on the additional configuration options
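As a sketch of the last step, a Gemini model configuration could look like the JSON object below. This assumes the key-value format described in the Model Configuration formats section, with `model_provider` and `model` keys; the `gemini-2.0-flash` model name is only an illustrative example, and any extra keys (such as `temperature`) are assumed to be passed through as `ChatGoogleGenerativeAI` parameters:

```json
{
    "model_provider": "google_genai",
    "model": "gemini-2.0-flash",
    "temperature": 0
}
```

With `GOOGLE_API_KEY` set in the environment (or via `BREVIA_ENV_SECRETS`), the chat model should pick up the credentials without any key appearing in the configuration itself.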