Add configuration to choose default llm models (#5245)

### What problem does this PR solve?

This pull request updates `api/settings.py` and
`docker/service_conf.yaml.template` to add support for default
models in the LLM configuration (especially for LIGHTEN builds). The main
changes add default model configurations and update the initialization
settings to use these defaults.

For example, with the configuration below, Bedrock is enabled by default
with a Claude chat model and Titan embeddings:

```yaml
user_default_llm:
  factory: 'Bedrock'
  api_key: '{}' 
  base_url: ''
  default_models:
    chat_model: 'anthropic.claude-3-5-sonnet-20240620-v1:0'
    embedding_model: 'amazon.titan-embed-text-v2:0'
    rerank_model: ''
    asr_model: ''
    image2text_model: ''
```
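
On the `api/settings.py` side, initialization now reads these defaults when the section is present. Below is a minimal sketch of that fallback logic, not the exact code in this PR; the inline dict stands in for the parsed `user_default_llm` section, and the `*_MDL` names are illustrative settings variables:

```python
# Illustrative sketch of the default-model fallback (hypothetical names,
# not the exact code in api/settings.py): take per-type model names from
# user_default_llm.default_models when set, otherwise fall back to ''.
user_default_llm = {
    "factory": "Bedrock",
    "api_key": "{}",
    "base_url": "",
    "default_models": {
        "chat_model": "anthropic.claude-3-5-sonnet-20240620-v1:0",
        "embedding_model": "amazon.titan-embed-text-v2:0",
        "rerank_model": "",
        "asr_model": "",
        "image2text_model": "",
    },
}

default_models = user_default_llm.get("default_models", {})
CHAT_MDL = default_models.get("chat_model", "")
EMBEDDING_MDL = default_models.get("embedding_model", "")
RERANK_MDL = default_models.get("rerank_model", "")
ASR_MDL = default_models.get("asr_model", "")
IMAGE2TEXT_MDL = default_models.get("image2text_model", "")
```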


### Type of change

- [X] New Feature (non-breaking change which adds functionality)

---------

Co-authored-by: Kevin Hu <kevinhu.sh@gmail.com>
Relevant hunk from `docker/service_conf.yaml.template`:

```diff
@@ -52,6 +52,12 @@ redis:
 #   factory: 'Tongyi-Qianwen'
 #   api_key: 'sk-xxxxxxxxxxxxx'
 #   base_url: ''
+#   default_models:
+#     chat_model: 'qwen-plus'
+#     embedding_model: 'BAAI/bge-large-zh-v1.5@BAAI'
+#     rerank_model: ''
+#     asr_model: ''
+#     image2text_model: ''
 # oauth:
 #   github:
 #     client_id: xxxxxxxxxxxxxxxxxxxxxxxxx
```