Fix: add multimodal models in chat api (#11986)

…tant, but model is available via UI

Fix: add multimodal models in chat api
Fixes #8549

### What problem does this PR solve?

Adds a `model_type` parameter to the chat API.
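
To illustrate the new parameter, here is a hedged sketch of a create-chat-assistant request. The server address, API key, and model name are placeholders, and the `/api/v1/chats` path is an assumption based on the chat-assistant API reference this diff touches:

```bash
# Hedged sketch: create a chat assistant backed by a multimodal model.
# <your-server>, <YOUR_API_KEY>, and <your-multimodal-model> are placeholders;
# the /api/v1/chats path is assumed from the chat-assistant API reference
# edited in this PR.
curl --request POST \
     --url http://<your-server>/api/v1/chats \
     --header 'Content-Type: application/json' \
     --header 'Authorization: Bearer <YOUR_API_KEY>' \
     --data '{
          "name": "vision_assistant",
          "llm": {
               "model_name": "<your-multimodal-model>",
               "model_type": "image2text",
               "temperature": 0.1
          }
     }'
```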


### Type of change

- [x] Bug Fix (non-breaking change which fixes an issue)

---------

Co-authored-by: writinwaters <93570324+writinwaters@users.noreply.github.com>
Magicbook1108
2025-12-17 15:46:43 +08:00
committed by GitHub
parent 82d4e5fb87
commit 4fd4a41e7c
2 changed files with 16 additions and 1 deletion


@@ -2529,6 +2529,19 @@ curl --request POST \
The LLM settings for the chat assistant to create. If it is not explicitly set, a JSON object with the following values will be generated as the default. An `llm` JSON object contains the following attributes:
- `"model_name"`, `string`
The chat model name. If not set, the user's default chat model will be used.
:::caution WARNING
`model_type` is an *internal* parameter, introduced solely as a temporary workaround for current model-configuration design limitations.
Its main purpose is to let *multimodal* models (stored in the database as `"image2text"`) pass backend validation and dispatching. Be mindful that:
- Do *not* treat it as a stable public API.
- It is subject to change or removal in future releases.
:::
- `"model_type"`: `string`
The model type specifier. Only `"chat"` and `"image2text"` are recognized; any other value, or an omitted value, is treated as `"chat"`.
- `"model_name"`, `string`
- `"temperature"`: `float`
Controls the randomness of the model's predictions. A lower temperature results in more conservative responses, while a higher temperature yields more creative and diverse responses. Defaults to `0.1`.
- `"top_p"`: `float`