Mirror of https://github.com/infiniflow/ragflow.git (synced 2025-12-24 07:26:47 +08:00)
Add AI Badgr as OpenAI-compatible chat model provider (#12018)
## What problem does this PR solve?

Adds AI Badgr as an optional LLM provider in RAGFlow. Users can use AI Badgr for chat completions and embeddings via its OpenAI-compatible API.

**Background:**

- AI Badgr provides OpenAI-compatible endpoints (`/v1/chat/completions`, `/v1/embeddings`, `/v1/models`)
- Previously, RAGFlow didn't support AI Badgr
- This PR adds support following the existing provider pattern (e.g., CometAPI, DeerAPI)

**Implementation details:**

- Added AI Badgr to the provider registry and configuration
- Supports chat completions (via LiteLLMBase) and embeddings (via AIBadgrEmbed)
- Uses standard API key authentication
- Base URL: `https://aibadgr.com/api/v1`
- Environment variables: `AIBADGR_API_KEY`, `AIBADGR_BASE_URL` (optional)

## Type of change

- [x] New Feature (non-breaking change which adds functionality)

This is a new feature that adds support for a new provider without changing existing functionality.

---------

Co-authored-by: michaelmanley <55236695+michaelbrinkworth@users.noreply.github.com>
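**Example usage (for reference):** the sketch below shows the kind of OpenAI-compatible call the new provider relies on, using the `openai` Python SDK together with the base URL and environment variables listed above. It is a minimal illustration only: the model ID `badgr-chat` is a placeholder rather than a confirmed AI Badgr model name, and the snippet does not go through RAGFlow's `LiteLLMBase`/`AIBadgrEmbed` wiring.

```python
# Minimal sketch: call AI Badgr's OpenAI-compatible chat endpoint directly.
# Assumes the `openai` package (v1+) is installed and AIBADGR_API_KEY is set;
# "badgr-chat" is a placeholder model ID -- use one returned by /v1/models.
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["AIBADGR_API_KEY"],
    base_url=os.environ.get("AIBADGR_BASE_URL", "https://aibadgr.com/api/v1"),
)

response = client.chat.completions.create(
    model="badgr-chat",  # placeholder -- substitute a real AI Badgr model ID
    messages=[{"role": "user", "content": "Say hello from RAGFlow."}],
)
print(response.choices[0].message.content)
```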
@@ -77,6 +77,19 @@ A complete list of models supported by RAGFlow, which will continue to expand.
If your model is not listed here but has APIs compatible with those of OpenAI, click **OpenAI-API-Compatible** on the **Model providers** page to configure your model.
:::
## Example: AI Badgr (OpenAI-compatible)
You can use **AI Badgr** with RAGFlow via the existing OpenAI-API-Compatible provider.

To configure AI Badgr:
- **Provider**: `OpenAI-API-Compatible`
- **Base URL**: `https://aibadgr.com/api/v1`
- **API Key**: your AI Badgr API key (from the AI Badgr dashboard)
- **Model**: any AI Badgr chat or embedding model ID, as exposed by AI Badgr's OpenAI-compatible APIs (see the sketch below for listing the available IDs)

AI Badgr implements OpenAI-compatible endpoints for `/v1/chat/completions`, `/v1/embeddings`, and `/v1/models`, so no additional code changes in RAGFlow are required.
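As a quick sanity check, you can exercise those endpoints directly before filling in the fields above. The following is a minimal sketch using the `openai` Python SDK; `YOUR_AIBADGR_API_KEY` and the embedding model ID are placeholders, not confirmed AI Badgr values.

```python
# Sketch: list available AI Badgr model IDs and request a test embedding.
# Assumes the `openai` package (v1+) is installed; replace the placeholder key,
# and pick real model IDs from the /v1/models response.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_AIBADGR_API_KEY",  # placeholder -- use your AI Badgr key
    base_url="https://aibadgr.com/api/v1",
)

# /v1/models: discover valid chat and embedding model IDs.
for model in client.models.list():
    print(model.id)

# /v1/embeddings: embed a test string with a model ID taken from the list above.
embedding = client.embeddings.create(
    model="your-embedding-model-id",  # placeholder
    input="RAGFlow connectivity test",
)
print(len(embedding.data[0].embedding))
```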
:::note
The list of supported models is extracted from [this source](https://github.com/infiniflow/ragflow/blob/main/rag/llm/__init__.py) and may not be the most current. For the latest supported model list, please refer to the Python file.
:::