Adding the Minimax model (#1009)

### What problem does this PR solve?

Adds support for the MiniMax LLM.

### Type of change

- [x] New Feature (non-breaking change which adds functionality)

---------

Co-authored-by: cecilia-uu <konghui1996@163.com>
Committed by cecilia-uu on 2024-05-31 16:38:53 +08:00 (via GitHub)
parent 5d2f7136dd · commit 260c68f60c
3 changed files with 58 additions and 1 deletion


@@ -464,3 +464,11 @@ class VolcEngineChat(Base):
except Exception as e:
yield ans + "\n**ERROR**: " + str(e)
yield tk_count
class MiniMaxChat(Base):
    def __init__(self, key, model_name="abab6.5s-chat",
                 base_url="https://api.minimax.chat/v1/text/chatcompletion_v2"):
        if not base_url:
            base_url = "https://api.minimax.chat/v1/text/chatcompletion_v2"
        super().__init__(key, model_name, base_url)
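
The diff above shows the whole change: `MiniMaxChat` reuses the shared `Base` chat client and only pins the default model name and endpoint, falling back to the official MiniMax URL when the caller passes an empty `base_url`. The sketch below illustrates that fallback pattern in isolation; the `Base` stand-in here is an assumption for demonstration (the real `Base` in this repo wraps an OpenAI-compatible chat client).

```python
# Minimal sketch of the MiniMaxChat constructor-fallback pattern.
# NOTE: this `Base` is a hypothetical stand-in, not the repo's actual class.

DEFAULT_MINIMAX_URL = "https://api.minimax.chat/v1/text/chatcompletion_v2"


class Base:
    def __init__(self, key, model_name, base_url):
        self.key = key
        self.model_name = model_name
        self.base_url = base_url


class MiniMaxChat(Base):
    def __init__(self, key, model_name="abab6.5s-chat",
                 base_url=DEFAULT_MINIMAX_URL):
        # Callers may pass base_url=None or ""; both fall back to the
        # official MiniMax chat-completion endpoint.
        if not base_url:
            base_url = DEFAULT_MINIMAX_URL
        super().__init__(key, model_name, base_url)


# Even an explicit None resolves to the default endpoint:
chat = MiniMaxChat("sk-demo-key", base_url=None)
```

The `if not base_url` guard matters because a factory that builds chat clients generically may pass `base_url=None` for providers without a custom endpoint; without the guard, `super().__init__` would receive `None` and requests would fail.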