Update Groq AI Model Config (#7335)

With the current config, requests fail with the error "Fail to access
model(gemma-7b-it) using this api key", since the model has been removed
according to the official Groq documentation:
https://console.groq.com/docs/models
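
The change boils down to pruning the retired model entry from the provider's `llm` list in the factory config. A minimal sketch of that idea in Python, using a dict shaped like the JSON fragment in this diff (the `DEPRECATED` set and `prune_deprecated` helper are illustrative, not part of the repository):

```python
# Models Groq has retired, per https://console.groq.com/docs/models
# (illustrative set, not a name used in the codebase).
DEPRECATED = {"gemma-7b-it"}

def prune_deprecated(factory: dict) -> dict:
    """Return a copy of a factory config without deprecated model entries."""
    pruned = dict(factory)
    pruned["llm"] = [
        m for m in factory["llm"] if m["llm_name"] not in DEPRECATED
    ]
    return pruned

# Config fragment mirroring the JSON touched by this commit.
groq = {
    "tags": "LLM",
    "status": "1",
    "llm": [
        {"llm_name": "gemma-7b-it", "tags": "LLM,CHAT,15k",
         "max_tokens": 8192, "model_type": "chat"},
        {"llm_name": "gemma2-9b-it", "tags": "LLM,CHAT,15k",
         "model_type": "chat"},
    ],
}

names = [m["llm_name"] for m in prune_deprecated(groq)["llm"]]
print(names)  # → ['gemma2-9b-it']
```

The commit applies the same result statically, by deleting the `gemma-7b-it` block from the config file rather than filtering at runtime.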

### Type of change

- [x] Bug Fix (non-breaking change which fixes an issue)
Jason Li
2025-04-27 11:05:25 +02:00
committed by GitHub
parent 6a45d93005
commit 67b087019c

@@ -929,12 +929,6 @@
     "tags": "LLM",
     "status": "1",
     "llm": [
-      {
-        "llm_name": "gemma-7b-it",
-        "tags": "LLM,CHAT,15k",
-        "max_tokens": 8192,
-        "model_type": "chat"
-      },
       {
         "llm_name": "gemma2-9b-it",
         "tags": "LLM,CHAT,15k",