Fix: potential negative max_tokens in RAPTOR (#10701)

### What problem does this PR solve?

Fixes a potential negative `max_tokens` value in RAPTOR by clamping it to a minimum of 512. #10235.

### Type of change

- [x] Bug Fix (non-breaking change which fixes an issue)
Author: Yongteng Lei
Date: 2025-10-21 15:49:51 +08:00 (committed by GitHub)
Parent: 544c9990e3
Commit: cd77425b87


@@ -114,7 +114,7 @@ class RecursiveAbstractiveProcessing4TreeOrganizedRetrieval:
),
}
],
-            {"max_tokens": self._max_token},
+            {"max_tokens": max(self._max_token, 512)},  # fix issue: #10235
)
cnt = re.sub(
"(······\n由于长度的原因,回答被截断了,要继续吗?|For the content length reason, it stopped, continue?)",
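The one-line change above can be illustrated in isolation. This is a minimal sketch of the clamp pattern, not the actual RAPTOR call path: the function name `effective_max_tokens` and the simplified signature are hypothetical; only `max(..., 512)` comes from the diff.

```python
def effective_max_tokens(configured_max_token: int, floor: int = 512) -> int:
    """Return a max_tokens value that is never zero or negative.

    If the configured token budget has dropped below the floor (for
    example after prompt tokens were subtracted from it), fall back to
    the 512-token floor so the downstream LLM call does not receive an
    invalid max_tokens parameter.
    """
    return max(configured_max_token, floor)


print(effective_max_tokens(-128))  # 512
print(effective_max_tokens(4096))  # 4096
```

Clamping at the call site keeps the stored `self._max_token` untouched while guaranteeing the value actually sent to the model is valid.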