Fix too long context issue. (#4735)

### What problem does this PR solve?

#4728

### Type of change

- [x] Bug Fix (non-breaking change which fixes an issue)
Author: Kevin Hu
Date: 2025-02-06 11:37:23 +08:00
Committed by: GitHub
Parent: a3a70431f3
Commit: 2a07eb69a7
4 changed files with 6 additions and 3 deletions

```diff
@@ -91,7 +91,7 @@ class GraphExtractor(Extractor):
         ).format(**self._context_base, input_text=content)
         try:
-            gen_conf = {"temperature": 0.3}
+            gen_conf = {"temperature": 0.8}
             final_result = self._chat(hint_prompt, [{"role": "user", "content": "Output:"}], gen_conf)
             token_count += num_tokens_from_string(hint_prompt + final_result)
             history = pack_user_ass_to_openai_messages(hint_prompt, final_result)
```
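
The hunk shown here only raises the sampling temperature passed to `self._chat`; the remaining changes of this commit (the other three files) are not shown. For readers wondering what a "too long context" guard typically looks like in code like this, below is a minimal, hypothetical sketch of trimming the input text by token count before it is formatted into `hint_prompt`. The `tiktoken` encoder and the 12,000-token budget are assumptions for illustration only and are not taken from this PR.

```python
# Illustrative sketch only: truncate an over-long chunk by token count before
# it is interpolated into the extraction prompt. The encoder choice and the
# token budget below are assumptions, not values from this commit.
import tiktoken

ENCODER = tiktoken.get_encoding("cl100k_base")   # assumed encoder
MAX_INPUT_TOKENS = 12_000                        # hypothetical budget for input_text


def truncate_by_tokens(text: str, max_tokens: int = MAX_INPUT_TOKENS) -> str:
    """Drop trailing tokens so the text fits the remaining context budget."""
    tokens = ENCODER.encode(text)
    if len(tokens) <= max_tokens:
        return text
    return ENCODER.decode(tokens[:max_tokens])


# Hypothetical usage at the call site shown in the diff:
# content = truncate_by_tokens(content)
# hint_prompt = GRAPH_EXTRACTION_PROMPT.format(**self._context_base, input_text=content)
```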