fix: leading whitespace is missing in the generated text when using stop_words (#5511)

* bug fix

* add release note

* Update releasenotes/notes/fix-stop-words-strip-issue-22ce51306e7b91e4.yaml

Co-authored-by: Stefano Fiorucci <44616784+anakin87@users.noreply.github.com>

* Update releasenotes/notes/fix-stop-words-strip-issue-22ce51306e7b91e4.yaml

Co-authored-by: Stefano Fiorucci <44616784+anakin87@users.noreply.github.com>

---------

Co-authored-by: Stefano Fiorucci <44616784+anakin87@users.noreply.github.com>
Fanli Lin authored on 2023-08-04 23:40:19 +08:00, committed by GitHub
parent abc6737e63
commit 4496fc6afd
2 changed files with 5 additions and 1 deletion


@@ -280,7 +280,7 @@ class HFLocalInvocationLayer(PromptModelInvocationLayer):
             # We want to exclude it to be consistent with other invocation layers
             for idx, _ in enumerate(generated_texts):
                 for stop_word in stop_words:
-                    generated_texts[idx] = generated_texts[idx].replace(stop_word, "").strip()
+                    generated_texts[idx] = generated_texts[idx].replace(stop_word, "").rstrip()
         return generated_texts
 
     def _ensure_token_limit(self, prompt: Union[str, List[Dict[str, str]]]) -> Union[str, List[Dict[str, str]]]:
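
For context, a minimal sketch (illustrative strings only, not part of the diff) of why the one-character change matters: strip() trims whitespace on both ends, so a completion that legitimately begins with a space loses it, while rstrip() only trims the trailing whitespace left behind after removing the stop word.

# Illustrative reproduction of the cleanup loop above on a made-up generation.
generated_texts = [" Paris is the capital of France. <|endoftext|>"]
stop_words = ["<|endoftext|>"]

for idx, _ in enumerate(generated_texts):
    for stop_word in stop_words:
        # Before the fix: .strip() also removed the leading space before "Paris".
        # After the fix: .rstrip() trims only the trailing side.
        generated_texts[idx] = generated_texts[idx].replace(stop_word, "").rstrip()

print(repr(generated_texts[0]))  # ' Paris is the capital of France.'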

releasenotes/notes/fix-stop-words-strip-issue-22ce51306e7b91e4.yaml (new file)

@@ -0,0 +1,4 @@
+---
+fixes:
+  - |
+    Ensure the leading whitespace in the generated text is preserved when using `stop_words` in the Hugging Face invocation layer of the PromptNode.
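
A hypothetical usage sketch of the scenario this note describes, assuming the Haystack 1.x PromptNode API with a local Hugging Face model (model name and stop word are illustrative, not taken from the commit):

from haystack.nodes import PromptNode

# Assumes a local HF model, which is routed through HFLocalInvocationLayer.
prompt_node = PromptNode(
    model_name_or_path="google/flan-t5-base",  # illustrative model choice
    stop_words=["Observation:"],               # generation halts here; the stop word itself is removed
)

results = prompt_node.prompt("Answer briefly: what is the capital of France?")
# With this fix, a completion that starts with whitespace keeps its leading space;
# only trailing whitespace left after removing the stop word is trimmed.
print(results)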