
Commit 41f7466

Merge branch 'feature/chat-with-document' into 'develop'

Adding prompt copy so responses come back in plain text.

See merge request genaiic-reusable-assets/engagement-artifacts/genaiic-idp-accelerator!366

2 parents: 3d2c341 + fa73690

File tree

1 file changed: +1 −1 lines changed
  • src/lambda/chat_with_document_resolver


src/lambda/chat_with_document_resolver/index.py

Lines changed: 1 addition & 1 deletion
@@ -170,7 +170,7 @@ def handler(event, context):
     # Invoke a model
     response = client.invoke_model(
         model_id=selectedModelId,
-        system_prompt="You are an assistant that's responsible for getting details from document text attached here based on questions from the user.\n\nIf you don't know the answer, just say that you don't know. Don't try to make up an answer.\n\nAdditionally, use the user and assistant responses in the following JSON object to see what's been asked and what the responses were in the past.\n\n",
+        system_prompt="You are an assistant that's responsible for getting details from document text attached here based on questions from the user.\n\nIf you don't know the answer, just say that you don't know. Don't try to make up an answer.\n\nAdditionally, use the user and assistant responses in the following JSON object to see what's been asked and what the responses were in the past. Your response should always be in plain text, not JSON.\n\n",
         content=content,
         temperature=0.0
     )
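
For context, the sketch below shows how a call with this shape (model_id, system_prompt, content, temperature) could be implemented on top of the Amazon Bedrock Converse API. The invoke_model wrapper here is a hypothetical stand-in, not the accelerator's actual client; only the boto3 calls and their parameters are real.

# Hypothetical sketch, not the accelerator's real client: maps an invoke_model-style
# call (model_id, system_prompt, content, temperature) onto the Bedrock Converse API.
import boto3


def invoke_model(model_id, system_prompt, content, temperature=0.0):
    """Send the document text and chat history to the model and return its plain-text reply."""
    bedrock = boto3.client("bedrock-runtime")
    response = bedrock.converse(
        modelId=model_id,
        # The system prompt is where this commit appends
        # "Your response should always be in plain text, not JSON."
        system=[{"text": system_prompt}],
        messages=[{"role": "user", "content": [{"text": content}]}],
        inferenceConfig={"temperature": temperature},
    )
    # converse() returns structured JSON; the model's plain-text answer is nested here.
    return response["output"]["message"]["content"][0]["text"]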
