Something like a wrapper for this:
import boto3

session = boto3.Session(
    aws_access_key_id='<insert id>',
    aws_secret_access_key='<insert key>',
    region_name='<insert region>'  # e.g. 'us-east-1'
)
client = session.client('bedrock-runtime', '<insert region>')  # e.g. 'us-east-1'

def sut(prompt):
    conversation = [
        {
            "role": "user",
            "content": [{"text": prompt}],
        }
    ]
    response = client.converse(
        modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # or e.g. "meta.llama2-13b-chat-v1"
        messages=conversation,
        inferenceConfig={"maxTokens": 10, "temperature": 0.5, "topP": 0.9},
        additionalModelRequestFields={}
    )
    # Extract and return the response text.
    response_text = response["output"]["message"]["content"][0]["text"]
    return response_text

but exposing chat and complete methods.
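A minimal sketch of what such a wrapper might look like, assuming a hypothetical BedrockGenerator class name and assuming that chat takes a list of Converse-style messages while complete takes a bare prompt string; the real interface would follow whatever chat/complete signatures the project expects:

import boto3

class BedrockGenerator:
    # Hypothetical wrapper class (name and constructor parameters are assumptions).
    def __init__(self, model_id="anthropic.claude-3-sonnet-20240229-v1:0",
                 region_name="us-east-1", aws_access_key_id=None,
                 aws_secret_access_key=None, max_tokens=10,
                 temperature=0.5, top_p=0.9):
        session = boto3.Session(
            aws_access_key_id=aws_access_key_id,
            aws_secret_access_key=aws_secret_access_key,
            region_name=region_name,
        )
        self.client = session.client("bedrock-runtime", region_name=region_name)
        self.model_id = model_id
        self.inference_config = {
            "maxTokens": max_tokens,
            "temperature": temperature,
            "topP": top_p,
        }

    def chat(self, messages):
        # messages: list of {"role": ..., "content": [{"text": ...}]} dicts,
        # passed straight through to the Bedrock Converse API.
        response = self.client.converse(
            modelId=self.model_id,
            messages=messages,
            inferenceConfig=self.inference_config,
            additionalModelRequestFields={},
        )
        return response["output"]["message"]["content"][0]["text"]

    def complete(self, prompt):
        # Wrap a bare prompt string as a single-turn user message.
        return self.chat([{"role": "user", "content": [{"text": prompt}]}])

Usage would then be something like:

gen = BedrockGenerator(aws_access_key_id="<insert id>",
                       aws_secret_access_key="<insert key>")
print(gen.complete("Hello"))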