AWS Bedrock Client for LLMs #102

@ThePyProgrammer

Description

Something like a wrapper for this:

import boto3

session = boto3.Session(
    aws_access_key_id='<insert id>',
    aws_secret_access_key='<insert key>',
    region_name='<insert region>'  # e.g. 'us-east-1'
)

client = session.client('bedrock-runtime', '<insert region>')  # e.g. 'us-east-1'
def sut(prompt):
    conversation = [
        {
            "role": "user",
            "content": [{"text": prompt}],
        }
    ]

    response = client.converse(
        modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # or e.g. "meta.llama2-13b-chat-v1"
        messages=conversation,
        inferenceConfig={"maxTokens": 10, "temperature": 0.5, "topP": 0.9},
        additionalModelRequestFields={}
    )
    # Extract and return the response text.
    response_text = response["output"]["message"]["content"][0]["text"]
    return response_text

but wrapped in a class that exposes chat and complete methods.
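A minimal sketch of what such a wrapper could look like, assuming the same converse API shown above. The class name BedrockClient, the method signatures, and the injected client are all illustrative choices, not an existing API; injecting the boto3 client keeps the class testable with a stub:

```python
class BedrockClient:
    """Hypothetical wrapper exposing chat/complete over the Bedrock converse API."""

    def __init__(self, client, model_id="anthropic.claude-3-sonnet-20240229-v1:0"):
        # client: a boto3 'bedrock-runtime' client (or a stub with the same interface)
        self.client = client
        self.model_id = model_id

    def chat(self, messages, max_tokens=512, temperature=0.5, top_p=0.9):
        # messages: list of {"role": ..., "content": [{"text": ...}]} dicts,
        # matching the shape the converse API expects
        response = self.client.converse(
            modelId=self.model_id,
            messages=messages,
            inferenceConfig={
                "maxTokens": max_tokens,
                "temperature": temperature,
                "topP": top_p,
            },
            additionalModelRequestFields={},
        )
        return response["output"]["message"]["content"][0]["text"]

    def complete(self, prompt, **kwargs):
        # single-prompt convenience: wrap the prompt as one user turn
        return self.chat([{"role": "user", "content": [{"text": prompt}]}], **kwargs)
```

With this shape, sut(prompt) from the snippet above collapses to BedrockClient(client).complete(prompt).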