Enable Amazon Bedrock cross-Region inference in multi-account environments

AWS Machine Learning

The customer's AWS accounts that are allowed to use Amazon Bedrock are under an Organizational Unit (OU) called Sandbox. We want to enable the accounts under the Sandbox OU to use Anthropic's Claude 3.5 Sonnet v2 model using cross-Region inference. For our sample use case, we use Regions us-east-1 and us-west-2.
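As a minimal sketch of what the end result looks like from a Sandbox account, the request below targets a cross-Region inference profile ID rather than a plain foundation-model ID (the profile ID, Region, and prompt are illustrative; whether the call succeeds still depends on the OU policies the post describes):

```python
# Illustrative sketch: calling Claude 3.5 Sonnet v2 through the US
# cross-Region inference profile instead of a single-Region model ID.
US_CLAUDE_35_SONNET_V2 = "us.anthropic.claude-3-5-sonnet-20241022-v2:0"

def build_converse_request(profile_id: str, prompt: str) -> dict:
    """Arguments for the bedrock-runtime Converse API using an inference profile."""
    return {
        "modelId": profile_id,  # cross-Region profile ID, not a foundation-model ID
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
    }

# Usage (requires AWS credentials and Bedrock model access in the account):
# import boto3
# client = boto3.client("bedrock-runtime", region_name="us-east-1")
# reply = client.converse(**build_converse_request(US_CLAUDE_35_SONNET_V2, "Hello"))
```

With a cross-Region profile, Amazon Bedrock can route the request to either us-east-1 or us-west-2 transparently, which is why both Regions must be allowed for the Sandbox accounts.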

Secure distributed logging in scalable multi-account deployments using Amazon Bedrock and LangChain

AWS Machine Learning

Some companies go to great lengths to maintain confidentiality, sometimes adopting multi-account architectures, where each customer has their data in a separate AWS account. In this post, we present a solution for securing distributed logging in multi-account deployments using Amazon Bedrock and LangChain.
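As a hedged sketch of one safeguard such a logging pipeline can apply (the redaction patterns and record fields below are assumptions for illustration, not the post's exact implementation), LLM prompts and responses can be scrubbed of account-identifying values before a log line leaves the customer's account boundary:

```python
import json
import re

# Illustrative redaction patterns (assumptions): 12-digit AWS account IDs
# and email addresses are masked before logging.
ACCOUNT_ID_RE = re.compile(r"\b\d{12}\b")
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def redact(text: str) -> str:
    """Mask account IDs and email addresses in free text."""
    text = ACCOUNT_ID_RE.sub("[ACCOUNT_ID]", text)
    return EMAIL_RE.sub("[EMAIL]", text)

def log_record(account_alias: str, prompt: str, response: str) -> str:
    """One JSON log line, keyed by a per-customer alias rather than raw IDs."""
    return json.dumps({
        "account": account_alias,
        "prompt": redact(prompt),
        "response": redact(response),
    })
```

In a LangChain deployment, a helper like this could be called from a callback handler so every model invocation is redacted consistently before it reaches the shared log store.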

Amazon Bedrock Guardrails announces IAM Policy-based enforcement to deliver safe AI interactions

AWS Machine Learning

If a user assumes a role that has a specific guardrail configured using the bedrock:GuardrailIdentifier condition key, the user can strategically use input tags to help avoid having guardrail checks applied to certain parts of their prompt.
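For context, the enforcement pattern attaches a guardrail requirement to the role itself. A minimal sketch of such a policy, assuming placeholder Region, account ID, and guardrail ID, denies model invocation unless the named guardrail is applied to the request:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "RequireGuardrail",
      "Effect": "Deny",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": "*",
      "Condition": {
        "StringNotEquals": {
          "bedrock:GuardrailIdentifier": "arn:aws:bedrock:us-east-1:111122223333:guardrail/EXAMPLE_ID"
        }
      }
    }
  ]
}
```

The caveat in the excerpt still applies: even with this policy in place, input tags control which portions of the prompt the guardrail actually evaluates.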

Security best practices to consider while fine-tuning models in Amazon Bedrock

AWS Machine Learning

The workflow steps are as follows: The user submits an Amazon Bedrock fine-tuning job within their AWS account, using IAM for resource access. The fine-tuning job initiates a training job in the model deployment accounts. Provide your account, bucket name, and VPC settings. The following code is a sample resource policy.
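The policy itself is cut off in this excerpt. As a rough sketch of the pattern (the bucket name, account ID, and condition are placeholder assumptions, not the post's exact policy), an S3 bucket resource policy granting the Amazon Bedrock service read access to training data while pinning the source account might look like:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowBedrockFineTuningRead",
      "Effect": "Allow",
      "Principal": { "Service": "bedrock.amazonaws.com" },
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::amzn-s3-demo-training-bucket",
        "arn:aws:s3:::amzn-s3-demo-training-bucket/*"
      ],
      "Condition": {
        "StringEquals": { "aws:SourceAccount": "111122223333" }
      }
    }
  ]
}
```

The aws:SourceAccount condition is the security-relevant part: it prevents a confused-deputy scenario where the service is tricked into reading the bucket on behalf of another account.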

Best practices for Meta Llama 3.2 multimodal fine-tuning on Amazon Bedrock

AWS Machine Learning

Meta Llama 3.2's multimodal versatility allows organizations to improve performance across a range of input types with a single fine-tuned model. Prerequisites: to use this feature, make sure that you have satisfied the following requirements: an active AWS account, and Meta Llama 3.2 models enabled in your Amazon Bedrock account.
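Once the prerequisites are met, a fine-tuning job is started through the Bedrock model-customization API. A minimal sketch of the request arguments, where the job name, model names, S3 URIs, and hyperparameter values are all illustrative assumptions:

```python
# Illustrative sketch: arguments for starting a Meta Llama 3.2
# fine-tuning job via the bedrock create_model_customization_job API.

def build_finetune_job(role_arn: str, train_s3: str, out_s3: str) -> dict:
    """Arguments for bedrock's create_model_customization_job()."""
    return {
        "jobName": "llama32-multimodal-ft",            # placeholder job name
        "customModelName": "llama32-11b-custom",       # placeholder model name
        "roleArn": role_arn,                           # IAM role Bedrock assumes
        "baseModelIdentifier": "meta.llama3-2-11b-instruct-v1:0",
        "trainingDataConfig": {"s3Uri": train_s3},
        "outputDataConfig": {"s3Uri": out_s3},
        "hyperParameters": {"epochCount": "3", "learningRate": "0.0001"},
    }

# Usage (needs AWS credentials plus Llama 3.2 model access):
# import boto3
# bedrock = boto3.client("bedrock", region_name="us-west-2")
# bedrock.create_model_customization_job(**build_finetune_job(
#     "arn:aws:iam::111122223333:role/FineTuneRole",
#     "s3://my-bucket/train.jsonl",
#     "s3://my-bucket/output/"))
```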

Amazon SageMaker JumpStart adds fine-tuning support for models in a private model hub

AWS Machine Learning

These can be added as inline policies in the user's IAM role (use the Region configured in Step 3): { "Version": "2012-10-17", "Statement": [ { "Action": "s3:*", "Effect": "Deny", "Resource": [ "arn:aws:s3:::jumpstart-cache-prod- ", "arn:aws:s3:::jumpstart-cache-prod- /*" ], "Condition": { "StringNotLike": {"s3:prefix": ["*.ipynb",

Combine keyword and semantic search for text and images using Amazon Bedrock and Amazon OpenSearch Service

AWS Machine Learning

Prerequisites: for this walkthrough, you should have the following prerequisites: an AWS account, and Amazon Bedrock with Amazon Titan Multimodal Embeddings G1 enabled. Ingest sample data to the OpenSearch Service index. We encourage you to test the notebook in your own account and get firsthand experience with hybrid search variations.
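A minimal sketch of the query body such a hybrid search sends to OpenSearch Service, combining a keyword leg and a semantic (k-NN) leg in one request; the index field names and vector values here are illustrative assumptions:

```python
# Illustrative sketch: an OpenSearch "hybrid" query combining keyword
# matching with k-NN vector similarity over an embedding field.

def build_hybrid_query(text: str, embedding: list, k: int = 5) -> dict:
    """One request body with a keyword leg and a semantic (k-NN) leg."""
    return {
        "query": {
            "hybrid": {
                "queries": [
                    {"match": {"caption": {"query": text}}},  # keyword leg
                    {"knn": {"vector_field": {                # semantic leg
                        "vector": embedding,
                        "k": k,
                    }}},
                ]
            }
        },
        "size": k,
    }

# Usage: pass the body to an opensearch-py client along with a search
# pipeline that normalizes and weights the two score sets, e.g.
# client.search(index="products", body=build_hybrid_query(q, emb),
#               params={"search_pipeline": "hybrid-pipeline"})
```

In this pattern the embedding comes from Amazon Titan Multimodal Embeddings G1, so the same semantic leg works for both text and image queries.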