
Use the ApplyGuardrail API with long-context inputs and streaming outputs in Amazon Bedrock

AWS Machine Learning

The new ApplyGuardrail API enables you to assess any text using your preconfigured guardrails in Amazon Bedrock, without invoking the foundation models (FMs). In this post, we demonstrate how to use the ApplyGuardrail API with long-context inputs and streaming outputs. For example, you can now use the API with models hosted outside Bedrock, such as on Amazon SageMaker.
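Since the post centers on the ApplyGuardrail API, a minimal sketch of a standalone call may help. The guardrail ARN, version, region, and sample text below are placeholders, not values from the article.

```python
import boto3

# Sketch: apply a preconfigured guardrail to arbitrary text without invoking an FM.
bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock_runtime.apply_guardrail(
    guardrailIdentifier="arn:aws:bedrock:us-east-1:111122223333:guardrail/abc123",  # placeholder ARN
    guardrailVersion="1",
    source="OUTPUT",  # use "INPUT" to check user prompts, "OUTPUT" for model responses
    content=[{"text": {"text": "Text produced by a model hosted elsewhere, e.g. on SageMaker"}}],
)

# "action" is GUARDRAIL_INTERVENED when a policy triggered; "outputs" holds masked/blocked text.
print(response["action"])
for output in response.get("outputs", []):
    print(output["text"])
```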


Best practices to build generative AI applications on AWS

AWS Machine Learning

Amazon Bedrock is a fully managed service that offers a choice of high-performing foundation models (FMs) from leading AI companies like AI21 Labs, Anthropic, Cohere, Meta, Mistral AI, Stability AI, and Amazon via a single API. Some tasks, however, still require organization-specific data and workflows that typically need custom programming.
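To illustrate the single-API access the excerpt mentions, here is a minimal sketch using the Bedrock Converse API; the model ID, region, and prompt are illustrative placeholders.

```python
import boto3

# Sketch: call an FM through Bedrock's unified runtime API.
bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock_runtime.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # swap for any FM your account has access to
    messages=[{"role": "user", "content": [{"text": "Summarize our returns policy in two sentences."}]}],
    inferenceConfig={"maxTokens": 256, "temperature": 0.2},
)

print(response["output"]["message"]["content"][0]["text"])
```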


Trending Sources


Training large language models on Amazon SageMaker: Best practices

AWS Machine Learning

In this post, we dive into tips and best practices for successful LLM training on Amazon SageMaker Training. The post covers all the phases of an LLM training workload and describes associated infrastructure features and best practices. Some of the best practices in this post refer specifically to ml.p4d.24xlarge instances.
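A rough sketch of launching a distributed training job on SageMaker Training with the instance type the post focuses on; the entry point, role ARN, hyperparameters, and S3 URI are placeholders, not details from the article.

```python
from sagemaker.pytorch import PyTorch

# Sketch: distributed LLM training job on ml.p4d.24xlarge instances.
estimator = PyTorch(
    entry_point="train.py",                                          # placeholder training script
    role="arn:aws:iam::111122223333:role/SageMakerExecutionRole",    # placeholder execution role
    instance_type="ml.p4d.24xlarge",                                 # 8x A100 GPUs per instance
    instance_count=2,
    framework_version="2.1",
    py_version="py310",
    distribution={"torch_distributed": {"enabled": True}},           # one process per GPU
    hyperparameters={"epochs": 1, "per_device_batch_size": 4},       # placeholder hyperparameters
)

estimator.fit({"train": "s3://my-bucket/llm-training-data/"})        # placeholder S3 URI
```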


Best practices for building robust generative AI applications with Amazon Bedrock Agents – Part 1

AWS Machine Learning

This two-part series explores best practices for building generative AI applications using Amazon Bedrock Agents, including collecting data that provides a benchmark for expected agent behavior and its interaction with the existing APIs, knowledge bases, and guardrails connected to the agent.


Best practices for load testing Amazon SageMaker real-time inference endpoints

AWS Machine Learning

This post describes best practices for load testing a SageMaker endpoint to find the right number and size of instances. For example, if your client is making the InvokeEndpoint API call over the internet, then from the client's perspective the end-to-end latency is the internet round trip plus ModelLatency plus OverheadLatency.
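A small sketch of measuring client-side end-to-end latency for a real-time endpoint; the endpoint name and payload are placeholders. ModelLatency and OverheadLatency are reported by SageMaker as CloudWatch metrics, and the gap between them and the time measured here is the network portion described in the excerpt.

```python
import time
import boto3

runtime = boto3.client("sagemaker-runtime", region_name="us-east-1")

payload = b'{"inputs": "example request"}'  # placeholder request body

# Measure end-to-end latency as seen by the client.
start = time.perf_counter()
response = runtime.invoke_endpoint(
    EndpointName="my-endpoint",        # placeholder endpoint name
    ContentType="application/json",
    Body=payload,
)
end_to_end_ms = (time.perf_counter() - start) * 1000

print(f"End-to-end latency: {end_to_end_ms:.1f} ms")
print(response["Body"].read()[:200])
```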


Introducing multi-turn conversation with an agent node for Amazon Bedrock Flows (preview)

AWS Machine Learning

Amazon Bedrock Flows offers an intuitive visual builder and a set of APIs to seamlessly link foundation models (FMs), Amazon Bedrock features, and AWS services to build and automate user-defined generative AI workflows at scale. Once a flow is built, you can test it through the Amazon Bedrock console or API.
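For the API route, a minimal sketch of invoking a flow follows; the flow ID, alias ID, and input text are placeholders, and the node names assume the default flow input node.

```python
import boto3

# Sketch: test a flow through the API rather than the console.
client = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

response = client.invoke_flow(
    flowIdentifier="FLOW_ID_PLACEHOLDER",
    flowAliasIdentifier="FLOW_ALIAS_PLACEHOLDER",
    inputs=[{
        "content": {"document": "I want to book a flight to Seattle."},
        "nodeName": "FlowInputNode",     # default input node name
        "nodeOutputName": "document",
    }],
)

# The response is an event stream; flowOutputEvent carries node outputs, and with
# an agent node the stream can pause to request the next user turn.
for event in response["responseStream"]:
    if "flowOutputEvent" in event:
        print(event["flowOutputEvent"]["content"]["document"])
```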


Achieve operational excellence with well-architected generative AI solutions using Amazon Bedrock

AWS Machine Learning

Because this is an emerging area, best practices, practical guidance, and design patterns are difficult to find in an easily consumable form. This integration ensures enterprises can take advantage of the full power of generative AI while adhering to best practices for operational excellence.