
Build a cross-account MLOps workflow using the Amazon SageMaker model registry

AWS Machine Learning

When designing production CI/CD pipelines, AWS recommends using multiple accounts to isolate resources, contain security threats, and simplify billing, and data science pipelines are no different. Among the things to note in the post's architecture: accounts follow the principle of least privilege, in line with security best practices.
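
As a rough sketch of the cross-account pattern the post builds on, the snippet below creates a model package group in a central tooling account and attaches a resource policy so a separate deployment account can read registered model versions. The account IDs, Region, group name, and the exact set of allowed actions are illustrative assumptions rather than values from the post.

```python
import json
import boto3

# Hypothetical values for illustration; replace with your own.
REGION = "us-east-1"
TOOLING_ACCOUNT = "111111111111"   # owns the model registry
DEPLOY_ACCOUNT = "222222222222"    # staging/prod account that deploys models
GROUP_NAME = "mlops-demo-models"

sm = boto3.client("sagemaker", region_name=REGION)

# The model package group acts as the registry for one model's versions.
sm.create_model_package_group(
    ModelPackageGroupName=GROUP_NAME,
    ModelPackageGroupDescription="Cross-account MLOps demo models",
)

# Resource policy letting the deployment account discover and describe
# registered model versions in this group.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "CrossAccountModelRegistryRead",
            "Effect": "Allow",
            "Principal": {"AWS": f"arn:aws:iam::{DEPLOY_ACCOUNT}:root"},
            "Action": [
                "sagemaker:DescribeModelPackageGroup",
                "sagemaker:DescribeModelPackage",
                "sagemaker:ListModelPackages",
            ],
            "Resource": [
                f"arn:aws:sagemaker:{REGION}:{TOOLING_ACCOUNT}:model-package-group/{GROUP_NAME}",
                f"arn:aws:sagemaker:{REGION}:{TOOLING_ACCOUNT}:model-package/{GROUP_NAME}/*",
            ],
        }
    ],
}
sm.put_model_package_group_policy(
    ModelPackageGroupName=GROUP_NAME,
    ResourcePolicy=json.dumps(policy),
)
```

With a policy like this in place, pipelines in the deployment account can list and describe approved model versions from the central registry without those versions being copied out of the tooling account.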

Architect defense-in-depth security for generative AI applications using the OWASP Top 10 for LLMs

AWS Machine Learning

The goal of this post is to give AI and machine learning (ML) engineers, data scientists, solutions architects, security teams, and other stakeholders a common mental model and framework for applying security best practices, so AI/ML teams can move fast without trading security for speed.

Optimize AWS Inferentia utilization with FastAPI and PyTorch models on Amazon EC2 Inf1 & Inf2 instances

AWS Machine Learning

If the model changes on the server side, the client has to know and change its API call to the new endpoint accordingly. In this post, we share best practices for deploying deep learning models with FastAPI on AWS Inferentia NeuronCores. A script in the fastapi and trace-model folders uses this to create Docker images.
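
The post's own server code isn't reproduced here; the following is a minimal sketch of the serving idea, assuming an Inf1 instance with the torch-neuron package installed and a model that has already been compiled for Neuron and saved as TorchScript. The file name model_neuron.pt, the /predict route, and the 1x3x224x224 input shape are placeholders.

```python
from fastapi import FastAPI
from pydantic import BaseModel
import torch
# Importing the Neuron PyTorch extension registers the Neuron operators needed
# to load the traced model (torch_neuron on Inf1; torch_neuronx on Inf2).
import torch_neuron  # noqa: F401

app = FastAPI()
model = torch.jit.load("model_neuron.pt")  # traced model produced offline
model.eval()

class InferenceRequest(BaseModel):
    values: list[float]  # flat input vector, reshaped below

@app.post("/predict")
def predict(req: InferenceRequest):
    # Reshape to the assumed single-image input and run inference on Neuron.
    x = torch.tensor(req.values, dtype=torch.float32).reshape(1, 3, 224, 224)
    with torch.no_grad():
        out = model(x)
    return {"prediction": out.argmax(dim=1).item()}
```

Served with, for example, uvicorn server:app --host 0.0.0.0 --port 8080 on the Inferentia instance.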

Generate customized, compliant application IaC scripts for AWS Landing Zone using Amazon Bedrock

AWS Machine Learning

Amazon Bedrock is a fully managed service that offers a choice of high-performing foundation models (FMs) from leading AI companies like AI21 Labs, Anthropic, Cohere, Meta, Stability AI, and Amazon with a single API, along with a broad set of capabilities to build generative AI applications with security, privacy, and responsible AI.
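
As a hedged illustration of that single-API idea, the call below asks a Claude model on Amazon Bedrock to produce an IaC snippet via the boto3 bedrock-runtime client. The model ID, Region, prompt, and request schema (the Anthropic messages format) are assumptions for the sketch; the post's solution layers AWS Landing Zone compliance context and customization inputs on top of a call like this.

```python
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# Illustrative prompt; the post's pipeline would inject account-specific
# compliance requirements and customization parameters here.
prompt = (
    "Generate a Terraform script that creates an S3 bucket with versioning, "
    "default encryption, and all public access blocked."
)

response = bedrock.invoke_model(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # assumed model choice
    body=json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 2000,
        "messages": [
            {"role": "user", "content": [{"type": "text", "text": prompt}]}
        ],
    }),
)
result = json.loads(response["body"].read())
print(result["content"][0]["text"])  # the generated IaC script
```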

Build custom code libraries for your Amazon SageMaker Data Wrangler Flows using AWS CodeCommit

AWS Machine Learning

Therefore, organizations have adopted technology best practices, including microservice architecture, MLOps, DevOps, and more, to improve delivery time, reduce defects, and increase employee productivity. This post introduces a best practice for managing custom code within your Amazon SageMaker Data Wrangler workflow.

Configure an AWS DeepRacer environment for training and log analysis using the AWS CDK

AWS Machine Learning

An administrator can run the AWS CDK script provided in the GitHub repo via the AWS Management Console or in the terminal after loading the code in their environment. With the help of the AWS CDK, we can version control our provisioned resources and have a highly transportable environment that complies with enterprise-level best practices.
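
For readers unfamiliar with the CDK workflow the post relies on, here is a minimal Python CDK (v2) stack in the same spirit; it provisions only a single S3 bucket for training logs and is an illustrative stand-in, not the repo's actual stacks.

```python
import aws_cdk as cdk
from aws_cdk import aws_s3 as s3
from constructs import Construct

class DeepRacerEnvStack(cdk.Stack):
    """Illustrative stack: an encrypted, versioned bucket for training logs."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        s3.Bucket(
            self,
            "TrainingLogsBucket",
            versioned=True,
            encryption=s3.BucketEncryption.S3_MANAGED,
            block_public_access=s3.BlockPublicAccess.BLOCK_ALL,
            removal_policy=cdk.RemovalPolicy.DESTROY,  # demo setting only
        )

app = cdk.App()
DeepRacerEnvStack(app, "DeepRacerEnvStack")
app.synth()
```

An administrator would then run cdk bootstrap once per account and Region, followed by cdk deploy, which is the terminal path the excerpt mentions; because the stack is code, it can be version controlled alongside the rest of the environment.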

Enable fully homomorphic encryption with Amazon SageMaker endpoints for secure, real-time inferencing

AWS Machine Learning

Applications and services can call the deployed endpoint directly or through a serverless Amazon API Gateway architecture. To learn more about real-time endpoint architectural best practices, refer to Creating a machine learning-powered REST API with Amazon API Gateway mapping templates and Amazon SageMaker.
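
Calling the endpoint directly comes down to a single sagemaker-runtime API call; the sketch below shows the shape of that call with a hypothetical endpoint name and a plain CSV payload, whereas the post's fully homomorphic encryption flow would send the client-side encrypted payload instead.

```python
import boto3

runtime = boto3.client("sagemaker-runtime", region_name="us-east-1")

response = runtime.invoke_endpoint(
    EndpointName="fhe-demo-endpoint",  # hypothetical endpoint name
    ContentType="text/csv",
    Body="0.5,1.2,3.4",  # placeholder features; the FHE flow encrypts these first
)
print(response["Body"].read().decode("utf-8"))
```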
