Use AWS PrivateLink to set up private access to Amazon Bedrock

AWS Machine Learning

Amazon Bedrock allows developers to build and scale generative AI applications using foundation models (FMs) through an API, without managing infrastructure. Customers are building innovative generative AI applications with Amazon Bedrock APIs and their own proprietary data.
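As a minimal sketch of what that API access looks like: once an interface VPC endpoint (AWS PrivateLink) for the Bedrock runtime is in place, a standard boto3 call from inside the VPC resolves to the private endpoint. The region and model ID below are assumptions for illustration, not values from the article.

```python
import json
import boto3

# Invoke a Bedrock model from inside a VPC. With private DNS enabled on the
# interface endpoint, boto3 reaches the service over PrivateLink automatically.
bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock_runtime.invoke_model(
    modelId="anthropic.claude-v2",  # hypothetical choice of FM
    contentType="application/json",
    accept="application/json",
    body=json.dumps({
        "prompt": "\n\nHuman: Summarize AWS PrivateLink in one sentence.\n\nAssistant:",
        "max_tokens_to_sample": 200,
    }),
)
print(json.loads(response["body"].read()))
```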

Use RAG for drug discovery with Knowledge Bases for Amazon Bedrock

AWS Machine Learning

The Retrieve and RetrieveAndGenerate APIs allow your applications to directly query the index using a unified and standard syntax without having to learn separate APIs for each different vector database, reducing the need to write custom index queries against your vector store.
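A minimal sketch of the two query APIs described above, using the `bedrock-agent-runtime` client; the knowledge base ID, model ARN, and query text are placeholders.

```python
import boto3

client = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

# Retrieve: return the raw chunks from the vector index that match the query.
retrieved = client.retrieve(
    knowledgeBaseId="KB12345678",  # hypothetical knowledge base ID
    retrievalQuery={"text": "Which compounds inhibit kinase X?"},
)

# RetrieveAndGenerate: retrieve chunks and have an FM compose a grounded answer.
answer = client.retrieve_and_generate(
    input={"text": "Which compounds inhibit kinase X?"},
    retrieveAndGenerateConfiguration={
        "type": "KNOWLEDGE_BASE",
        "knowledgeBaseConfiguration": {
            "knowledgeBaseId": "KB12345678",
            "modelArn": "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-v2",  # placeholder
        },
    },
)
print(answer["output"]["text"])
```

The same syntax works regardless of which vector store backs the knowledge base, which is the point the excerpt makes about not writing custom index queries.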

Use Amazon SageMaker pipeline sharing to view or manage pipelines across AWS accounts

AWS Machine Learning

You can now use cross-account support for Amazon SageMaker Pipelines to share pipeline entities across AWS accounts and access shared pipelines directly through Amazon SageMaker API calls. The data scientist is now able to describe and monitor the test pipeline run status using SageMaker API calls from the dev account.
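A minimal sketch of those cross-account API calls, assuming the test account's pipeline has already been shared with the dev account and its execution ARN is known; the ARN and account ID below are placeholders.

```python
import boto3

sm = boto3.client("sagemaker", region_name="us-east-1")

execution_arn = (
    "arn:aws:sagemaker:us-east-1:111122223333:pipeline/test-pipeline/execution/abc123"
)

# Describe the shared pipeline execution from the dev account.
execution = sm.describe_pipeline_execution(PipelineExecutionArn=execution_arn)
print(execution["PipelineExecutionStatus"])

# Monitor the status of each step in the run.
steps = sm.list_pipeline_execution_steps(PipelineExecutionArn=execution_arn)
for step in steps["PipelineExecutionSteps"]:
    print(step["StepName"], step["StepStatus"])
```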

Identify objections in customer conversations using Amazon Comprehend to enhance customer experience without ML expertise

AWS Machine Learning

Since conversational AI has improved in recent years, many businesses have adopted cutting-edge technologies like AI-powered chatbots and AI-powered agent support to improve customer service while increasing productivity and lowering costs. API Gateway forwards the request to Lambda. Lambda validates the format and stores it in DynamoDB.
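A minimal sketch of that Lambda step, assuming a JSON payload forwarded by API Gateway; the table name and expected fields are illustrative assumptions.

```python
import json
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("customer-conversations")  # hypothetical table name

def lambda_handler(event, context):
    body = json.loads(event.get("body", "{}"))

    # Check the request format before storing it.
    if "conversation_id" not in body or "transcript" not in body:
        return {"statusCode": 400, "body": json.dumps({"error": "invalid payload"})}

    # Persist the validated conversation record to DynamoDB.
    table.put_item(Item={
        "conversation_id": body["conversation_id"],
        "transcript": body["transcript"],
    })
    return {"statusCode": 200, "body": json.dumps({"status": "stored"})}
```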

Enable fully homomorphic encryption with Amazon SageMaker endpoints for secure, real-time inferencing

AWS Machine Learning

Leidos is a FORTUNE 500 science and technology solutions leader working to address some of the world’s toughest challenges in the defense, intelligence, homeland security, civil, and healthcare markets. Applications and services can call the deployed endpoint directly or through a deployed serverless Amazon API Gateway architecture.
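For the direct-call path, a minimal sketch of invoking the deployed endpoint; the endpoint name is a placeholder, and with homomorphic encryption the payload would be ciphertext prepared client-side before this call.

```python
import boto3

runtime = boto3.client("sagemaker-runtime", region_name="us-east-1")

encrypted_payload = b"..."  # ciphertext bytes produced by the client's FHE library (placeholder)

response = runtime.invoke_endpoint(
    EndpointName="fhe-inference-endpoint",   # hypothetical endpoint name
    ContentType="application/octet-stream",  # encrypted payload, not plaintext JSON
    Body=encrypted_payload,
)
encrypted_result = response["Body"].read()   # still encrypted; decrypted client-side
```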

Speed ML development using SageMaker Feature Store and Apache Iceberg offline store compaction

AWS Machine Learning

SageMaker Feature Store automatically builds an AWS Glue Data Catalog during feature group creation. Customers can also access offline store data using a Spark runtime and perform big data processing for ML feature analysis and feature engineering use cases. Table formats provide a way to abstract data files as a table.
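Because Feature Store registers the offline store in the Glue Data Catalog, the table can be queried directly. A minimal sketch using the SageMaker Python SDK's Athena helper; the feature group name and S3 output location are placeholders.

```python
from sagemaker.feature_store.feature_group import FeatureGroup
from sagemaker.session import Session

session = Session()
feature_group = FeatureGroup(name="customers-feature-group", sagemaker_session=session)

# Athena query bound to the Glue Data Catalog table built for this feature group.
query = feature_group.athena_query()
query.run(
    query_string=f'SELECT * FROM "{query.table_name}" LIMIT 10',
    output_location="s3://my-bucket/athena-results/",  # hypothetical bucket
)
query.wait()
df = query.as_dataframe()  # offline store rows as a pandas DataFrame
```

The same offline store data can alternatively be read into a Spark runtime for larger-scale feature engineering, as the excerpt notes.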

How Amp on Amazon used data to increase customer engagement, Part 2: Building a personalized show recommendation platform using Amazon SageMaker

AWS Machine Learning

SageMaker Feature Store stores a history of ML features in the offline store (Amazon S3) and also provides APIs to an online store to allow low-latency reads of the most recent features. With purpose-built services, the Amp team was able to release the personalized show recommendation API as described in this post to production in under 3 months.
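A minimal sketch of the kind of low-latency online-store lookup a recommendation API would make at request time; the feature group name and record identifier are placeholders.

```python
import boto3

featurestore_runtime = boto3.client(
    "sagemaker-featurestore-runtime", region_name="us-east-1"
)

# Fetch the most recent feature values for a single user from the online store.
record = featurestore_runtime.get_record(
    FeatureGroupName="amp-user-features",      # hypothetical feature group
    RecordIdentifierValueAsString="user-123",  # hypothetical user ID
)
features = {f["FeatureName"]: f["ValueAsString"] for f in record.get("Record", [])}
print(features)
```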