
Amazon SageMaker Feature Store now supports cross-account sharing, discovery, and access

AWS Machine Learning

For example, in an application that recommends a music playlist, features could include song ratings, listening duration, and listener demographics. SageMaker Feature Store now allows granular sharing of features across accounts via AWS RAM, enabling collaborative model development with governance.
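
As a rough sketch of the sharing mechanism, the snippet below offers a Feature Store resource to a second account through the AWS RAM API in boto3. The share name, feature group ARN, and account IDs are hypothetical placeholders, and the exact RAM resource types that Feature Store supports are covered in the article and AWS documentation.

    import boto3

    # Hedged sketch: share a Feature Store resource with a consumer account via AWS RAM.
    # The share name, ARN, and account ID below are hypothetical placeholders.
    ram = boto3.client("ram")

    share = ram.create_resource_share(
        name="feature-store-cross-account-share",
        resourceArns=[
            "arn:aws:sagemaker:us-east-1:111122223333:feature-group/customer-features"
        ],
        principals=["444455556666"],   # consumer account ID (placeholder)
        allowExternalPrincipals=True,
    )
    print(share["resourceShare"]["resourceShareArn"])

    # The consumer account then accepts the share, for example with
    # ram.accept_resource_share_invitation(resourceShareInvitationArn=...).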


Enable fully homomorphic encryption with Amazon SageMaker endpoints for secure, real-time inferencing

AWS Machine Learning

Predictions (inference) run on encrypted data, and the results are decrypted only by the end consumer (client side). To demonstrate this, we show an example of customizing an open-source Amazon SageMaker Scikit-learn deep learning container so that a deployed endpoint can accept client-side encrypted inference requests.
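
As a rough illustration of the client-side flow, the sketch below encrypts a feature vector with the CKKS scheme (using TenSEAL purely as an illustrative library, not necessarily the one used in the article's container) and sends the ciphertext to a SageMaker endpoint. The endpoint name, payload format, and feature values are assumptions.

    import boto3
    import tenseal as ts  # illustrative CKKS library, not necessarily the article's choice

    # Client side: build an encryption context and encrypt the inputs.
    ctx = ts.context(
        ts.SCHEME_TYPE.CKKS,
        poly_modulus_degree=8192,
        coeff_mod_bit_sizes=[60, 40, 40, 60],
    )
    ctx.global_scale = 2 ** 40
    ctx.generate_galois_keys()

    features = [0.12, 3.4, 1.7, 0.9]              # hypothetical model inputs
    encrypted = ts.ckks_vector(ctx, features)

    # Send the serialized ciphertext to the deployed endpoint (name is a placeholder).
    runtime = boto3.client("sagemaker-runtime")
    response = runtime.invoke_endpoint(
        EndpointName="fhe-sklearn-endpoint",
        ContentType="application/octet-stream",
        Body=encrypted.serialize(),
    )

    # Assuming the endpoint returns a serialized CKKS vector, only the client
    # holding the secret key can decrypt the prediction.
    result = ts.ckks_vector_from(ctx, response["Body"].read())
    print(result.decrypt())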


Trending Sources


Speed ML development using SageMaker Feature Store and Apache Iceberg offline store compaction

AWS Machine Learning

Customers can also access offline store data using a Spark runtime and perform big data processing for ML feature analysis and feature engineering use cases. A table format such as Apache Iceberg abstracts a collection of data files as a single table, and the Iceberg table is created and registered automatically in the AWS Glue Data Catalog.
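
As a hedged sketch of where the Iceberg table format comes in, the snippet below creates a feature group with the SageMaker Python SDK's TableFormatEnum.ICEBERG option; the feature group name, S3 URI, role ARN, and sample DataFrame are hypothetical.

    import pandas as pd
    import sagemaker
    from sagemaker.feature_store.feature_group import FeatureGroup
    from sagemaker.feature_store.inputs import TableFormatEnum

    session = sagemaker.Session()

    # Hypothetical feature data with the required record identifier and event time columns.
    df = pd.DataFrame(
        {
            "record_id": ["song-1", "song-2"],
            "rating": [4.5, 3.0],
            "event_time": [1700000000.0, 1700000100.0],
        }
    )
    df["record_id"] = df["record_id"].astype("string")  # SDK expects 'string', not 'object'

    fg = FeatureGroup(name="songs-feature-group", sagemaker_session=session)  # placeholder name
    fg.load_feature_definitions(data_frame=df)

    # Creating the group with the Iceberg table format registers the offline
    # store table in the AWS Glue Data Catalog automatically.
    fg.create(
        s3_uri="s3://my-offline-store-bucket/feature-store",   # placeholder bucket
        record_identifier_name="record_id",
        event_time_feature_name="event_time",
        role_arn="arn:aws:iam::111122223333:role/SageMakerFeatureStoreRole",  # placeholder role
        enable_online_store=False,
        table_format=TableFormatEnum.ICEBERG,
    )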


Architect defense-in-depth security for generative AI applications using the OWASP Top 10 for LLMs

AWS Machine Learning

Lastly, we connect these elements with an example LLM workload to describe an approach to architecting defense-in-depth security across trust boundaries. Beyond awareness, your teams should take action to account for generative AI in their governance, assurance, and compliance validation practices.


Use RAG for drug discovery with Knowledge Bases for Amazon Bedrock

AWS Machine Learning

Before you can write scripts that use the Amazon Bedrock API, you’ll need to install the appropriate version of the AWS SDK in your environment. Information that identifies you may be shared with doctors responsible for your care or for audits and evaluations by government agencies, but talks and papers about the study will not identify you.
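
As a minimal sketch of the retrieval-augmented flow (assuming a recent AWS SDK that includes the Bedrock APIs), the snippet below issues a retrieve-and-generate query against a knowledge base; the knowledge base ID, model ARN, and question are hypothetical placeholders.

    # Requires a recent AWS SDK, e.g.: pip install --upgrade boto3
    import boto3

    # Hedged sketch of a retrieval-augmented query against a Bedrock knowledge base.
    client = boto3.client("bedrock-agent-runtime")

    response = client.retrieve_and_generate(
        input={"text": "Which adverse events were reported for compound X?"},  # placeholder question
        retrieveAndGenerateConfiguration={
            "type": "KNOWLEDGE_BASE",
            "knowledgeBaseConfiguration": {
                "knowledgeBaseId": "ABCD1234",  # placeholder
                "modelArn": "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-v2",  # placeholder
            },
        },
    )
    print(response["output"]["text"])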


Build repeatable, secure, and extensible end-to-end machine learning workflows using Kubeflow on AWS

AWS Machine Learning

Prior to our adoption of Kubeflow on AWS, our data scientists used a standardized set of tools and a process that allowed flexibility in the technology and workflow used to train a given model. Each project maintained detailed documentation that outlined how each script was used to build the final model.


Four approaches to manage Python packages in Amazon SageMaker Studio notebooks

AWS Machine Learning

A public GitHub repo provides hands-on examples for each of the presented approaches. When you open a notebook in Studio, you are prompted to set up your environment by choosing a SageMaker image, a kernel, an instance type, and, optionally, a lifecycle configuration script that runs on image startup.
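
As a hedged illustration of the lifecycle-configuration approach, the sketch below registers a script that pip-installs packages when a kernel image starts up; the configuration name, package list, and app type are assumptions for the example.

    import base64
    import boto3

    # Hypothetical lifecycle configuration: install packages on kernel image startup.
    script = """#!/bin/bash
    set -eux
    pip install --upgrade scikit-learn xgboost
    """

    sm = boto3.client("sagemaker")
    sm.create_studio_lifecycle_config(
        StudioLifecycleConfigName="install-ml-packages",   # placeholder name
        StudioLifecycleConfigContent=base64.b64encode(script.encode("utf-8")).decode("utf-8"),
        StudioLifecycleConfigAppType="KernelGateway",       # runs when the kernel image starts
    )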