Integrate SaaS platforms with Amazon SageMaker to enable ML-powered applications

AWS Machine Learning

A number of AWS independent software vendor (ISV) partners have already built integrations that let users of their software as a service (SaaS) platforms use SageMaker and its features, including training, deployment, and the model registry.
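
As a rough illustration of the deployment side of such an integration, the sketch below shows a SaaS backend calling a model hosted on a SageMaker real-time endpoint through the boto3 runtime client. The endpoint name and payload schema are hypothetical.

```python
import json
import boto3

# Minimal sketch: a SaaS backend calling a model deployed to a SageMaker
# real-time endpoint. The endpoint name and request schema are assumptions,
# not taken from any specific ISV integration.
runtime = boto3.client("sagemaker-runtime")

payload = {"features": [3.2, 1.1, 0.7]}  # example request from the SaaS application

response = runtime.invoke_endpoint(
    EndpointName="saas-tenant-churn-model",  # assumed endpoint name
    ContentType="application/json",
    Body=json.dumps(payload),
)

prediction = json.loads(response["Body"].read())
print(prediction)
```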

How Vericast optimized feature engineering using Amazon SageMaker Processing

AWS Machine Learning

One aspect of this data preparation is feature engineering: the process of identifying, selecting, and manipulating relevant variables to transform raw data into forms better suited to the ML algorithm used to train a model and run inference against it.
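
For a concrete sense of what such a transformation looks like, here is a minimal pandas sketch of feature engineering over a raw transactions table. The column names and derived features are illustrative, not taken from the Vericast pipeline.

```python
import pandas as pd

# Minimal sketch of feature engineering on a hypothetical raw transactions table.
# Column names (customer_id, amount, timestamp) are illustrative only.
raw = pd.read_csv("transactions.csv", parse_dates=["timestamp"])

features = (
    raw.assign(hour=raw["timestamp"].dt.hour)
       .groupby("customer_id")
       .agg(
           total_spend=("amount", "sum"),
           avg_spend=("amount", "mean"),
           txn_count=("amount", "count"),
           night_txn_ratio=("hour", lambda h: (h < 6).mean()),
       )
       .reset_index()
)

# The resulting per-customer features would feed model training or inference.
features.to_csv("features.csv", index=False)
```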

Simplify iterative machine learning model development by adding features to existing feature groups in Amazon SageMaker Feature Store

AWS Machine Learning

Feature engineering is one of the most challenging aspects of the machine learning (ML) lifecycle and the phase where the most time is spent: data scientists and ML engineers devote 60–70% of their time to it. They need a simple way to add features to an existing feature group.
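
For reference, adding features to an existing feature group maps to the SageMaker UpdateFeatureGroup API. A minimal boto3 sketch, with an assumed feature group name and illustrative feature definitions:

```python
import boto3

# Minimal sketch, assuming an existing feature group named "customers".
# The new feature names and types are illustrative.
sagemaker = boto3.client("sagemaker")

sagemaker.update_feature_group(
    FeatureGroupName="customers",
    FeatureAdditions=[
        {"FeatureName": "lifetime_value", "FeatureType": "Fractional"},
        {"FeatureName": "loyalty_tier", "FeatureType": "String"},
    ],
)

# The update is asynchronous; describing the feature group shows when the
# new features become available for ingestion.
desc = sagemaker.describe_feature_group(FeatureGroupName="customers")
print(desc["FeatureDefinitions"])
```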

InformedIQ automates verifications for Origence’s auto lending using machine learning

AWS Machine Learning

To classify documents and extract the information needed to validate them against a set of configurable funding rules, Informed uses a combination of proprietary rules and heuristics, text-based neural networks, image-based deep neural networks (including Amazon Textract OCR via the DetectDocumentText API), and other statistical models.
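
For context, the DetectDocumentText API referenced above can be called through boto3 roughly as follows; the bucket and document names are placeholders.

```python
import boto3

# Minimal sketch of the Amazon Textract DetectDocumentText API.
# The S3 bucket and object key are placeholders.
textract = boto3.client("textract")

response = textract.detect_document_text(
    Document={"S3Object": {"Bucket": "example-loan-docs", "Name": "paystub.png"}}
)

# Collect the OCR'd lines; downstream rules and models would consume these.
lines = [block["Text"] for block in response["Blocks"] if block["BlockType"] == "LINE"]
print("\n".join(lines))
```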

Architect defense-in-depth security for generative AI applications using the OWASP Top 10 for LLMs

AWS Machine Learning

The goal of this post is to give AI and machine learning (ML) engineers, data scientists, solutions architects, security teams, and other stakeholders a common mental model and framework for applying security best practices, so AI/ML teams can move fast without trading off security for speed.

Boomi uses BYOC on Amazon SageMaker Studio to scale custom Markov chain implementation

AWS Machine Learning

Boomi is an enterprise-level software as a service (SaaS) independent software vendor (ISV) that creates developer enablement tooling for software engineers. These tools integrate via API into Boomi’s core service offering. Markov chains are best known for their applications in web crawling and search engines.
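
As a quick illustration of the underlying concept, the sketch below builds a small Markov chain over three hypothetical web pages and iterates its transition matrix toward a stationary distribution, the idea behind PageRank-style ranking. It is not Boomi's implementation.

```python
import numpy as np

# Minimal Markov chain sketch: a row-stochastic transition matrix over three
# hypothetical pages. Each entry P[i, j] is the probability of moving from
# page i to page j.
P = np.array([
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
    [0.5, 0.3, 0.2],
])

dist = np.array([1.0, 0.0, 0.0])   # start on page 0
for _ in range(100):               # repeated transitions converge to the stationary distribution
    dist = dist @ P

print(dist)  # long-run fraction of time spent on each page
```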

Build a medical imaging AI inference pipeline with MONAI Deploy on AWS

AWS Machine Learning

We have developed a MONAI Deploy connector to AWS HealthImaging (AHI) that integrates medical imaging AI applications with subsecond image retrieval latencies at scale, powered by cloud-native APIs. AHI provides API access to ImageSet metadata and ImageFrames; these two access paths cover the majority of near-real-time medical imaging inference pipeline use cases.
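
A rough sketch of those two access patterns using the boto3 medical-imaging client is shown below. The datastore, ImageSet, and ImageFrame identifiers are placeholders, and the exact response handling (for example, the gzip-compressed metadata blob) should be checked against the current AHI documentation.

```python
import gzip
import json
import boto3

# Sketch of the two AHI access patterns: ImageSet metadata and ImageFrames.
# All identifiers below are placeholders.
ahi = boto3.client("medical-imaging")

DATASTORE_ID = "1234567890abcdef"
IMAGE_SET_ID = "abcdef1234567890"

# 1) ImageSet metadata, returned as a blob (typically gzip-compressed JSON).
meta_resp = ahi.get_image_set_metadata(datastoreId=DATASTORE_ID, imageSetId=IMAGE_SET_ID)
metadata = json.loads(gzip.decompress(meta_resp["imageSetMetadataBlob"].read()))

# 2) An ImageFrame, referenced by a frame ID found in the metadata.
frame_id = "0011223344556677"  # placeholder; taken from the metadata in practice
frame_resp = ahi.get_image_frame(
    datastoreId=DATASTORE_ID,
    imageSetId=IMAGE_SET_ID,
    imageFrameInformation={"imageFrameId": frame_id},
)
pixel_blob = frame_resp["imageFrameBlob"].read()  # encoded frame to decode downstream
```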