
How Amp on Amazon used data to increase customer engagement, Part 1: Building a data analytics platform

AWS Machine Learning

Amp wanted a scalable data and analytics platform to enable easy access to data, to run machine learning (ML) experiments for live audio transcription, content moderation, feature engineering, and a personalized show recommendation service, and to inspect and measure business KPIs and metrics.


Live Meeting Assistant with Amazon Transcribe, Amazon Bedrock, and Knowledge Bases for Amazon Bedrock

AWS Machine Learning

Once the user is authenticated, the web application establishes a secure GraphQL connection to the AWS AppSync API and subscribes to real-time events: new calls and call status changes for the meetings list page, and new or updated transcription segments and computed analytics for the meeting details page.
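Below is a minimal sketch of what such a real-time subscription can look like from a client. It assumes the open-source gql Python client with its AppSync websockets transport and API-key authentication; the endpoint, API key, and the onAddTranscriptSegment subscription with its fields are hypothetical stand-ins rather than names from the article (the article's actual client is the web application itself).

```python
import asyncio
from urllib.parse import urlparse

from gql import Client, gql
from gql.transport.appsync_auth import AppSyncApiKeyAuthentication
from gql.transport.appsync_websockets import AppSyncWebsocketsTransport

# Hypothetical AppSync endpoint and API key; a real deployment would more
# likely use Cognito or IAM authentication.
URL = "https://example1234567890.appsync-api.us-east-1.amazonaws.com/graphql"
API_KEY = "da2-examplexxxxxxxxxxxxxxxxxx"

# Hypothetical subscription: operation, argument, and field names are
# illustrative only.
SUBSCRIPTION = gql(
    """
    subscription OnTranscriptSegment($callId: ID!) {
      onAddTranscriptSegment(CallId: $callId) {
        SegmentId
        Transcript
        IsPartial
      }
    }
    """
)

async def listen(call_id: str) -> None:
    host = str(urlparse(URL).netloc)
    auth = AppSyncApiKeyAuthentication(host=host, api_key=API_KEY)
    transport = AppSyncWebsocketsTransport(url=URL, auth=auth)
    async with Client(transport=transport) as session:
        # Each yielded result is a real-time event pushed by AppSync.
        async for event in session.subscribe(
            SUBSCRIPTION, variable_values={"callId": call_id}
        ):
            print(event)

if __name__ == "__main__":
    asyncio.run(listen("demo-call-id"))
```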


How Amp on Amazon used data to increase customer engagement, Part 2: Building a personalized show recommendation platform using Amazon SageMaker

AWS Machine Learning

This is Part 2 of a series on using data analytics and ML for Amp to build a personalized show recommendation platform. Since its launch in May 2022, the platform has shown a 3% boost in the customer engagement metrics tracked (liking a show, following a creator, enabling upcoming show notifications).


Speed ML development using SageMaker Feature Store and Apache Iceberg offline store compaction

AWS Machine Learning

Apache Iceberg is an open table format for very large analytic datasets. It manages large collections of files as tables, and it supports modern analytical data lake operations such as record-level insert, update, delete, and time travel queries. Feature Store's put_record API is used to ingest individual records or to handle streaming sources.
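As a quick illustration of that ingestion path, here is a hedged boto3 sketch of calling put_record against the SageMaker Feature Store runtime. It assumes a feature group already exists; the group name, feature names, and values are made up for the example.

```python
import time
import boto3

featurestore_runtime = boto3.client("sagemaker-featurestore-runtime")

# Hypothetical record: the feature group is assumed to define
# "customer_id" as its record identifier and "event_time" (Fractional,
# Unix seconds) as its event time feature. All values are passed as strings.
record = [
    {"FeatureName": "customer_id", "ValueAsString": "C1042"},
    {"FeatureName": "event_time", "ValueAsString": str(int(time.time()))},
    {"FeatureName": "shows_followed_7d", "ValueAsString": "3"},
    {"FeatureName": "minutes_listened_7d", "ValueAsString": "187.5"},
]

# put_record writes to the online store and, when the offline store is
# enabled, the record is also replicated to the offline (Iceberg) tables.
featurestore_runtime.put_record(
    FeatureGroupName="customer-engagement-features",  # hypothetical name
    Record=record,
)
```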


Face-off Probability, part of NHL Edge IQ: Predicting face-off winners in real time during televised games

AWS Machine Learning

The decision tree provided the cut-offs for each metric, which we included as rules-based logic in the streaming application. In the end, we found that the LightGBM model worked best, with well-calibrated accuracy metrics. The second important component of the architecture is Amazon Kinesis Data Analytics for Apache Flink.
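A small sketch of the modeling idea: train a LightGBM classifier and check how well its predicted probabilities are calibrated, which matters when the product surfaces the win probability itself rather than a hard label. The synthetic features and hyperparameters below are illustrative only, not the production face-off model.

```python
import numpy as np
from lightgbm import LGBMClassifier
from sklearn.calibration import calibration_curve
from sklearn.metrics import brier_score_loss
from sklearn.model_selection import train_test_split

# Synthetic stand-in for face-off features (real features such as player
# face-off history are not reproduced here).
rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 8))
logits = (X @ rng.normal(size=8)) * 0.7
y = (rng.random(5000) < 1 / (1 + np.exp(-logits))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

model = LGBMClassifier(n_estimators=200, learning_rate=0.05)
model.fit(X_train, y_train)

# Evaluate calibration: a low Brier score and predicted probabilities that
# track observed win rates indicate a well-calibrated model.
proba = model.predict_proba(X_test)[:, 1]
print("Brier score:", brier_score_loss(y_test, proba))
frac_pos, mean_pred = calibration_curve(y_test, proba, n_bins=10)
for p, f in zip(mean_pred, frac_pos):
    print(f"predicted {p:.2f} -> observed {f:.2f}")
```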


FMOps/LLMOps: Operationalize generative AI and differences with MLOps

AWS Machine Learning

These teams are as follows: Advanced analytics team (data lake and data mesh) – Data engineers are responsible for preparing and ingesting data from multiple sources, building ETL (extract, transform, and load) pipelines to curate and catalog the data, and preparing the necessary historical data for the ML use cases.
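As a toy illustration of that curation step, here is a small pandas-based ETL sketch: extract a raw frame, enforce types and drop bad rows, then load the result as partitioned Parquet. Column names and paths are hypothetical; a production pipeline on AWS would more likely run on Glue or EMR and register the output in a data catalog.

```python
import pandas as pd

# Extract: hypothetical raw events; in practice this would come from an
# operational database or an S3 landing zone.
raw = pd.DataFrame(
    {
        "event_id": [1, 2, 3],
        "event_ts": ["2022-05-01T10:00:00Z", "2022-05-01T11:30:00Z", None],
        "show_id": ["s-001", "s-002", "s-001"],
        "minutes_listened": ["12.5", "bad_value", "47.0"],
    }
)

# Transform: enforce types, drop rows that cannot be parsed, and derive a
# partition column so downstream queries can prune by date.
curated = raw.assign(
    event_ts=pd.to_datetime(raw["event_ts"], errors="coerce", utc=True),
    minutes_listened=pd.to_numeric(raw["minutes_listened"], errors="coerce"),
).dropna(subset=["event_ts", "minutes_listened"])
curated["event_date"] = curated["event_ts"].dt.date.astype(str)

# Load: write partitioned Parquet. Writing to s3://... additionally needs
# s3fs; cataloging would be handled separately (for example, a Glue crawler).
curated.to_parquet(
    "curated/listening_events", partition_cols=["event_date"], index=False
)
```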


Architect defense-in-depth security for generative AI applications using the OWASP Top 10 for LLMs

AWS Machine Learning

Consider inserting AWS Web Application Firewall (AWS WAF) in front to protect web applications and APIs from malicious bots, SQL injection attacks, cross-site scripting (XSS), and account takeovers with Fraud Control.
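A hedged boto3 sketch of what such a WAF layer can look like: a WAFv2 web ACL that attaches AWS managed rule groups covering the threats listed above. The ACL name is invented, and Bot Control and Account Takeover Prevention (Fraud Control) are only noted in comments because they need extra configuration and carry additional cost.

```python
import boto3

wafv2 = boto3.client("wafv2", region_name="us-east-1")

def managed_rule(name: str, priority: int) -> dict:
    """Build a rule entry that delegates to an AWS managed rule group."""
    return {
        "Name": name,
        "Priority": priority,
        "OverrideAction": {"None": {}},
        "Statement": {
            "ManagedRuleGroupStatement": {"VendorName": "AWS", "Name": name}
        },
        "VisibilityConfig": {
            "SampledRequestsEnabled": True,
            "CloudWatchMetricsEnabled": True,
            "MetricName": name,
        },
    }

wafv2.create_web_acl(
    Name="genai-app-web-acl",   # hypothetical name
    Scope="REGIONAL",           # use "CLOUDFRONT" for a CloudFront distribution
    DefaultAction={"Allow": {}},
    Rules=[
        managed_rule("AWSManagedRulesCommonRuleSet", 0),  # XSS and common exploits
        managed_rule("AWSManagedRulesSQLiRuleSet", 1),    # SQL injection
        # Add AWSManagedRulesBotControlRuleSet and AWSManagedRulesATPRuleSet
        # for bot and account-takeover protection; both require additional
        # ManagedRuleGroupConfigs settings not shown here.
    ],
    VisibilityConfig={
        "SampledRequestsEnabled": True,
        "CloudWatchMetricsEnabled": True,
        "MetricName": "genai-app-web-acl",
    },
)
```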