Rapid ML experimentation for enterprises with Amazon SageMaker AI and Comet

This post was written with Sarah Ostermeier from Comet. As enterprise organizations scale their machine learning (ML) initiatives from proof of concept to production, the complexity of managing experiments, tracking model lineage, and ensuring reproducibility grows exponentially. This is primarily because data scientists and ML engineers constantly explore different… A minimal Comet tracking sketch follows this entry.

Shared by AWS Machine Learning September 23, 2025
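
For orientation, here is a minimal sketch of what Comet experiment tracking inside a SageMaker training script might look like. The project and workspace names, hyperparameters, and metric values are illustrative rather than taken from the post; only the comet_ml calls (Experiment, log_parameters, log_metric, end) are standard SDK usage.

```python
# Hypothetical training script packaged as a SageMaker training job.
# COMET_API_KEY is assumed to be set in the job's environment.
from comet_ml import Experiment

experiment = Experiment(project_name="sagemaker-demo", workspace="my-team")  # illustrative names
experiment.log_parameters({"learning_rate": 1e-3, "epochs": 5})

for epoch in range(5):
    train_loss = 1.0 / (epoch + 1)  # stand-in for a real training loop
    experiment.log_metric("train_loss", train_loss, step=epoch)

experiment.end()
```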

Move your AI agents from proof of concept to production with Amazon Bedrock AgentCore

Building an AI agent that can handle a real-life use case in production is a complex undertaking. Although creating a proof of concept demonstrates the potential, moving to production requires addressing scalability, security, observability, and operational concerns that don’t surface in development environments. This post explores how Amazon Bedrock AgentCore…

Shared by AWS Machine Learning September 20, 2025

3 ways to use photo-to-video in Gemini

Here’s how I’ve been using Gemini’s photo-to-video tool as a multimedia storyteller, plus some tips for making your own videos. Original source: blog.google/technology/ai/

Use AWS Deep Learning Containers with Amazon SageMaker AI managed MLflow

Organizations building custom machine learning (ML) models often have specialized requirements that standard platforms can’t accommodate. For example, healthcare companies need specific environments to protect patient data while meeting HIPAA compliance, financial institutions require specific hardware configurations to optimize proprietary trading algorithms, and research teams need flexibility to experiment… A minimal managed MLflow logging sketch follows this entry.

Shared by AWS Machine Learning September 19, 2025
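
As a rough illustration of the managed MLflow piece, the sketch below logs parameters and metrics from inside a custom container to a SageMaker managed MLflow tracking server. The tracking server ARN, experiment name, and logged values are placeholders, and it assumes the sagemaker-mlflow plugin is installed alongside mlflow so the server ARN can be used as a tracking URI.

```python
# Placeholder ARN and names; requires `pip install mlflow sagemaker-mlflow`
# and AWS credentials with access to the tracking server.
import mlflow

mlflow.set_tracking_uri(
    "arn:aws:sagemaker:us-east-1:111122223333:mlflow-tracking-server/example-server"
)
mlflow.set_experiment("dlc-custom-training")  # illustrative experiment name

with mlflow.start_run():
    mlflow.log_param("batch_size", 64)
    mlflow.log_metric("val_accuracy", 0.91)
```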

Build Agentic Workflows with OpenAI GPT OSS on Amazon SageMaker AI and Amazon Bedrock AgentCore

OpenAI has released two open-weight models, gpt-oss-120b (117 billion parameters) and gpt-oss-20b (21 billion parameters), both built with a Mixture of Experts (MoE) design and a 128K context window. These are the leading open-weight models according to Artificial Analysis benchmarks, and they excel at reasoning and agentic workflows. A minimal local-inference sketch for gpt-oss-20b follows this entry.

Shared by AWS Machine Learning September 18, 2025
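
For a quick feel of the smaller model, here is a minimal local-inference sketch using the Hugging Face Transformers pipeline. It is not the SageMaker or AgentCore deployment path the post describes, and it assumes a recent transformers release with gpt-oss support plus a GPU with enough memory for gpt-oss-20b.

```python
# Local sketch only; the prompt is illustrative.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="openai/gpt-oss-20b",
    torch_dtype="auto",
    device_map="auto",
)

messages = [{"role": "user", "content": "Outline a three-step plan for summarizing a long report."}]
result = generator(messages, max_new_tokens=256)
print(result[0]["generated_text"][-1]["content"])  # assistant reply appended to the chat
```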