Building a virtual meteorologist using Amazon Bedrock Agents

The integration of generative AI capabilities is driving transformative changes across many industries. Although weather information is accessible through multiple channels, businesses that rely heavily on meteorological data need robust, scalable solutions to manage and use these critical insights effectively while reducing manual processes. This solution demonstrates how …
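
As a minimal sketch of what calling such an agent can look like (the agent ID, alias ID, and prompt below are placeholders, not values from the post), a deployed Bedrock agent is invoked through the bedrock-agent-runtime client:

```python
import uuid
import boto3

# Runtime client for invoking Amazon Bedrock Agents
client = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

# Placeholder identifiers -- substitute the agent built for the solution
AGENT_ID = "AGENT123456"        # hypothetical agent ID
AGENT_ALIAS_ID = "ALIAS123456"  # hypothetical alias ID

response = client.invoke_agent(
    agentId=AGENT_ID,
    agentAliasId=AGENT_ALIAS_ID,
    sessionId=str(uuid.uuid4()),  # one session ID per conversation
    inputText="What will the weather be like in Seattle tomorrow afternoon?",
)

# The completion comes back as an event stream of text chunks
answer = ""
for event in response["completion"]:
    chunk = event.get("chunk")
    if chunk:
        answer += chunk["bytes"].decode("utf-8")
print(answer)
```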

Shared by AWS Machine Learning February 12, 2025

Meta SAM 2.1 is now available in Amazon SageMaker JumpStart

This blog post is co-written with George Orlin from Meta. Today, we are excited to announce that Meta’s Segment Anything Model (SAM) 2.1 vision segmentation model is publicly available through Amazon SageMaker JumpStart for deployment and inference. Meta SAM 2.1 provides state-of-the-art video and image segmentation capabilities in …
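
As a rough sketch of the JumpStart deployment flow described here (the model ID, instance type, and request payload below are assumptions rather than details from the announcement), the SageMaker Python SDK can deploy a JumpStart model and query the resulting endpoint:

```python
from sagemaker.jumpstart.model import JumpStartModel

# Placeholder JumpStart model ID for Meta SAM 2.1 -- check the JumpStart
# catalog for the exact identifier before deploying.
MODEL_ID = "meta-vs-sam-2-1"  # hypothetical

# Deploy the pre-trained model to a real-time endpoint (instance type is an assumption)
model = JumpStartModel(model_id=MODEL_ID)
predictor = model.deploy(
    initial_instance_count=1,
    instance_type="ml.g5.2xlarge",
)

# Run inference; the payload schema depends on the model's input contract,
# so this request body is illustrative only.
payload = {"image": "<base64-encoded image>", "prompt": {"points": [[450, 300]]}}
response = predictor.predict(payload)
print(response)

# Clean up the endpoint when finished
predictor.delete_endpoint()
```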

Shared by AWS Machine Learning February 12, 2025

From concept to reality: Navigating the journey of RAG from proof of concept to production

Generative AI has emerged as a transformative force, captivating industries with its potential to create, innovate, and solve complex problems. However, the journey from a proof of concept to a production-ready application comes with challenges and opportunities. Moving from proof of concept to production is about creating scalable, reliable, …

Shared by AWS Machine Learning February 12, 2025

LLM-as-a-judge on Amazon Bedrock Model Evaluation

The evaluation of large language model (LLM) performance, particularly in response to a variety of prompts, is crucial for organizations aiming to harness the full potential of this rapidly evolving technology. The introduction of an LLM-as-a-judge framework represents a significant step forward in simplifying and streamlining the model evaluation …
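
The managed flow runs through Amazon Bedrock Model Evaluation jobs; as a minimal, self-contained illustration of the underlying LLM-as-a-judge pattern (the judge model ID, prompt, and grading rubric below are assumptions, not details from the post), a judge model can grade a candidate response via the Bedrock Converse API:

```python
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# Hypothetical judge model; any sufficiently capable Bedrock model could serve here.
JUDGE_MODEL_ID = "anthropic.claude-3-5-sonnet-20240620-v1:0"

prompt = "Summarize the causes of the 2008 financial crisis in two sentences."
candidate = (
    "Banks issued risky mortgages that were bundled into opaque securities; "
    "when defaults rose, losses cascaded through the financial system."
)

judge_instructions = (
    "You are grading a model response. Score it from 1 to 5 for correctness, "
    "completeness, and conciseness, then briefly justify the score.\n\n"
    f"Prompt: {prompt}\n\nResponse to grade: {candidate}"
)

result = bedrock.converse(
    modelId=JUDGE_MODEL_ID,
    messages=[{"role": "user", "content": [{"text": judge_instructions}]}],
    inferenceConfig={"maxTokens": 300, "temperature": 0.0},
)
print(result["output"]["message"]["content"][0]["text"])
```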

Shared by AWS Machine Learning February 12, 2025

Fine-tune LLMs with synthetic data for context-based Q&A using Amazon Bedrock

There’s a growing demand from customers to incorporate generative AI into their businesses. Many use cases rely on pre-trained large language models (LLMs) through approaches like Retrieval Augmented Generation (RAG). However, for advanced, domain-specific tasks or those requiring specific formats, model customization techniques such as fine-tuning are sometimes necessary.
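
As a hedged sketch of what submitting such a fine-tuning job can look like (the base model, S3 paths, role ARN, and hyperparameters below are placeholders, not values from the post), Amazon Bedrock exposes model customization through create_model_customization_job:

```python
import boto3

bedrock = boto3.client("bedrock", region_name="us-east-1")

# All identifiers below are placeholders for illustration only.
response = bedrock.create_model_customization_job(
    jobName="qa-finetune-demo",
    customModelName="qa-finetuned-model",
    roleArn="arn:aws:iam::111122223333:role/BedrockCustomizationRole",
    baseModelIdentifier="amazon.titan-text-express-v1",
    customizationType="FINE_TUNING",
    # Training data is JSONL with prompt/completion pairs, e.g. generated synthetically
    trainingDataConfig={"s3Uri": "s3://my-bucket/synthetic-qa/train.jsonl"},
    outputDataConfig={"s3Uri": "s3://my-bucket/synthetic-qa/output/"},
    hyperParameters={
        "epochCount": "2",
        "batchSize": "8",
        "learningRate": "0.00005",
    },
)
print(response["jobArn"])
```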

Shared by AWS Machine Learning February 12, 2025