Open Source AI Definition – Weekly update September 2nd

Share your thoughts about draft v0.0.9: @mkai added concerns about how OSI will address AI-generated content from both open and closed source models, given current legal rulings that such content cannot be copyrighted. He also suggests clarifying the difference between licenses for AI model parameters and the model itself.

Shared by voicesofopensource September 2, 2024

Implementing advanced prompt engineering with Amazon Bedrock

Despite the ability of generative artificial intelligence (AI) to mimic human behavior, it often requires detailed instructions to generate high-quality and relevant content. Prompt engineering is the process of crafting these inputs, called prompts, that guide foundation models (FMs) and large language models (LLMs) to produce desired outputs. … (A minimal prompt sketch follows this entry.)

Shared by AWS Machine Learning August 31, 2024
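
The entry above defines prompt engineering as crafting inputs that steer a foundation model toward a desired output. As a minimal sketch of that idea (not code from the article), the snippet below sends a system instruction plus a user prompt to a Bedrock model through the boto3 Converse API; the region, model ID, and prompts are placeholder assumptions.

```python
import boto3

# Bedrock Runtime client; the region is an assumption -- use the one that hosts your models.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

# Prompt engineering at its simplest: a system instruction that constrains tone and
# format, plus a user prompt that states the task explicitly.
system_prompt = [{"text": "You are a concise assistant. Answer in at most three bullet points."}]
user_message = {
    "role": "user",
    "content": [{"text": "Explain why prompt engineering matters when working with foundation models."}],
}

# The model ID is a placeholder; any Bedrock text model that supports Converse would work.
response = client.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",
    system=system_prompt,
    messages=[user_message],
    inferenceConfig={"maxTokens": 300, "temperature": 0.2},
)

print(response["output"]["message"]["content"][0]["text"])
```

Tightening the system instruction (persona, output length, format) is usually the first lever to pull before adjusting decoding parameters such as temperature.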

Best practices for prompt engineering with Meta Llama 3 for Text-to-SQL use cases

With the rapid growth of generative artificial intelligence (AI), many AWS customers are looking to take advantage of publicly available foundation models (FMs) and technologies. This includes Meta Llama 3, Meta’s publicly available large language model (LLM). The partnership between Meta and Amazon signifies collective generative AI innovation … (A sketch of the Text-to-SQL prompt pattern follows this entry.)

Shared by AWS Machine Learning August 31, 2024
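
The article above covers Text-to-SQL prompting with Meta Llama 3; its actual prompts are not reproduced in the excerpt, so the sketch below only illustrates the general pattern under stated assumptions: the schema and the rules go into the prompt, the question follows, and both the `orders` table and the Bedrock model ID are placeholders you would replace with your own.

```python
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

# A common Text-to-SQL pattern: give the model the schema and the rules, then the question.
# The table definition below is hypothetical.
schema = """CREATE TABLE orders (
    order_id INT,
    customer_id INT,
    order_date DATE,
    total_amount DECIMAL(10, 2)
);"""

question = "What was the total revenue per customer in August 2024?"

prompt = (
    "You translate natural-language questions into SQL.\n"
    f"Schema:\n{schema}\n"
    "Rules: return a single SQL query and nothing else.\n"
    f"Question: {question}"
)

response = client.converse(
    modelId="meta.llama3-70b-instruct-v1:0",
    messages=[{"role": "user", "content": [{"text": prompt}]}],
    inferenceConfig={"maxTokens": 256, "temperature": 0.0},
)

print(response["output"]["message"]["content"][0]["text"])
```

Keeping temperature at or near zero and constraining the output to SQL only are the usual starting points for this use case.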

Provide a personalized experience for news readers using Amazon Personalize and Amazon Titan Text Embeddings on Amazon Bedrock

News publishers want to provide a personalized and informative experience to their readers, but the short shelf life of news articles can make this quite difficult. In news publishing, articles typically reach peak readership on the day they are published. Additionally, news publishers frequently publish new articles and want … (A sketch of the embedding step follows this entry.)

Shared by AWS Machine Learning August 30, 2024
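
The entry above pairs Amazon Personalize with Amazon Titan Text Embeddings so that items too new to have interaction data can still be recommended. As a minimal sketch of the embedding step only (the Personalize wiring is out of scope here, and the region, model ID, and article text are assumptions), a fresh article can be embedded with Titan Text Embeddings on Bedrock like this:

```python
import json

import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

# Embed a freshly published article so it can be matched to reader interests
# before any click history exists. Text and model ID are illustrative.
article_text = "City council approves a new transit plan aimed at cutting commute times."

response = client.invoke_model(
    modelId="amazon.titan-embed-text-v2:0",
    body=json.dumps({"inputText": article_text}),
)

embedding = json.loads(response["body"].read())["embedding"]
print(len(embedding), embedding[:5])
```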

Accelerate Generative AI Inference with NVIDIA NIM Microservices on Amazon SageMaker

This post is co-written with Eliuth Triana, Abhishek Sawarkar, Jiahong Liu, Kshitiz Gupta, JR Morgan, and Deepika Padmanabhan from NVIDIA. At the 2024 NVIDIA GTC conference, we announced support for NVIDIA NIM Inference Microservices in Amazon SageMaker Inference. This integration allows you to deploy industry-leading large language models (LLMs) on Amazon SageMaker … (A sketch of the deployment flow follows this entry.)

Shared by AWS Machine Learning August 30, 2024
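
The entry above announces NVIDIA NIM Inference Microservices support in Amazon SageMaker Inference. The sketch below shows only the generic SageMaker bring-your-own-container deployment flow, not the article's exact steps: the NIM image URI, IAM role, environment variable, and instance type are all placeholders.

```python
import sagemaker
from sagemaker.model import Model

session = sagemaker.Session()

# All identifiers below are placeholders: the NIM container image comes from
# NVIDIA's registry, and the environment variable name is hypothetical.
nim_model = Model(
    image_uri="<nim-container-image-uri>",
    role="arn:aws:iam::<account-id>:role/<sagemaker-execution-role>",
    env={"NIM_MODEL_NAME": "<model-name>"},  # hypothetical configuration key
    sagemaker_session=session,
)

# GPU instance sizing depends on the model being served.
predictor = nim_model.deploy(
    initial_instance_count=1,
    instance_type="ml.g5.2xlarge",
    endpoint_name="nim-llm-endpoint",
)

print(predictor.endpoint_name)
```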

Connect the Amazon Q Business generative AI coding companion to your GitHub repositories with Amazon Q GitHub (Cloud) connector

Incorporating generative artificial intelligence (AI) into your development lifecycle can offer several benefits. For example, using an AI-based coding companion such as Amazon Q Developer can boost development productivity by up to 30 percent. Additionally, reducing the developer context switching that stems from frequent interactions with many different development …

Shared by AWS Machine Learning August 29, 2024