Explore advanced techniques for hyperparameter optimization with Amazon SageMaker Automatic Model Tuning

Creating high-performance machine learning (ML) solutions relies on exploring and optimizing training parameters, also known as hyperparameters. Hyperparameters are the knobs and levers we use to adjust the training process, such as learning rate, batch size, regularization strength, and others, depending on the specific model and task at hand.

Shared by AWS Machine Learning November 11, 2023
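The tuning loop that SageMaker Automatic Model Tuning automates can be illustrated with a plain-Python random search. This is a minimal sketch, not the SageMaker API: the `objective` function is a hypothetical stand-in for a real training job that would report a validation metric, and the search ranges are illustrative assumptions.

```python
import math
import random

# Toy stand-in for a training job: returns a "validation loss" that is
# smallest near lr=1e-2 and batch_size=64. A real tuning job would launch
# actual training and report back a metric instead.
def objective(lr, batch_size):
    return (math.log10(lr) + 2) ** 2 + (math.log2(batch_size) - 6) ** 2 / 10

def random_search(n_trials=50, seed=0):
    rng = random.Random(seed)
    best = None
    for _ in range(n_trials):
        # Sample the learning rate log-uniformly, as tuners typically do
        # for scale-sensitive hyperparameters.
        lr = 10 ** rng.uniform(-5, -1)
        batch_size = rng.choice([16, 32, 64, 128, 256])
        loss = objective(lr, batch_size)
        if best is None or loss < best["loss"]:
            best = {"lr": lr, "batch_size": batch_size, "loss": loss}
    return best

best = random_search()
print(best)
```

Managed tuners such as SageMaker AMT layer smarter strategies (Bayesian optimization, Hyperband) and parallel trial execution on top of this same propose-train-compare loop.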

Customizing coding companions for organizations

Generative AI models for coding companions are mostly trained on publicly available source code and natural language text. While the large size of the training corpus enables the models to generate code for commonly used functionality, these models are unaware of code in private repositories and the associated coding…

Shared by AWS Machine Learning November 10, 2023

Promote pipelines in a multi-environment setup using Amazon SageMaker Model Registry, HashiCorp Terraform, GitHub, and Jenkins CI/CD

Building out a machine learning operations (MLOps) platform is essential for organizations in the rapidly evolving landscape of artificial intelligence (AI) and machine learning (ML): it seamlessly bridges the gap between data science experimentation and deployment while meeting requirements around model performance, security, and compliance.

Shared by AWS Machine Learning November 10, 2023

Optimize for sustainability with Amazon CodeWhisperer

This post explores how Amazon CodeWhisperer can help with code optimization for sustainability through increased resource efficiency. Computationally resource-efficient coding is one technique that aims to reduce the energy required to process a line of code and, as a result, help companies consume less energy overall.

Shared by AWS Machine Learning November 9, 2023

Harnessing the power of enterprise data with generative AI: Insights from Amazon Kendra, LangChain, and large language models

Large language models (LLMs), with their broad knowledge, can generate human-like text on almost any topic. However, their training on massive datasets also limits their usefulness for specialized tasks. Without continued learning, these models remain oblivious to new data and trends that emerge after their initial training.

Shared by AWS Machine Learning November 8, 2023
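The retrieval pattern the post describes can be sketched in a few lines: fetch the passages most relevant to a question and prepend them to the prompt. This toy illustration substitutes simple keyword overlap for Amazon Kendra's retrieval and a prompt template for a real LLM call; the documents and scoring function are invented for the example.

```python
# Hypothetical enterprise passages standing in for an indexed corpus.
DOCUMENTS = [
    "Expense reports must be filed within 30 days of travel.",
    "The VPN portal requires multi-factor authentication.",
    "New hires complete security training in their first week.",
]

def score(query, doc):
    # Toy relevance: fraction of query words that appear in the document.
    q = set(query.lower().rstrip("?").split())
    d = set(doc.lower().rstrip(".").split())
    return len(q & d) / len(q)

def retrieve(query, docs, k=1):
    # Return the k highest-scoring passages.
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query, docs):
    # Prepend retrieved context so the model answers from enterprise data
    # rather than from its (possibly stale) training corpus.
    context = "\n".join(retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

prompt = build_prompt("When are expense reports due?", DOCUMENTS)
print(prompt)
```

In a production setup, Kendra (or another retriever) replaces `retrieve`, LangChain wires the pieces together, and the assembled prompt is sent to the LLM, which grounds its answer in the retrieved context.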