Hyperparameter optimization for fine-tuning pre-trained transformer models from Hugging Face

Large attention-based transformer models have achieved major gains on natural language processing (NLP) tasks. However, training these gigantic networks from scratch requires a tremendous amount of data and compute. For smaller NLP datasets, a simple yet effective strategy is to fine-tune a pre-trained transformer, usually one trained in an unsupervised fashion.
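The hyperparameter search the post describes can be sketched in miniature. This is a minimal, self-contained random-search loop over a fine-tuning search space; the parameter names (`learning_rate`, `per_device_train_batch_size`, `num_train_epochs`) echo common fine-tuning knobs, and `toy_objective` is a stand-in for what would really be a fine-tuning run that returns a validation metric — both are illustrative assumptions, not the post's actual code.

```python
import random

# Illustrative search space for fine-tuning a pre-trained transformer.
SEARCH_SPACE = {
    "learning_rate": [1e-5, 2e-5, 3e-5, 5e-5],
    "per_device_train_batch_size": [8, 16, 32],
    "num_train_epochs": [2, 3, 4],
}

def sample_config(rng):
    """Draw one hyperparameter configuration uniformly at random."""
    return {name: rng.choice(values) for name, values in SEARCH_SPACE.items()}

def random_search(objective, n_trials=10, seed=0):
    """Evaluate n_trials random configs; return the best (score, config)."""
    rng = random.Random(seed)
    best = None
    for _ in range(n_trials):
        cfg = sample_config(rng)
        score = objective(cfg)  # in practice: fine-tune, return validation metric
        if best is None or score > best[0]:
            best = (score, cfg)
    return best

def toy_objective(cfg):
    # Stand-in for a real fine-tuning run: a real objective would train the
    # model with cfg and return validation accuracy. This one just peaks
    # near a learning rate of 2e-5 and a small batch size.
    return -abs(cfg["learning_rate"] - 2e-5) - 1e-3 * cfg["per_device_train_batch_size"]

best_score, best_cfg = random_search(toy_objective, n_trials=20)
print(best_score, best_cfg)
```

In practice the same loop shape is what tools like SageMaker's hyperparameter tuning or `Trainer.hyperparameter_search` automate, with smarter samplers (Bayesian optimization, Hyperband) in place of uniform sampling.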

Shared by AWS Machine Learning June 29, 2022

Deep demand forecasting with Amazon SageMaker

Every business needs the ability to predict the future accurately in order to make better decisions and gain a competitive advantage. With historical data, businesses can understand trends, predict what might happen and when, and incorporate that information into their future plans, from product demand

Shared by AWS Machine Learning June 28, 2022