Machine learning (ML) models need regular updates to improve performance, but retraining a model poses risks, such as the loss of backward compatibility or ...
Earlier this year, we released MASSIVE, a million-record natural-language-understanding (NLU) dataset composed of human-translated utterances spanning 51 ...
A quick guide to Amazon’s innovative work at the IEEE Spoken Language Technology Workshop (SLT), which begins next week: Accelerator-aware training for ...
Knowledge distillation is a popular technique for compressing large machine learning models into manageable sizes, to make them suitable for low-latency ...
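As a rough illustration of the technique mentioned above, knowledge distillation typically trains the small (student) model against a blend of two targets: the ground-truth labels and the large (teacher) model's temperature-softened output distribution. The sketch below is a minimal, generic version of that loss in NumPy; the function names, the temperature `T=2.0`, and the mixing weight `alpha=0.5` are illustrative assumptions, not details from the article.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T produces a softer distribution,
    # exposing the teacher's relative confidence across wrong classes.
    z = np.asarray(logits, dtype=float) / T
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Soft term: cross-entropy between the teacher's and student's
    # temperature-softened outputs (scaled by T^2, a common convention
    # so gradients keep comparable magnitude across temperatures).
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    soft = -np.sum(p_teacher * np.log(p_student), axis=-1).mean() * T * T
    # Hard term: ordinary cross-entropy against the ground-truth labels.
    p = softmax(student_logits)
    hard = -np.log(p[np.arange(len(labels)), labels]).mean()
    # Blend the two objectives; alpha weights imitation vs. label fit.
    return alpha * soft + (1 - alpha) * hard
```

A student whose logits track the teacher's and favor the correct label incurs a lower combined loss, which is the signal that drives the compressed model toward the large model's behavior.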
The machine learning models that power conversational agents like Alexa are typically trained on labeled data, but data collection and labeling are ...
In October of 2021, Amazon and MIT announced the establishment of the Science Hub. That collaboration, which aims to ensure the benefits of AI and robotics ...
Amazon today announced that a team from the University of Michigan has won the Alexa Prize SimBot Challenge. The SimBot Challenge's goal is to advance the ...
Put your hand up if you enjoy using your TV remote to type in the name of the show you want to watch next. Who doesn’t love shuffling the highlighted box ...
To hear Shrikanth Narayanan describe it, every single human conversation is a feat of engineering — a complex system for creating and interpreting a ...
Between the main conference and the recently inaugurated ACL Proceedings, Amazon researchers have more than 65 papers at this year's meeting of the ...