Customer interactions with Alexa are constantly growing more complex, and on the Alexa science team, we strive to stay ahead of the curve by continuously ...
Neural networks are responsible for most recent advances in artificial intelligence, including many of Alexa’s latest capabilities. But neural networks tend ...
Sound detection is a popular application of today’s smart speakers. Alexa customers who activate Alexa Guard when they leave the house, for instance, ...
The models behind machine learning (ML) services are continuously being updated, and the new models are usually more accurate than the old ones. But an ...
Figure caption: Validation curves in a five-task multitask learning setup, where training minimizes the sum of the task losses. The tasks ...
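The caption above describes a multitask setup in which training minimizes the sum of the task losses. As a rough sketch only (the shared encoder, the five classification heads, and all dimensions below are assumptions for illustration, not details from the post), the objective can be written as a plain sum of per-task losses over a shared representation:

```python
import torch
import torch.nn as nn

# Hypothetical multitask model: a shared encoder feeding five task-specific heads.
# Task count, layer sizes, and loss choices are illustrative assumptions.
class MultiTaskModel(nn.Module):
    def __init__(self, in_dim=128, hidden=64, num_tasks=5, classes_per_task=3):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.heads = nn.ModuleList(
            [nn.Linear(hidden, classes_per_task) for _ in range(num_tasks)]
        )

    def forward(self, x):
        shared = self.encoder(x)
        return [head(shared) for head in self.heads]

model = MultiTaskModel()
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# One illustrative training step on random data: the training objective is
# simply the sum of the per-task losses, as in the caption.
x = torch.randn(32, 128)
labels = [torch.randint(0, 3, (32,)) for _ in range(5)]

logits_per_task = model(x)
loss = sum(criterion(logits, y) for logits, y in zip(logits_per_task, labels))
loss.backward()
optimizer.step()
```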
Knowledge distillation is a popular technique for compressing large machine learning models into manageable sizes, to make them suitable for low-latency ...
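As a sketch of the general distillation recipe (not the specific method the post goes on to describe), a small student model is typically trained to match a large teacher's temperature-softened output distribution while also fitting the ground-truth labels; the temperature, mixing weight, and toy shapes below are assumptions:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Standard KD objective: soft-target KL term plus hard-label cross-entropy.

    T (temperature) and alpha (mixing weight) are illustrative defaults.
    """
    # KL divergence between temperature-softened teacher and student distributions.
    soft_term = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # T^2 rescaling, as in Hinton et al.'s formulation
    # Ordinary cross-entropy against the ground-truth labels.
    hard_term = F.cross_entropy(student_logits, labels)
    return alpha * soft_term + (1.0 - alpha) * hard_term

# Toy usage with random logits: a 10-class problem, batch of 8.
student_logits = torch.randn(8, 10, requires_grad=True)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
```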
Teaching large language models (LLMs) to reason is an active topic of research in natural-language processing, and a popular approach to that problem is the ...
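The excerpt is cut off before naming the approach. Purely to illustrate one widely used technique for eliciting reasoning (chain-of-thought prompting, which is an assumption here and not necessarily the method the post discusses), a prompt can prepend worked examples whose answers spell out intermediate steps:

```python
# Illustrative few-shot chain-of-thought prompt construction; the example
# problems and wording are made up for demonstration purposes.
COT_EXAMPLES = [
    {
        "question": "A shelf holds 3 boxes with 4 books each. How many books are there?",
        "reasoning": "Each box has 4 books and there are 3 boxes, so 3 * 4 = 12.",
        "answer": "12",
    },
]

def build_cot_prompt(question: str) -> str:
    """Prepend worked examples (question, step-by-step reasoning, answer)
    so the model is encouraged to produce its own reasoning steps."""
    parts = []
    for ex in COT_EXAMPLES:
        parts.append(
            f"Q: {ex['question']}\n"
            f"A: Let's think step by step. {ex['reasoning']} "
            f"The answer is {ex['answer']}.\n"
        )
    parts.append(f"Q: {question}\nA: Let's think step by step.")
    return "\n".join(parts)

print(build_cot_prompt("A train travels 60 km per hour for 2 hours. How far does it go?"))
```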
Knowledge distillation (KD) is one of the most effective ways to deploy large-scale language models in environments where low latency is essential. KD ...
Geospatial technologies have rapidly become critically important around the globe. By providing a better understanding of Earth's ...
Large machine learning models based on the transformer architecture have recently demonstrated extraordinary results on a range of vision and language ...
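Since the excerpt is truncated, the following is only a generic sketch of the transformer's core operation, scaled dot-product self-attention, rather than the specific technique the post describes; the tensor shapes are illustrative:

```python
import math
import torch

def scaled_dot_product_attention(q, k, v, mask=None):
    """Core transformer operation: softmax(Q K^T / sqrt(d)) V.

    q, k, v: tensors of shape (batch, seq_len, d); mask is optional.
    """
    d = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d)
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = torch.softmax(scores, dim=-1)
    return weights @ v

# Toy usage: batch of 2 sequences, length 5, model dimension 16.
x = torch.randn(2, 5, 16)
out = scaled_dot_product_attention(x, x, x)  # self-attention: q = k = v
print(out.shape)  # torch.Size([2, 5, 16])
```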