Customer interactions with Alexa are constantly growing more complex, and on the Alexa science team, we strive to stay ahead of the curve by continuously ...
Earlier this year, we reported a speech recognition system trained on a million hours of data, a feat made possible by semi-supervised learning, in which ...
At next week’s Interspeech, the largest conference on the science and technology of spoken-language processing, Alexa researchers have 16 papers, which span ...
Yesterday at Amazon Web Services’ (AWS’s) annual re:Invent conference, Swami Sivasubramanian, vice president of machine learning at AWS, announced two new ...
As Amazon Scholar Chandan Reddy recently observed, graph neural networks are a hot topic at this year’s Conference on Knowledge Discovery and Data Mining ...
Most modern natural-language-processing applications are built on top of pretrained language models, which encode the probabilities of word sequences for ...
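To make "encoding the probabilities of word sequences" concrete, here is a minimal sketch of scoring a sentence with a pretrained causal language model via Hugging Face's `transformers` library. The small "gpt2" checkpoint and the example sentence are assumptions chosen purely for illustration; they are not the models discussed in the post.

```python
# Minimal sketch: a pretrained language model assigns a log-likelihood
# to any word sequence. The "gpt2" checkpoint is an illustrative choice.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

sentence = "Alexa, play my morning playlist."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    # Passing the input ids as labels makes the model return the
    # average per-token negative log-likelihood as its loss.
    outputs = model(**inputs, labels=inputs["input_ids"])

print(f"avg per-token negative log-likelihood: {outputs.loss.item():.3f}")
```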
State-of-the-art language models have billions of parameters. Training these models within a manageable time requires distributing the workload across a ...
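As a toy illustration of what distributing the workload means, the NumPy sketch below simulates synchronous data-parallel training: each "worker" computes gradients on its own shard of the data, the gradients are averaged (the all-reduce step a real framework performs over the network), and every replica applies the identical update. The model, data, and worker count are invented for the example.

```python
# Toy sketch of synchronous data-parallel SGD on a linear model.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1024, 16))   # full dataset
y = X @ rng.normal(size=16)       # linear targets
w = np.zeros(16)                  # model weights, replicated on every worker

num_workers, lr = 4, 0.1
shards = np.array_split(np.arange(len(X)), num_workers)

for step in range(100):
    # Each worker computes the mean-squared-error gradient on its shard.
    grads = []
    for idx in shards:
        err = X[idx] @ w - y[idx]
        grads.append(2 * X[idx].T @ err / len(idx))
    # All-reduce: average gradients so every replica makes the same update.
    w -= lr * np.mean(grads, axis=0)

print("final training error:", np.mean((X @ w - y) ** 2))
```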
Graphs are a useful way to represent data, since they capture connections between data items, and graph neural networks (GNNs) are an increasingly popular ...
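The following NumPy-only sketch shows one GNN layer in the common GCN style: each node's new representation is a transformed average of its own features and its neighbors' features. The tiny four-node graph and the feature dimensions are invented for illustration and are not tied to any model in the post.

```python
# One GCN-style message-passing layer over a small undirected graph.
import numpy as np

# Adjacency matrix of a 4-node graph.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

A_hat = A + np.eye(4)                     # add self-loops
D_inv_sqrt = np.diag(A_hat.sum(axis=1) ** -0.5)
A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt  # symmetric normalization

rng = np.random.default_rng(0)
H = rng.normal(size=(4, 8))               # input node features
W = rng.normal(size=(8, 4))               # learnable layer weights

# Aggregate neighbor features, transform, apply ReLU.
H_next = np.maximum(A_norm @ H @ W, 0.0)
print(H_next.shape)  # (4, 4): a new 4-dimensional embedding per node
```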
In an Amazon Science blog post earlier this summer, we presented MiCS, a method that significantly improves the training efficiency of machine learning ...
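The core idea described in the MiCS post is to shard model states within small subgroups of workers rather than across the entire cluster, so the most frequent collective operations involve only a handful of participants. The sketch below is a toy illustration of that partitioning arithmetic, not the MiCS implementation; the cluster size, group size, and variable names are all assumptions.

```python
# Toy sketch of subgroup sharding (not the real MiCS implementation):
# partitioning model states within small groups keeps the frequent
# gather/reduce collectives at group_size scale instead of world_size.
import numpy as np

world_size = 16                       # total workers in the cluster (assumed)
group_size = 4                        # workers per partition group (assumed)
params = np.arange(32, dtype=float)   # stand-in for the model's parameters

def shard(vector, num_shards):
    """Split a parameter vector into one shard per worker."""
    return np.array_split(vector, num_shards)

# Global sharding: every worker holds 1/world_size of the states, so
# collectives that reassemble parameters must span all 16 workers.
global_shards = shard(params, world_size)

# Subgroup sharding: each group of 4 workers holds a full replica,
# sharded only within the group; reassembly spans just 4 workers.
group_shards = shard(params, group_size)

print("global shard size:", len(global_shards[0]), "| collective scale:", world_size)
print("group  shard size:", len(group_shards[0]), "| collective scale:", group_size)
```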