Compression
More-Efficient Machine Learning Models for On-Device Operation

Neural networks are responsible for most recent advances in artificial intelligence, including many of Alexa’s latest capabilities. But neural networks tend ...

How to Make Neural Language Models Practical for Speech Recognition

An automatic-speech-recognition system — such as Alexa’s — converts speech into text, and one of its key components is its language model. Given a sequence ...

Alexa at five: Looking back, looking forward

Today is the fifth anniversary of the launch of the Amazon Echo, so in a talk I gave yesterday at the Web Summit in Lisbon, I looked at how far Alexa has ...

Teaching neural networks to compress images

Virtually all the images flying over the Internet are compressed to save bandwidth, and usually, the codecs — short for coder-decoder — that do the ...

How to make on-device speech recognition practical

Historically, Alexa’s automatic-speech-recognition models, which convert speech to text, have run in the cloud. But in recent years, we’ve been working to ...

On-device speech processing makes Alexa faster, lower-bandwidth

At Amazon, we always look to invent new technology for improving customer experience. One technology we have been working on at Alexa is on-device speech ...

Simplifying BERT-based models to increase efficiency, capacity

In recent years, many of the best-performing models in the field of natural-language processing (NLP) have been built on top of BERT language models. ...

Neural encoding enables more-efficient recovery of lost audio packets

Packet loss is a big problem for real-time voice communication over the Internet. Everyone has been in the situation where the network is becoming ...

Low-precision arithmetic makes robot localization more efficient

Simultaneous localization and mapping (SLAM) is the core technology of autonomous mobile robots. It involves simultaneously building a map of the robot’s ...
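To give a flavor of what "low-precision arithmetic" means in practice, here is a minimal sketch (not the method from the article) that emulates 16-bit fixed-point math for a dot product, the kind of operation that dominates SLAM's matrix computations. The function names and the choice of 8 fractional bits are illustrative assumptions.

```python
import numpy as np

def to_fixed(x, frac_bits=8):
    """Quantize a float array to int16 fixed point with `frac_bits` fractional bits.
    (Illustrative assumption: 8 fractional bits; real systems tune this per variable.)"""
    return np.clip(np.round(x * (1 << frac_bits)), -32768, 32767).astype(np.int16)

def fixed_dot(a, b, frac_bits=8):
    """Dot product computed entirely in integer arithmetic, then rescaled to float."""
    acc = np.int64(0)  # wide accumulator so intermediate products don't overflow
    for ai, bi in zip(to_fixed(a, frac_bits), to_fixed(b, frac_bits)):
        acc += np.int64(ai) * np.int64(bi)
    return acc / float(1 << (2 * frac_bits))

rng = np.random.default_rng(0)
a, b = rng.uniform(-1, 1, 8), rng.uniform(-1, 1, 8)
print(float(np.dot(a, b)), fixed_dot(a, b))  # close but not bit-identical
```

The payoff of this style of arithmetic is that int16 multiplies are cheaper in silicon and memory traffic than float32, at the cost of a small, bounded quantization error.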

Compressing token-embedding matrices for language models

Pretrained language models (PLMs) like BERT, RoBERTa, and DeBERTa, when fine-tuned on task-specific data, have demonstrated exceptional performance across a ...
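One common way to compress a token-embedding matrix is low-rank factorization: replace the vocab-by-dimension matrix with the product of two thin matrices. The sketch below uses truncated SVD on a random stand-in matrix; it is a generic illustration of the idea, not necessarily the technique the article describes, and the sizes are made up.

```python
import numpy as np

# Stand-in embedding matrix: 1,000-token vocabulary, 64-dim embeddings.
vocab, dim, rank = 1000, 64, 16
rng = np.random.default_rng(42)
E = rng.standard_normal((vocab, dim))

# Truncated SVD: keep only the top `rank` singular directions.
U, S, Vt = np.linalg.svd(E, full_matrices=False)
A = U[:, :rank] * S[:rank]   # shape (vocab, rank)
B = Vt[:rank]                # shape (rank, dim)
E_hat = A @ B                # reconstructed (approximate) embeddings

orig_params = vocab * dim            # parameters in the full matrix
comp_params = vocab * rank + rank * dim  # parameters in the two factors
print(comp_params / orig_params)     # fraction of original parameters kept
```

Storing `A` and `B` instead of `E` shrinks the parameter count whenever `rank * (vocab + dim) < vocab * dim`; the quality trade-off depends on how much of the matrix's energy the top singular values capture.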
