Five years ago this November, at its annual re:Invent conference, Amazon Web Services (AWS) announced the release of a new service called Amazon SageMaker, which enabled customers to quickly and easily build, train, and deploy machine learning models. It also announced four services that applied machine learning to specific tasks: Amazon Transcribe, Amazon Translate, Amazon Comprehend, and Amazon Rekognition Video.
Since then, SageMaker has been one of the fastest-growing services in AWS’s history. But at the time, it wasn’t obvious that there would be a strong customer appetite for a fully managed machine learning service.
“ML was already big, and it was clear that it was going to get bigger,” says Bratin Saha, AWS vice president for machine learning (ML) and artificial intelligence (AI) services. “But building and training models was complex and required considerable expertise. We didn’t know how much demand there would be. But we had had success with ML internally and wanted to bring that opportunity to our customers.”
And of course, one of SageMaker’s aims was to make ML easier. The first version of the service included implementations of widely used ML algorithms, automatic hyperparameter tuning, native integration with popular ML frameworks such as TensorFlow, and easy model deployment. “It eliminated the heavy lifting involved in managing ML infrastructure, performing health checks, applying security patches, and conducting other routine maintenance,” Saha says.
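What that looks like in practice is a short train-tune-deploy flow. The sketch below assumes the SageMaker Python SDK (v2) and the built-in XGBoost algorithm; the S3 paths, IAM role, and hyperparameter values are placeholders, not a definitive recipe.

```python
# A minimal sketch of SageMaker's build/train/deploy flow (SageMaker Python SDK v2).
# Bucket paths, the IAM role, and hyperparameter values are placeholders.
import sagemaker
from sagemaker.estimator import Estimator
from sagemaker.tuner import HyperparameterTuner, ContinuousParameter

session = sagemaker.Session()
role = "arn:aws:iam::123456789012:role/SageMakerExecutionRole"  # placeholder role

# Use a built-in algorithm (XGBoost) instead of managing containers by hand.
image_uri = sagemaker.image_uris.retrieve("xgboost", session.boto_region_name, version="1.5-1")

estimator = Estimator(
    image_uri=image_uri,
    role=role,
    instance_count=1,
    instance_type="ml.m5.xlarge",
    output_path="s3://my-bucket/output",  # placeholder bucket
    hyperparameters={"objective": "reg:squarederror", "num_round": 100},
)

# Automatic hyperparameter tuning: SageMaker searches the ranges on the customer's behalf.
tuner = HyperparameterTuner(
    estimator,
    objective_metric_name="validation:rmse",
    objective_type="Minimize",
    hyperparameter_ranges={"eta": ContinuousParameter(0.01, 0.3)},
    max_jobs=10,
    max_parallel_jobs=2,
)
tuner.fit({"train": "s3://my-bucket/train", "validation": "s3://my-bucket/val"})

# One call stands up a managed inference endpoint for the best model found.
predictor = tuner.deploy(initial_instance_count=1, instance_type="ml.m5.xlarge")
```

The infrastructure behind each of these calls — provisioning clusters, health checks, patching — is the “heavy lifting” the managed service absorbs.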
As it turned out, Saha says, he was surprised by how quickly SageMaker was adopted. At the time, several other ML services were already on the market. “We started behind everyone, but we grew faster,” Saha says.
In part, that was because of AWS’s well-established expertise in system design. AWS engineers devised extremely efficient implementations of core ML frameworks like PyTorch and TensorFlow, and AWS researchers continue to develop increasingly efficient methods for distributing the training of large models across multiple GPUs.
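From the customer’s side, that distributed training shows up as a configuration option rather than custom infrastructure. A minimal sketch, assuming the SageMaker Python SDK’s PyTorch estimator and its data-parallel distribution setting; the entry-point script, IAM role, and S3 path are placeholders.

```python
# A sketch of multi-node, multi-GPU training via the SageMaker Python SDK.
# The training script, role, and S3 paths are placeholders.
from sagemaker.pytorch import PyTorch

estimator = PyTorch(
    entry_point="train.py",           # placeholder training script
    role="arn:aws:iam::123456789012:role/SageMakerExecutionRole",
    framework_version="1.12",
    py_version="py38",
    instance_count=2,                 # two nodes...
    instance_type="ml.p4d.24xlarge",  # ...with 8 GPUs each
    # SageMaker's data-parallel library shards each batch across all GPUs.
    distribution={"smdistributed": {"dataparallel": {"enabled": True}}},
)
estimator.fit({"train": "s3://my-bucket/train"})
```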
But in part, Saha emphasizes, it was also because of Amazon’s expertise in machine learning. “Amazon has been doing ML for decades,” Saha says. “Amazon was a pioneer in personalization and recommendations, and we’ve been using demand forecasting to optimize the supply chain for a long time.”
Democratizing ML
And finally, Saha says, SageMaker’s rapid adoption was driven by AWS’s continuous development of new tools that make it easier to do ML in the cloud. For instance, AWS released special-purpose tools for customers working with specific types of data, such as images, text, geospatial data, or financial data.
Furthermore, Saha says, “We invented ML tools that hadn’t existed before — like collaboration tools, debugging tools, and the world’s first IDE [integrated development environment] for machine learning. The goal was to bring tools familiar in the software development world to ML.”
That invention continues unabated. Two weeks ago, at the latest re:Invent, AWS announced eight new SageMaker capabilities, including a data preparation tool that lets customers visually inspect their data and correct quality issues; automated model validation, which enables customers to test new models using real-time inference requests; and support for geospatial data, which makes it easier to develop machine learning models for climate science, urban planning, disaster response, retail planning, and precision agriculture, among other applications.
Of course, it’s not just SageMaker’s anniversary. AWS’s AI services for speech-to-text conversion, translation, document processing, and video and image analysis are also celebrating five years of availability. And just as the SageMaker environment has expanded, so has AWS’s AI portfolio.
One of the recent additions that most excites Saha is Amazon CodeWhisperer. With CodeWhisperer, customers can use ordinary language to specify operations they want their programs to perform, and, based on the existing code in the IDE, CodeWhisperer will suggest code for implementing them. CodeWhisperer also draws on Amazon’s expertise in automated reasoning to help ensure that the resulting code is secure and behaves as intended.
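As a purely illustrative example (not actual CodeWhisperer output), the interaction looks roughly like this: a developer writes a natural-language comment in the IDE, and the assistant proposes code to implement it. The prompt, function name, and completion below are hypothetical.

```python
# Hypothetical prompt a developer might type as a comment in the IDE:
#   "upload a local file to an S3 bucket with server-side encryption"
#
# ...and a completion of the kind a code-suggestion tool might offer.
import boto3

def upload_file_encrypted(local_path: str, bucket: str, key: str) -> None:
    """Upload a local file to S3, requesting server-side encryption (AES-256)."""
    s3 = boto3.client("s3")
    s3.upload_file(
        local_path,
        bucket,
        key,
        ExtraArgs={"ServerSideEncryption": "AES256"},
    )
```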
“All of these tools — SageMaker and all of the other services in the AI portfolio — are about making AI and ML mainstream,” Saha says. “CodeWhisperer points toward a future in which customers won’t even have to know how to code in order to deploy their own complex services in the cloud.”
An anniversary is an occasion to contemplate the future as well as the past, and while it’s difficult to predict what customers’ most pressing needs will be years from now, Saha is intrigued by the prospect of AI services with more autonomy than those available today.
“An interesting area is automation of various tasks,” Saha says. “We are already seeing applications like predictive maintenance of industrial equipment. But I think we will see more in terms of embedding machine learning into personal productivity tools — for example, machine learning that looks at your e-mails, summarizes them, prioritizes which ones you need to respond to first, updates your calendar, and takes care of repetitive tasks like updating spreadsheets or saving attachments, and such.
“Another coming trend is AI that creates content for you, like movies or poems. For instance, look at the Alexa story creation experience. Imagine that using natural language, you describe a video, and the AI creates it for you. That’s right over the horizon. And it’s taking democratization to another level, which is in keeping with the whole concept of AWS AI services.”