re:MARS revisited: Optimizing AI/ML workloads for sustainability


In June 2022, Amazon re:MARS, the company’s in-person event that explores advancements and practical applications within machine learning, automation, robotics, and space (MARS), took place in Las Vegas. The event brought together thought leaders and technical experts building the future of artificial intelligence and machine learning and included keynote talks, innovation spotlights, and a series of breakout session talks.

Now, in our re:MARS revisited series, Amazon Science is taking a look back at some of the keynotes and breakout session talks from the conference. We’ve asked presenters three questions about their talks, and we provide the full video of their presentations.


On June 27, Amogh Gaikwad, solutions developer with Amazon Web Services (AWS), presented the talk “Optimizing AI/ML workloads for sustainability”. His session focused on best practices for efficiently retraining multiple machine learning models with minimal computational resources and on using computationally efficient built-in algorithms.

What was the central theme of your presentation?

Building and training high-accuracy machine learning models can be energy intensive, requiring large computational resources that consume substantial energy. This session explores guidance from the sustainability pillar of the AWS Well-Architected Framework to reduce the carbon footprint of AI/ML workloads.


This guidance covers best practices for efficiently retraining multiple models using minimal computational resources and leveraging computationally efficient built-in algorithms. Additionally, customers can learn about the AWS tools available for monitoring models during training and deployment.
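One illustrative way to retrain with minimal compute, sketched below in plain Python (this is a generic example, not an AWS-specific API): warm-start an existing model's weights and update it with only the new data, rather than retraining from scratch on the full dataset each time.

```python
import random

def sgd_update(w, b, batch, lr=0.01):
    """One stochastic-gradient-descent pass over (x, y) pairs for y ≈ w*x + b."""
    for x, y in batch:
        err = (w * x + b) - y
        w -= lr * err * x
        b -= lr * err
    return w, b

random.seed(0)
# True relationship: y = 2x + 1
data = [(x, 2 * x + 1) for x in [random.uniform(-1, 1) for _ in range(200)]]

# Initial (expensive) training from scratch.
w, b = 0.0, 0.0
for _ in range(50):
    w, b = sgd_update(w, b, data)

# Later, new data arrives: warm-start from the existing weights and run
# only a few passes, reusing the compute already invested in the model.
new_data = [(x, 2 * x + 1) for x in [random.uniform(-1, 1) for _ in range(50)]]
for _ in range(5):
    w, b = sgd_update(w, b, new_data)
```

After the short incremental update, `w` and `b` remain close to the true parameters (2 and 1) at a fraction of the cost of a full retrain.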

In what applications do you expect this work to have the biggest impact?

This guidance will have the biggest impact on machine learning applications that require large, energy-intensive computational resources. In addition, it applies to any application where the goal is to reduce carbon emissions and design machine learning workloads with sustainability in mind.

What are the key points you hope audiences take away from your talk?

  • How to design ML workloads using the Well-Architected machine learning lifecycle and sustainability best practices
  • How to optimize resources for developing, training, and tuning ML models
  • How to reduce the environmental impact of machine learning workloads in production
  • Knowledge of AWS tools for monitoring machine learning workloads
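A common resource-optimization practice for training, sketched here generically (not tied to any particular AWS tool), is early stopping: halt training once the validation loss stops improving, so no compute is spent on epochs that no longer help the model.

```python
def train_with_early_stopping(val_losses, patience=3):
    """Return the number of epochs run before stopping.

    `val_losses` stands in for the per-epoch validation losses a real
    training loop would compute.
    """
    best = float("inf")
    epochs_without_improvement = 0
    epochs_run = 0
    for loss in val_losses:
        epochs_run += 1
        if loss < best:
            best = loss
            epochs_without_improvement = 0
        else:
            epochs_without_improvement += 1
            if epochs_without_improvement >= patience:
                break  # stop early; the remaining epochs are skipped entirely
    return epochs_run

# Loss improves for 5 epochs, then plateaus: training stops after 8 epochs
# instead of exhausting the full 20-epoch budget.
losses = [0.9, 0.7, 0.5, 0.4, 0.35] + [0.36] * 15
print(train_with_early_stopping(losses))  # → 8
```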

Amazon re:MARS 2022: Optimizing AI/ML workloads for sustainability




