EnCharge AI’s Post

“According to the International Energy Agency (IEA), in 2022, data centres consumed 1.65 billion gigajoules of electricity — about 2% of global demand. Widespread deployment of AI will only increase electricity use. By 2026, the agency projects that data centres’ energy consumption will have increased by between 35% and 128% — amounts equivalent to adding the annual energy consumption of Sweden at the lower estimate or Germany at the top end.”

In Nature, Katherine Bourzac interviewed our CEO, Naveen Verma, about how our analog in-memory computing chips are tackling the massive energy demands of AI. By leveraging SRAM technology with crossed metal wires acting as capacitors, our chips can execute machine learning algorithms at 150 tera operations per second per watt (TOPS/W), a significant leap in energy efficiency over the industry standard of 24 TOPS/W.
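
For a rough sense of what that efficiency gap means in practice, here is a minimal back-of-the-envelope sketch comparing energy per inference at the two quoted figures. The 500-GOP workload size is a hypothetical example chosen for illustration, not a number from the article.

```python
# Back-of-the-envelope comparison of energy per inference at the two
# efficiency figures quoted above (150 TOPS/W vs. 24 TOPS/W).
# The workload size below is a hypothetical example, not from the article.

TOPS_PER_WATT_ANALOG = 150   # quoted figure for the analog in-memory chip
TOPS_PER_WATT_DIGITAL = 24   # quoted industry-standard figure

def joules_per_inference(ops: float, tops_per_watt: float) -> float:
    """Energy in joules to run `ops` operations at a given efficiency.
    1 TOPS/W corresponds to 1e12 operations per joule."""
    ops_per_joule = tops_per_watt * 1e12
    return ops / ops_per_joule

workload_ops = 500e9  # hypothetical 500-GOP inference workload

e_analog = joules_per_inference(workload_ops, TOPS_PER_WATT_ANALOG)
e_digital = joules_per_inference(workload_ops, TOPS_PER_WATT_DIGITAL)

print(f"Analog in-memory chip: {e_analog * 1e3:.2f} mJ per inference")
print(f"Digital baseline:      {e_digital * 1e3:.2f} mJ per inference")
print(f"Relative saving:       {e_digital / e_analog:.2f}x less energy")
```

At these figures the same workload uses about 6.25 times less energy on the more efficient chip, independent of the workload size chosen.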

Fixing AI's energy crisis

nature.com
