We all want GenAI: agentic, human-like intelligent automation across enterprise functions. But at what cost? Is the hype only about productivity, so that humans need to do less of what we do today? Or should GenAI accelerate the 'save the world' kind of actions that humans have given up on, provided the ROI on energy consumption stacks up? Should our AI aspirations draw selfishly from our next generation's reserves?

Every LLM query consumes energy that could light a bulb for 20 minutes. GenAI will create 40% higher energy demand over the next decade than we would see without it. Meta is running models on 100k+ Nvidia H100 chipsets, but should it be proud of that or concerned? The first industrial revolution could be forgiven its fossil fuels, because the end was not visible then and the climate threat was unknown. It is an entirely different world now: we must do double for the environment before we consume energy for the greed of growth and profit.

Amazon, Meta and Google are all buying and producing nuclear energy to justify LLM model builds and training, the reasoning intelligence meant to bring us closer to the elusive AGI. But nuclear is not green, and that capacity should be fed to the main supply grid rather than to data centre AI racks. Personal arc reactors are not a sustainable model. Should we not mandate that a double quantum of green energy be fed to the grid first, before additional power may be consumed for AI aspirations? AI must be sustainable and responsible!
Ironically, while AI is being deployed to optimize processes, enable predictive maintenance, improve production efficiency, and refine route planning to reduce carbon footprints and emissions, the models themselves require massive energy to operate. I completely agree — it’s time for global governing bodies to step in and enforce mandates that ensure IT and AI development adhere to sustainable practices.
Well said. How big an H100 data centre each company can build has become a new show-off competition. Have we really seen many situations where the customer gets better value? Does the customer really care about AI or any particular technology? What is the quality of the results AI throws out? And finally, at what financial and environmental cost?
Very aptly put Pallab Bhattacharya. I think one of the key challenges will be very high power consumption, along with others such as the quality of the data feeding model generations. As we can see, hyperscalers have started looking to nuclear power to support the demand from LLMs and LAMs, with a direct impact on ESG.
Valid point
Well said Pallab
Great perspective Pallab. Thought-provoking article. I have not seen any measure of the increase in power consumption as we use more and more AI solutions; I think we must have such a measure in place. AI usage is becoming a need in this fast-moving world, and it is difficult to ask people not to use it. Rather, we must have a framework or methodology to decide when to follow the AI track and when to follow the normal track.