Carlos Escapa’s Post


Ex-Meta, Amazon Data & AI expert with a strong record in building industrial alliances. Committed to bridging the digital divide, I guest lecture to advocate for Open Science and broaden knowledge accessibility.

Meta reveals the carbon footprint for training Llama 2: 539 tonnes CO2e.

I'd like to send my gratitude to my former colleagues at Meta for the transparency shown in the release of Llama 2 this morning. They are definitely raising the bar.

Total GPU-hours were 3.3M on A100-80GB at 400W, totalling 1.3 GWh - the same as GPT-3. Meta disclosed that they offset the emissions, so downstream users need not count them in their scope 3 ☀

Kudos to Meta for also publishing a responsible use guide. I'm looking forward to getting my hands on it!

Model details: https://lnkd.in/ekQy4J4T
Full model card: https://lnkd.in/eAmH2xfe
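A quick sanity check of the figures in the post, as a sketch: multiplying the disclosed GPU-hours by the per-GPU wattage reproduces the ~1.3 GWh total, and dividing the 539 tonnes CO2e by that energy gives the implied grid carbon intensity. The intensity value is just what the two cited numbers imply together, not a figure Meta disclosed.

```python
# Numbers taken from the post; the implied intensity is derived, not disclosed.
gpu_hours = 3.3e6         # total A100-80GB GPU-hours for Llama 2 training
power_per_gpu_w = 400     # watts per GPU, as cited in the post

energy_wh = gpu_hours * power_per_gpu_w     # watt-hours consumed
energy_gwh = energy_wh / 1e9                # convert to gigawatt-hours
print(f"Energy: {energy_gwh:.2f} GWh")      # ~1.32 GWh, matching the cited ~1.3 GWh

co2e_tonnes = 539
energy_kwh = energy_wh / 1e3
implied_g_per_kwh = co2e_tonnes * 1e6 / energy_kwh   # grams CO2e per kWh
print(f"Implied intensity: {implied_g_per_kwh:.0f} g CO2e/kWh")  # ~408
```

At ~408 g CO2e/kWh, the implied intensity sits in the range of a fossil-heavy grid mix, which is consistent with the emissions being large enough to be worth offsetting.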

Carlos Escapa


1y

You can try it at llama2.ai but I recommend that you don't raise the temperature 😉

Jean-Baptiste Quéru

Software Architect at Aescape

1y

That's about the CO2 needed to fly a large passenger plane from southern Europe to Los Angeles and back.

Chris Kozak PhD

Genomics | Bioinformatics | Sustainability | Product Innovation

1y

This is smart! It has been so easy to overlook emissions from computing as an externality. We should appreciate transparent projects, and encourage emission estimates as a standard in scientific computing and SaaS.
