Pratyush Rai’s Post


Founder at Merlin AI | Ex-BCG | IIT Kanpur | AI Maximalist

A single o3 query costs approx. $1,000?! Sounds insane, until you see where this is headed. People assume that powerful AI will eventually get cheaper, but they don't realize how soon that's likely to happen. Think about GPT-3: in just three years, the cost to run an equivalent model dropped by 1,000x. Now factor in exponentially better chips, and there's little reason to believe the price curve won't keep plunging. Yes, we all know correlation isn't causation, but the direction is crystal clear. AI isn't just going to be powerful and cheap. It's going to be powerful, cheap, and arriving sooner than you might expect.
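For scale, the claimed 1,000x drop over three years works out to roughly a 10x price decline every year, since 10^3 = 1,000:

```python
# A 1,000x cost drop over 3 years implies an annual decline factor r
# with r**3 == 1000, i.e. roughly 10x cheaper each year.
annual_factor = 1000 ** (1 / 3)
print(round(annual_factor, 2))  # prints 10.0
```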


The environmental cost is going to be huge (well, it already is). We need to rethink how we train AI models, potentially by focusing on high-quality data rather than all the garbage we can get our hands on. The societal cost also shouldn't be neglected, and it may already be serious.

Leonard van Hemert

I make AI practical for SMBs, develop customized AI solutions and accelerate business processes with automation

5h

Spot on! With 1,000x cost drops and better chips, AI's affordability is accelerating.

Anuj Agrawal

CTO at BloomCode Technologies | Innovating in AI, Cloud and Software Solutions

1d

In my view, rather than generating increased revenue, AI seems to be causing significant financial strain and losses across the broader market. This assessment is not limited to individual entities but reflects the overall economic impact of AI development and deployment.

A clear example is the impact on content creators, such as bloggers, who rely heavily on ad revenue platforms like Google AdSense. AI-generated content and tools like Google's Search Generative Experience (SGE) are reducing traffic to original content, directly affecting revenue streams for writers. While Google integrates ads into AI-generated answers, the benefits primarily flow to Google, with little or no compensation for the content creators whose work informs those answers.

This imbalance highlights a growing concern: AI is not only creating financial strain but also shifting value away from content creators to major platforms. A more equitable model that includes revenue sharing, proper attribution, and support for original creators is essential to address these challenges.

Ahmed Fadhl

Financial Analyst | Inclusive Market Development | Global Macro Investor | Crypto EA

1d

At Violet Cove we don't exaggerate the reality: AI is a tool to be utilized by competent 'AI Operators'. Did you know that only 5-7% of businesses in the UAE have managed to find value from AI, despite 73% of businesses implementing it? The issue here is that the tool doesn't do anything on its own. Like a shovel during a gold rush, there needs to be someone using the tool effectively. At Violet Cove we don't sell AI products as a service; we look for the best shovel to fit your gold mine.

Chidhambararajan R (a.k.a Chidha)

Software Engineer - AI at JP Morgan | The Tech Buddha Podcast Host

2d

This is mainly because LLM inference (plain PyTorch inference) had extreme GPU bottlenecks, which were addressed by the KV cache, fast attention kernels, and FP16/FP4 inference. Then we started distilling LLMs (the original GPT-4 was reportedly around a trillion parameters, but I'd bet the GPT-4o series onward is under 200B params), and we started using techniques like Hydra so the LLM doesn't have to predict every token itself (speculative decoding). At the same time, hardware prices also dropped (effective price per FLOP, you could say). I'm not sure how much more reduction can happen on the software side, but on the hardware side there's more headroom, especially with companies like Groq blowing it away.
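The speculative-decoding idea mentioned above can be sketched in a few lines. This is a toy illustration, not Hydra itself: both "models" below are hypothetical deterministic functions standing in for a cheap draft model and an expensive target model, and verification is exact-match greedy decoding rather than the probabilistic rejection sampling real systems use.

```python
def target_model(ctx):
    # Stand-in for the expensive LLM: next token = sum of context mod 10.
    return sum(ctx) % 10

def draft_model(ctx):
    # Stand-in for the cheap draft model: agrees with the target except
    # when the context length is a multiple of 4.
    base = sum(ctx) % 10
    return base if len(ctx) % 4 else (base + 1) % 10

def speculative_decode(prompt, num_tokens, k=4):
    """Greedy decoding that drafts k tokens per expensive verification pass.

    The output is guaranteed identical to decoding with target_model alone;
    the win is that one (batched, in practice) target pass checks up to k
    drafted tokens at once.
    """
    out = list(prompt)
    target_passes = 0
    while len(out) - len(prompt) < num_tokens:
        # 1) Draft k tokens autoregressively with the cheap model.
        draft, ctx = [], list(out)
        for _ in range(k):
            t = draft_model(ctx)
            draft.append(t)
            ctx.append(t)
        # 2) One target pass verifies the whole draft, accepting the
        #    longest prefix that matches the target's own choices.
        target_passes += 1
        ctx, accepted = list(out), 0
        for t in draft:
            if target_model(ctx) != t:
                break
            out.append(t)
            ctx.append(t)
            accepted += 1
        # 3) On the first mismatch, fall back to the target's token,
        #    which also guarantees progress every round.
        if accepted < k:
            out.append(target_model(ctx))
    return out[len(prompt):len(prompt) + num_tokens], target_passes
```

With these toy models, `speculative_decode([1, 2, 3], 12)` produces exactly the same 12 tokens as calling the target model 12 times, but needs only 4 verification passes, which is where the speedup comes from when the draft model is usually right.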

Another snake oil salesman with new packaging. Anything based on token prediction, whether GenAI or LLMs, will always be a stochastic parrot or a search engine for fancy summarization of well-known facts. They can't reason, and more importantly they can't say they don't know the answer when they're not sure. It's good for automating mundane work, plus billions of dollars burned in a GPU-pushing bubble experiment by the corporates. Not to mention the impact on climate and the carbon footprint of all that mundane computational training!

Aniket Bhatt

Meta || Animation XR and Production Design || VFX, Vis Dev and Virtual Production Artist & Technical Designer || CG Generalist || Screenwriter || Director || Passionate about Films, Comics and Video Games ||

6h

This thought process only works if you don't understand the concept of a limit in an equation, which is high-school-level math. The way it has grown in the past is not an indication of the way it will grow in the future.

Hargurjeet Singh Ganger

📊 Senior Data Scientist | 🤖 Specialized in AI Applications powered by LLMs 📚 | ✍️ Content Creator| 💡 Passionate about leveraging AI to drive innovation and solve complex challenges

7h

What excites me most isn't just the affordability. It's the accessibility that comes with it.


While costs went down significantly, the degree of the drop can't be determined by comparing retail prices (as was done in the original tweet). I suspect that only a few insiders know the true (marginal) costs.

Amna D.

AI Product Manager Building AI & Data teams and Products for Fortune 500 companies | Business Automation with low code | Tech Sales

17h

All these new developments must be tied to business use cases and how they impact the life of the common person. The pace of advancement is crazy and overwhelming to keep up with. We don't want to build a monster we can't feed. I am not saying there aren't enough problems; the ingredients are all on the table. But there is no effective mechanism to audit who needs what to achieve the best results.
