Ghee Leng Ooi’s Post

Co-founder of Embedded LLM | Follow me to learn how to make GenAI work 120% better for you!

🚀 Exciting times in AI! Ever heard of Chain of Thought (CoT) prompting? It's like giving AI a step-by-step playbook for solving complex problems, much like planning your dream vacation from start to finish. 🌍✈️

For simpler tasks, like telling whether a review is positive or negative, regular prompting does the trick. But for anything that needs a sequence of decisions (think of dominoes set up to fall just right), that's where CoT shines.

Why does this matter for us in business? A well-designed CoT prompt can walk an AI through intricate, multi-step tasks in your organization, one dependent decision at a time. Which tasks come straight to mind that you think CoT could help with? Let's discuss! 🌟

#AI #Innovation #BusinessTransformation #ChainOfThought
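To make this concrete, here's a minimal sketch in Python of the difference between a direct prompt and a CoT prompt. The vacation steps are just an illustration of the idea; "Let's think step by step" is the classic trigger phrase from the CoT literature. Feed either string to whatever LLM you use.

```python
# Direct prompt: asks for the final answer in one shot.
direct_prompt = (
    "Plan a 5-day trip for two people on a $3,000 budget. "
    "Give me the final itinerary."
)

# CoT prompt: spells out the serial steps so the model produces
# each intermediate result before committing to the final answer.
cot_prompt = (
    "Plan a 5-day trip for two people on a $3,000 budget.\n"
    "Let's think step by step:\n"
    "1. Pick a destination that fits the preferences and budget.\n"
    "2. Choose dates based on weather and schedule.\n"
    "3. Book a stay within the remaining budget.\n"
    "4. Suggest activities near the destination.\n"
    "Show your reasoning for each step, then the final itinerary."
)

print(cot_prompt)
```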

Embedded LLM

Why CoT Works: Iterative Computation for Serial Problems

Ever wondered when you need fancy Chain of Thought (CoT) prompting and when regular ChatGPT will do? Here's the key: serial problems NEED CoT. Without CoT, transformers can only solve problems that have fast, parallel solutions.

Let's get practical:

- Travel Planning: Imagine an AI travel agent. CoT lets the AI first figure out your destination, considering your preferences and budget. Then it factors in dates (weather and your schedule), books your stay, and suggests activities based on the destination's attractions and your interests. Each step builds on the last, ensuring a perfect trip. This is a classic serial problem: because each step depends on the outcome of the previous one, you can't hand these steps to five people to do in parallel.

- Sentiment Analysis: This one ISN'T serial; it's a parallel task! A regular transformer can analyze a review for positive/negative/neutral vibes all at once. Imagine asking several people to independently assess the sentiment of the text at the same time: they don't need each other's results or any particular order.

The Tech Deep Dive (Simplified!)

CoT lets transformers tackle far more complex problems. Without CoT, they're limited to AC0 problems (solvable by constant-depth circuits). Add CoT, and they can solve problems that require log-depth circuits, including NC1-complete problems. Want proof? Check out the paper "Chain of Thought Empowers Transformers to Solve Inherently Serial Problems" (https://lnkd.in/giCaSGA5) by Zhiyuan Li, Hong Liu, Denny Zhou, and Tengyu Ma.

The Takeaway: Next time you've got a problem for your transformer, ask: is it serial (step-by-step), or can it be done in parallel? That tells you whether you need the power of CoT.

#ChainOfThought #Transformers #CoT #LLM #AI
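To make the serial-vs-parallel split concrete, here's a toy Python sketch (my illustration, not code from the paper). Composing a chain of permutations is the kind of inherently serial problem the paper discusses: each step needs the previous result, exactly like one CoT step feeding into the next. Sentiment scoring, by contrast, fans out with no ordering constraints.

```python
# Serial: composing a chain of permutations. Step t cannot start
# until step t-1 has produced its result.
perms = [[1, 2, 0], [2, 0, 1], [0, 2, 1]]  # each maps position i -> p[i]

def compose(p, q):
    # Apply p first, then q: (q o p)(i) = q[p[i]].
    return [q[p[i]] for i in range(len(p))]

result = [0, 1, 2]             # identity permutation
for p in perms:                # must run in order, like CoT steps
    result = compose(result, p)
print("composed permutation:", result)

# Parallel: per-review sentiment. Each review is scored independently,
# so the calls could be fanned out across workers in any order.
reviews = ["loved it", "terrible service", "pretty good overall"]

def toy_sentiment(text):
    # Stand-in for a real classifier.
    return "positive" if ("loved" in text or "good" in text) else "negative"

print("sentiments:", [toy_sentiment(r) for r in reviews])
```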
