Hustle Badger’s Post

How much impact has your product team had? 🤔

That's a surprisingly tricky question to answer.

Sure, A/B tests are great…
…assuming you've got the traffic…
…and you're measuring a single feature.

But A/B tests aren't great for measuring performance across multiple initiatives or several teams. A/B test results do NOT stack up:

• 5x 2% uplifts in A/B tests do not add up to a 10% increase overall.
• Double counting is difficult to exclude.

(There's a worked sketch of why uplifts don't stack at the end of this post.)

So how do we attribute product impact? Some thoughts below, but first ask:

• Why do you need to know?
• What decisions will this inform?

It's easy to get sucked into a laborious attribution process in search of the Truth. In most cases an approximation will do; chasing a perfect answer rather than a "good enough" one is busywork.

𝟱 𝗟𝗘𝗩𝗘𝗟𝗦 𝗢𝗙 𝗔𝗧𝗧𝗥𝗜𝗕𝗨𝗧𝗜𝗢𝗡

Ideally you build up an analysis linking the leading product metrics you control to lagging, business-critical metrics like revenue. This is done in stages:

𝟭 - 𝗩𝗜𝗕𝗘𝗦
You don't attribute. "It's a team sport, who cares?" If you're very early stage and things are going well, do you even need to attribute?

𝟮 - 𝗡𝗔𝗥𝗥𝗔𝗧𝗜𝗩𝗘
You can describe the mechanism, but can't put numbers to it. "Commenting drives retention, retention drives revenue."

𝟯 - 𝗠𝗘𝗧𝗥𝗜𝗖𝗦
You have a driver tree with specific metrics, but you don't know how much moving one moves another. "# comments a day drives D30 retention. D30 retention drives revenue per active user."

𝟰 - 𝗖𝗢𝗥𝗥𝗘𝗟𝗔𝗧𝗜𝗢𝗡
Most companies aim here. You know the link between your metrics. "A 10% increase in comments a day equals a 3% increase in D30 retention, and a 2% increase in revenue per active user." (A sketch of how to estimate such a link is below.)

𝟱 - 𝗛𝗢𝗟𝗗 𝗢𝗨𝗧 𝗚𝗥𝗢𝗨𝗣
You hold back all features from a subset of users for several months, like a prolonged A/B test. This is very expensive, but very rigorous. (A sketch of a stable holdout assignment is below.)

𝗚𝗘𝗡𝗘𝗥𝗔𝗟 𝗧𝗜𝗣𝗦

• Work with an analyst / the CFO to come up with a simple model that everyone can buy into.
• Political buy-in matters more than an absolutely scientifically correct answer.
• Standardise the baseline so you can compare across teams and initiatives (e.g. last year's actual revenue / traffic).
• Be conscious of when you'll actually release features, and how many months of contribution they'll deliver this year. Shipping something in Q1 often means you only see impact from Q2 (worked numbers below).
• Agree how you'll count future years. Product changes are often small but permanent, affecting all future revenue.

More on different types of quantitative testing here on Hustle Badger: https://lnkd.in/eSaJ9jt8
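
Why the uplifts don't stack, in numbers. A minimal Python sketch with made-up figures: each of five features measures +2% in isolation, but features often act on overlapping behaviour, so later wins find less headroom. The 40% overlap factor is an assumption for illustration, not a measured value.

```python
# Why five +2% A/B wins rarely add up to +10%: each test measures its
# feature in isolation, while features often act on overlapping
# behaviour. All numbers here are hypothetical.

baseline = 0.100  # baseline conversion rate (assumed)
uplift = 0.02     # each feature's measured relative uplift

# Naive "additive" claim: 5 x 2% = 10%
naive = baseline * (1 + 5 * uplift)

# Even with perfectly independent effects, uplifts compound, not add:
independent = baseline * (1 + uplift) ** 5

# With overlap, each later feature only lifts the share of behaviour
# not already captured by earlier ones (here: 40% overlap per feature).
overlap = 0.4
rate, remaining = baseline, 1.0
for _ in range(5):
    rate *= 1 + uplift * remaining
    remaining *= 1 - overlap  # later features find less headroom

print(f"naive additive : {naive:.4f}")        # 0.1100
print(f"independent    : {independent:.4f}")  # ~0.1104
print(f"with overlap   : {rate:.4f}")         # ~0.1047, well short of +10%
```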
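
Putting a number on level 4. One common way to do it (an illustration, not necessarily Hustle Badger's method) is a log-log regression: the slope is an elasticity, so a 1% change in the leading metric maps to roughly b% in the lagging one. The data below is fabricated.

```python
# Level 4 in practice: estimate how much a 10% lift in a leading metric
# (comments/day) moves a lagging one (D30 retention). A log-log fit
# gives an elasticity. Data is fabricated for illustration; this shows
# correlation, not proof of causation.
import numpy as np

comments_per_day = np.array([1.0, 1.5, 2.2, 3.0, 4.1, 5.5, 7.4, 10.0])
d30_retention    = np.array([0.20, 0.21, 0.23, 0.24, 0.26, 0.27, 0.29, 0.31])

# Fit log(retention) = a + b * log(comments); b is the elasticity.
b, a = np.polyfit(np.log(comments_per_day), np.log(d30_retention), 1)

print(f"elasticity b = {b:.2f}")
print(f"a 10% lift in comments/day ~ {10 * b:.1f}% lift in D30 retention")
```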
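
A stable holdout assignment, sketched. Hash-based bucketing keeps the same ~5% of users out of all new features for the whole measurement window, then you compare their lagging metrics to everyone else. Function names and the 5% size are illustrative assumptions.

```python
# Level 5: a stable holdout. Hash each user id so a fixed ~5% of users
# are deterministically excluded from every new feature.
import hashlib

HOLDOUT_PCT = 5  # size of the holdout group, in percent (assumed)

def in_holdout(user_id: str, salt: str = "holdout-2024") -> bool:
    """Deterministically assign ~HOLDOUT_PCT% of users to the holdout.

    Salting by cohort name keeps the group stable for the whole
    measurement period, but lets you draw a fresh group next year.
    """
    digest = hashlib.sha256(f"{salt}:{user_id}".encode()).hexdigest()
    return int(digest, 16) % 100 < HOLDOUT_PCT

# Gate every new feature behind the same check:
def feature_enabled(user_id: str) -> bool:
    return not in_holdout(user_id)
```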
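
The release-timing tip as arithmetic. A toy contribution model against a standardised baseline; all figures are invented.

```python
# General tips in numbers: standardise against last year's baseline and
# pro-rate by live months. Figures are made up for illustration.
baseline_revenue = 12_000_000  # last year's actual revenue (assumed)
annual_uplift = 0.04           # feature's steady-state uplift: +4%/yr

# Ships end of Q1, so it contributes for 9 of 12 months this year...
this_year = baseline_revenue * annual_uplift * (9 / 12)
# ...but the change is permanent, so count the full effect from next year.
next_year = baseline_revenue * annual_uplift

print(f"this year:  ${this_year:,.0f}")   # $360,000
print(f"next year+: ${next_year:,.0f}")   # $480,000
```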

Ian McGavin

Growth Ninja 🥷 | Master of Mobile Marketing 🔥 | User Acquisition & Retention Guru 📲 | Turning Downloads into Die-Hard Fans 🚀


Attributing impact in product teams can indeed be tricky. Your breakdown of attribution levels is a fantastic guide for navigating this complexity. Great insights on balancing rigorous analysis with practical approximations!
