💻 Some thoughts on fast-paced vs. slow-paced DS projects:

* Fast-paced model development is not always good. If our goal is to build effective models as well as make the 'right' product decisions, we may need to consider many scenarios. For example, if we analyze the data and find that one product change can improve monetization and click-through rate, should we tell the PM to implement it right away? Not necessarily. What about the long-term influence? Users may have complaints but simply can't find a better product right now; once one appears, they will leave, and we will never know why if we only look at 1-2 weeks of experiment data.

* Analyze data with thought, not just intuition. It's important that we use consistent standards to analyze data, but is a CTR decrease really as bad as we think it is? In a relatively slow-paced DS project, we may have more time to analyze whether the decreased clicks were effective ones, and that can lead to a totally different conclusion. It's easy to look at data and say we found a trend, but data is cold, and it needs our thinking to make it 'warm' enough to support the right decision.
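To make the second point concrete, here is a minimal sketch of what "checking whether decreased clicks were effective" could look like. The file name, column schema (`arm`, `clicked`, `dwell_seconds`), and the 30-second engagement threshold are all illustrative assumptions, not from the post:

```python
# A minimal sketch, assuming a click-level experiment log with columns
# `arm`, `clicked` (0/1), and `dwell_seconds`. The file name, schema, and
# the 30-second "effective click" threshold are hypothetical.
import pandas as pd

clicks = pd.read_csv("experiment_clicks.csv")  # hypothetical log

# Raw CTR treats every click the same; "effective CTR" keeps only clicks
# followed by meaningful engagement (here: at least 30s of dwell time).
clicks["effective_click"] = (clicks["clicked"] == 1) & (clicks["dwell_seconds"] >= 30)

per_arm = clicks.groupby("arm")[["clicked", "effective_click"]].mean()
per_arm.columns = ["ctr", "effective_ctr"]

# An arm whose CTR dropped but whose effective CTR held up (or improved)
# can point to a very different product decision.
print(per_arm)
```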
-
"Build small and then iterate based on analytics" is not a workable model for creating value. Research can't be replaced by analytics, because research isn't just "talking to users" - it's also hypothesis definition. Often, the more valuable part of research is planning: taking stock of what we already know, and determining how we will find out what we don't. "Talking to users" is only one way of learning what we don't know. Shipping software is another. But if you didn't prepare a proper research hypothesis, then you're going to be asking the wrong questions - so every way of gathering data will give you poor answers. And without knowing what answers you need, you have no real sense of what your "build small" actually needs to include. Most teams respond to this by bloating their builds with all sorts of "requirements" and "must-haves" and "P0" features that have no relation to what you are trying to learn, or to customer value. Predictably, this decreases velocity, and therefore the rate of learning, and therefore time-to-value. Taking the time to think through your research plan is going to make you much faster, overall.
-
REMINDER: Don't let data overwhelm you! 📊 Stop thinking data management is too complex. Instead, focus on implementing effective strategies to manage and retain valuable business data for long-term success. 🚀 Share your thoughts on data management or explore our affordable web design and content writing services today! 🌟 #DataManagement #BusinessSuccess #oconcomarketing #success #marketing #AI
-
Enhance your survey strategy with the power of Copilot! 📊 🤓

Feedback is a critical component in any type of organization. Be it a multimillion-dollar company or a small business, understanding your customer base and employees is essential. This process not only helps us learn but also builds a sense of trust in the organization.

Traditional survey methods can be time-consuming, but with the help of AI you can easily create and conduct surveys and engage with your customer base or employees. Microsoft Copilot uses the power of AI to assist users in creating informative surveys, automating tasks and providing intelligent suggestions.

· Tailored question suggestions - Using Copilot in Microsoft Forms, you can type in your ideas and quickly generate relevant questions for the survey. Copilot will also suggest how to ask those questions - for example, whether a question should be multiple choice or rated on a scale. This makes the survey more responsive and less confusing.

· Effective response analysis - After collecting responses, you can download the data into an Excel sheet and use Copilot there to analyze the results. You can transform the data into charts and tables, and Copilot can summarize the data and trends for you.

Many organizations are using this feature to improve their productivity and grow their customer base. As we navigate a data-driven world, it is important to understand the tools we use to gather data, and there are a number of techniques you can learn that will help transform your business. So stop wasting your valuable time on manual tasks - use these effective tools so you can focus on interpreting the data and making data-driven decisions. 😎 💫
-
🚨 Data visualisation for PBS and NDIS providers!

Processing participant and service data and producing actionable info is not an easy task. Today, I am offering access to my proprietary graphing and data visualisation engine. The intuitive product can be integrated into your existing solutions/apps, customised to your data, and run entirely in-house. With this solution, your employees, clients, and customers will have access to beautiful, actionable visualisations with zero spreadsheet skills required.

Features:
✅ Find the data you need with natural language (click "last week" for last week's data).
✅ Role-based data access: employees and external stakeholders access the data relevant to their workflow.
✅ Save visualisations in one click and use them in reports, presentations, and BSPs.
✅ Full integration with data collection: data is instantly visualised upon submission.
✅ Dashboard overview for managers.

Send me a message to get a free mock-up - just tell me what you need visualised! Licensing is also available for data platforms and developers.
-
Check out this article - it's a goldmine for product teams looking to level up. It's all about harnessing the power of data, but in ways you might not expect. It covers:

✅ Why collecting and actively using data from day one is crucial.
✅ Blending data insights with user feedback for a complete picture.
✅ Real-life examples of data guiding feature development and user education.

This isn't just about data; it's about evolving how we make products that truly resonate with users. Click to read the full article. #userfirst #productdevelopment #measurewhatmatters
-
7 BI tools worth looking into for ~$1K/mo budgets:

Polymer - fast build speed, thoughtful UX, 2-click chart generation, AI-prompted chart optimization, diversity of charts, simple blank canvas

Daydream - ideal for early-stage founders/operators, intuitive Notion-like UI/folder structure, easily add text throughout dashboards, tag/comment @ others easily within dashboards

Omni - great for growing data teams, built-in data modeling, seamless in-tool dbt integration, ideal for heavy SQL, embedded customer-facing analytics, advanced cross-team metric validation, tiered user permissions

Hashboard - high-level view of in-tool dbt models, easy SSO integration, seamless drilldowns, useful version control, developer-friendly YAML editing

Zenlytic - leading AI-powered BI chatbot for self-serve analytics, semantic-layer modeling, easy dbt integration

Lightdash - great for exploratory deep dives, governance over tiered user roles, saving/indexing views, Slack-integrated dashboard alerts, engineer-friendly command-line interface/workflow

Evidence - business intelligence as code: super clean/interactive visualization aesthetics, developer-friendly coding in Markdown, great for embedded customer-facing analytics
-
Conducting market and competitor research doesn't have to take hours... In fact, with automation, it should only take a few minutes.

Whether you're researching your next big startup idea or trying to give your current business a competitive edge, I've put together a system that can save you hours of manual effort. Here's how it works (a rough code sketch follows below):

1. Feed your competitors and their websites into a spreadsheet.
2. Scrape their sites' HTML and parse it into text.
3. Use OpenAI to summarize the text from their sites.
4. Use Perplexity to search the web for key insights on the company, products, pricing, reviews, etc.
5. Use Anthropic to compile all the data, create a detailed competitor analysis, and identify areas of opportunity.
6. Push all the results back into your spreadsheet.

Say goodbye to tedious data collection and hello to efficient, insightful research. Want to try it for yourself? Comment "Research" to get the blueprint👇
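One possible reading of steps 2-6 in Python - a minimal sketch, not the author's blueprint. The competitor list, model names, and prompts are placeholder assumptions, and it writes results to a CSV rather than a live spreadsheet:

```python
# Sketch of the scrape -> summarize -> research -> compile pipeline.
# Assumes the `openai` and `anthropic` Python SDKs; Perplexity exposes an
# OpenAI-compatible endpoint, so the OpenAI client works for it too.
import csv

import requests
from bs4 import BeautifulSoup
from openai import OpenAI
from anthropic import Anthropic

COMPETITORS = {"Acme Analytics": "https://example.com"}  # hypothetical entries

openai_client = OpenAI()  # reads OPENAI_API_KEY from the environment
pplx_client = OpenAI(api_key="YOUR_PPLX_KEY", base_url="https://api.perplexity.ai")
anthropic_client = Anthropic()  # reads ANTHROPIC_API_KEY

rows = []
for name, url in COMPETITORS.items():
    # Step 2: scrape the site's HTML and parse it into plain text.
    html = requests.get(url, timeout=30).text
    text = BeautifulSoup(html, "html.parser").get_text(" ", strip=True)[:20000]

    # Step 3: summarize the site text with OpenAI (placeholder model name).
    summary = openai_client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user",
                   "content": f"Summarize this company website:\n{text}"}],
    ).choices[0].message.content

    # Step 4: web research via Perplexity (placeholder model name).
    research = pplx_client.chat.completions.create(
        model="sonar",
        messages=[{"role": "user",
                   "content": f"Find products, pricing, and reviews for {name}."}],
    ).choices[0].message.content

    # Step 5: have Anthropic compile the competitor analysis.
    analysis = anthropic_client.messages.create(
        model="claude-3-5-sonnet-latest",
        max_tokens=1024,
        messages=[{"role": "user",
                   "content": f"Site summary:\n{summary}\n\nWeb research:\n{research}\n\n"
                              f"Write a competitor analysis for {name} and list opportunities."}],
    ).content[0].text

    rows.append({"competitor": name, "summary": summary, "analysis": analysis})

# Step 6: push the results back into a spreadsheet (a CSV file here).
with open("competitor_analysis.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["competitor", "summary", "analysis"])
    writer.writeheader()
    writer.writerows(rows)
```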
-
RAG systems aren't just about tech. Product design is crucial. https://lnkd.in/grbTBfyV Even the most sophisticated retrieval can fail without an intuitive interface. User needs trump fancy algorithms. Continuous improvement based on real feedback is the key to RAG success.
FAQ on Improving RAG Applications - jxnl.co
-
One thing I am noticing in the product management / data quality space is metric trees (e.g. https://lnkd.in/gxvHmqjD). The basic idea is breaking a top metric (such as revenue) down into lower-level KPIs, making explicit the equations that connect them in the system.

What I am noticing is that these are very similar to causal decision diagrams (CDDs) in decision intelligence - in particular, both are a way to break down outcomes, although metric trees don't necessarily have levers. (Levers in a CDD depend on the boundary of the decision; relative to product managers, those would be the factors that, say, a Chief Product Officer controls.) So they're clearly closely related, except that metric trees aren't generally treated as more than a visual aid, from what I'm seeing. Causal decision diagrams are the foundation of decision models and decision simulations, which can have machine learning models under the hood: any decision element that is fed by other decision elements can be represented by an arbitrary function, which can in turn be a machine learning model.

Something that's known in DI specifically is that data quality (and fidelity) matters most where the outcome is most SENSITIVE to the metric measuring it. You can measure sensitivity in a number of ways - sensitivity analysis is a whole academic field (see Wikipedia - https://lnkd.in/grVuAtcv - and the Python package SALib - https://lnkd.in/gTT_2hqe), with approaches ranging from simple visual aids to sophisticated algorithms; you can read more about this in the Decision Intelligence Handbook. A toy SALib sketch follows below.

What this means is that we can prioritize data value (and data quality) according to how sensitive the key outcome is to the metric each piece of data feeds. Some food for thought for the data quality field in how we can couple these together.

#dataquality #metrictree #metrics #kpi #decisionintelligence #causalinference #cdd
Introduction to Metric Trees
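A toy illustration of the sensitivity point above, using SALib's Sobol analysis. The revenue = traffic × conversion rate × average order value decomposition and the bounds are made-up assumptions, not from the post:

```python
# Toy metric tree: revenue = traffic * conversion_rate * avg_order_value.
# Sobol indices tell us which sub-metric the top metric is most sensitive
# to - i.e., where data quality investment pays off most.
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["traffic", "conversion_rate", "avg_order_value"],
    "bounds": [[50_000, 150_000], [0.01, 0.05], [20.0, 80.0]],  # made-up ranges
}

# Sample the sub-metric space and evaluate the top metric for each sample.
X = saltelli.sample(problem, 1024)
revenue = X[:, 0] * X[:, 1] * X[:, 2]

# S1 = first-order effect of each input alone; ST = total effect including
# interactions with the other inputs.
Si = sobol.analyze(problem, revenue)
for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
    print(f"{name}: S1={s1:.2f}, ST={st:.2f}")
```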
-
Actual hack that works: map your business domains, then build your model.

Every time I start an analytics engineering project, I map it to one of 5 domains:
1. Value Creation
2. Marketing
3. Sales
4. Value Delivery
5. Finance

Outcomes:
• 20% improvement in decision accuracy
• 40% reduction in analytics budget
• 30% less firefighting

AI can write SQL. You're the one who can align people, processes, and technology with your business.

𝗣𝗦 If you are interested in learning more about the how, comment below.