Niclas got me thinking about knowledge sharing within the analytics context.

The obvious angle is to organize some form of show-and-tell (e.g. a daily standup) where you can show how you did something, how to use a technology, or explain conclusions. Maybe the collective memory of the team will remember when this info becomes relevant - likely not.

But I wonder: how does one convey the thought process that went into investigation, pattern recognition, and applying business context to the data question at hand?

After all, analytics is closer to detective work than anything else. Detectives string together many bits and pieces, both newly dug-up info and notes from old case files. Pretty sure in analytics we do something similar:

- Talk to a stakeholder, find out there's another person to talk to
- Stalk how current business processes are done in real life
- Remember to exclude that data anomaly from 1.5 years ago where revenue got counted twice
- Hunt down that one query you used a few months ago that could be repurposed

Before the summer we spoke to a ton of analysts, and a good majority cited that most of their time went into the context and investigation process. Mangling data came a close second, but the actual analysis part was, quote, "the easiest."

So, if the majority of time is spent in the "WTF" phase, and every analyst goes through a personal, daily "WTF" phase, wouldn't it make sense to focus on the skills needed to get out of WTF faster? Fewer mistakes, more thorough work, faster turnaround time.

Some of the brightest analysts I've met have a knack, natural or trained, for this. But it seems that currently the only real way to acquire this type of knowledge is with hands-on experience and time.

--

#dataanalytics #analytics
Richard Makara’s Post
More Relevant Posts
-
Theory vs. reality can be brutal. In this post, Tim Wilson outlines the *theory* of where reporting and analytics fits into an organization's processes—a "smoothly symbiotic virtuous cycle"—and contrasts that with *reality*: a "frenetically spiraling vicious cycle" of data requests and expanding dashboards and bloated slide decks that increase the blood pressure of all parties involved rather than increasing business impact. https://lnkd.in/dUezPt_k #reporting #analysis #analytics #dashboards #tyrannyofmore
-
Many organizations have beautiful process documentation that outlines how a business stakeholder makes requests of the data teams, complete with SLAs and RACIs. But how often does that virtuous cycle come to life in reality? Who feels like it's more of a vicious cycle that doesn't leave anyone feeling fulfilled? This is exactly what Tim Wilson dives into in this most recent post from The Focus!
The Tyranny of More (Reporting and Analysis)
thefocus.factsandfeelings.io
-
The best product teams don't chase the Feature Squirrel 🐿 They determine what's critical and focus with extreme discipline.

If analytics is treated as a product, analysts need to avoid chasing ad hoc data requests without understanding stakeholder needs and context. Here are questions you can ask:

➡️ What problem are you trying to solve, or what question are you trying to answer?
➡️ How will this data specifically help you solve that problem, or answer that question?
➡️ What would you do if you cannot get the requested data?
➡️ What are you currently doing to address the problem or answer the question?

Asking these and other clarifying questions will help you understand the business context and enable you to prioritize requests based on impact. Happy analyzing!

#dataanalytics #datascience #dataengineering
-
The Major Mistake Many Data Analysts Make (And How to Fix It!)

One of the most common blunders that data analysts - and even aspiring ones - often fall into is focusing too much on the tools and techniques, and not enough on understanding the problem they're trying to solve.

The Mistake: Diving straight into the data, running fancy algorithms, building complex models, or creating impressive dashboards without clearly defining the business problem or question. This results in insights that don't align with business goals, and stakeholders left scratching their heads.

The Fix: Start with WHY - always. Before jumping into any analysis, ask yourself:
1️⃣ What is the business question we're trying to answer?
2️⃣ How will this data help solve the problem or inform decisions?
3️⃣ What actionable insights are needed from the analysis?

The best data analysts are not those who know the most tools, but those who can effectively bridge the gap between business needs and data insights. Understanding the problem first ensures that every chart, model, and report adds real value.

Pro Tip: Next time you sit down to analyze data, take a step back. Spend time talking to stakeholders, defining the business question, and then let that guide your analysis. It's not just about the data - it's about the impact.

Tagging Venkata Naga Sai Kumar Bysani Hari Prasad Renganathan Priyanka SG Pradeep M Ayan Khan Shashank Singh 🇮🇳 Munna Das Raghavan P Thodupunuri Bharath AMAN KUMAR BALGEET SINGH Pratik Sonawane Kratika Jain Andrew C. Madson Avery Smith Alex Freberg

Follow Vishnu Teja Dumpala for more.

#DataAnalytics #BusinessProblem #InsightfulAnalysis #DataDrivenDecisions #CommonMistakes #BusinessIntelligence
-
There is nobody better than Tim Wilson at explaining—with equal amounts of love & frustration—how organizations misuse & misunderstand the power of analytics & measurement. Fortunately, there are also very few people better at designing the solution.
-
Ever wondered what makes data speak? 📊 It's not just numbers; it's stories waiting to be told.

2023 taught me something invaluable: Data isn't just about insights; it's the backbone of every successful strategy, the unseen hero in decision-making. But here's the twist: data alone isn't enough. It's the blend of data with real-world context that creates magic. 🌟

Here's my approach (and you might find it helpful):

Listen Before You Look: Before diving into the data, understand the problem. What story are you trying to uncover? This perspective shift makes all the difference.

Simplicity is Key: Ever heard of the KISS principle? "Keep It Simple, Stupid." That's my mantra. Simplify complex data into actionable insights. If a graph needs a manual to understand, it's too complex.

Narrate, Don't Report: Data tells a story. It's not about reporting numbers; it's about narrating the story behind those numbers. What do they represent? How do they impact real life?

Visualize for Impact: A well-crafted chart can speak louder than a thousand numbers. Invest time in visualization. It's not just about making data 'pretty' but making it resonate.

Stay Curious: The best data analysts are perpetually curious. Ask "why" five times. With each "why," you'll uncover a deeper layer of insight.

Data analysis is an art and a science. It's about uncovering the stories hidden within numbers and using them to drive decisions that matter. 🚀

#data #analytics #dataanalyst #dataanalysis #storytellingwithdata #datadrivendecisions
-
It's a deeply entrenched phenomenon at many organizations: the optimization team sees everything as a potential test, the analytics team sees everything as a potential analysis, and so on. Our natural instinct to "do stuff" makes it hard to step back and do the up-front thought work that *should* come first, as Valerie Kroll explains in this article!
We've all heard the saying, "When all you have is a hammer, everything looks like a nail," and Valerie Kroll would propose that it applies to the way too many organizations approach hypothesis validation.

🔨 To the CRO team, everything looks like something an A/B test can solve
🔧 To the Market Research team, everything looks like something consumer surveys can answer
🪚 To the Analytics Team, everything looks like something historical data analysis can get to the bottom of
🪛 To the Data Science Team, everything looks like something finding the right R package can untangle

So what should we do about it? Read this post and broaden your toolset! https://lnkd.in/g5bZJTfH
The Data-Driven Toolbox is More Than Just a Hammer
thefocus.factsandfeelings.io
-
#F_Distribution: My Journey with Statistical Insights

Have you ever dived into the world of statistical distributions? Today, let's unravel the power of the F distribution and its impact on data-driven decision-making. Join me on this exploration of statistical insights! 💡🔍

🔹 Embracing the F Distribution: Throughout my career as a data analyst, the F distribution has been a cornerstone of my analytical toolkit. It's not just about numbers; it's about uncovering meaningful patterns and understanding data variability. 📈 #FDistribution #DataAnalysis #StatisticalInsights

🔹 Key Components I've Learned:
1️⃣ Degrees of Freedom: Understanding the nuances of degrees of freedom has been pivotal in my statistical journey, allowing me to conduct robust hypothesis tests and draw accurate conclusions from data. 🌐 #DegreesOfFreedom #StatisticalJourney
2️⃣ F Statistic Insights: Calculating and interpreting the F statistic has enabled me to assess variance, identify significant differences, and optimize strategies based on data-driven insights. 📊 #FStatistic #DataInsights

🔹 Real-Life Applications: The F distribution has empowered me to:
📊 Conduct comprehensive Analysis of Variance (ANOVA), dissecting group differences and evaluating experimental outcomes with confidence. 📈 #ANOVA #ExperimentalAnalysis
🔍 Perform regression analyses, exploring predictive relationships and leveraging statistical models to drive actionable recommendations. 📉 #RegressionAnalysis #PredictiveAnalytics
🔧 Implement quality control measures, ensuring process consistency and product excellence through statistical quality assurance techniques. 🔍 #QualityControl #DataQuality

🔹 Empowering Data-Driven Decisions: By harnessing the F distribution, I've been able to:
🔎 Uncover hidden insights and trends within complex datasets, guiding strategic business decisions and research initiatives.
🌟 Drive innovation and problem-solving through evidence-based analysis and continuous improvement.
How has the F distribution influenced your data analysis journey? Share your experiences and thoughts in the comments below! 💬 #FDistribution #StatisticalAnalysis #DataScience #DataInsights #DecisionMaking #StatisticalJourney
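The ANOVA use case above can be made concrete with a small numerical sketch. This is a hypothetical example with made-up data (in practice you would likely reach for a library routine such as scipy.stats.f_oneway); it computes a one-way ANOVA F statistic by hand, showing where the two degrees-of-freedom values enter:

```python
# One-way ANOVA F statistic, computed from scratch (illustrative only).

def f_statistic(groups):
    """Return the one-way ANOVA F statistic for a list of samples."""
    k = len(groups)                      # number of groups
    n = sum(len(g) for g in groups)      # total number of observations
    grand_mean = sum(sum(g) for g in groups) / n

    # Between-group sum of squares (k - 1 degrees of freedom)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    # Within-group sum of squares (n - k degrees of freedom)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g)
                    for g in groups)

    ms_between = ss_between / (k - 1)    # mean square between groups
    ms_within = ss_within / (n - k)      # mean square within groups
    return ms_between / ms_within

# Made-up measurements for three groups; group b clearly differs.
a = [23, 25, 21, 24]
b = [30, 31, 29, 32]
c = [22, 24, 23, 25]

print(round(f_statistic([a, b, c]), 2))  # → 32.52
```

A large F (mean square between groups dwarfing the mean square within groups) is what signals a significant group difference; identical groups give F = 0.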
-
Last week we talked about identifying key metrics that matter to you, your team, and your customers. But how do you get the data to create reliable metrics? For the non-data specialists among us, it can seem like we need to call IT or Analytics immediately. But take a few minutes to organize your thoughts, and the conversation will be more effective -- or maybe you'll realize you already have what you need!

First, define the metric you have in mind, describe it as vividly as possible, and list some examples. How frequently does it make sense to measure? Is it a point-in-time metric (a value "as of this date") or a rolling number (year-over-year or quarter-over-quarter change)? Why do you want to understand it? Make sure the "so what" is clear to you, so you can share it with others.

With the desired insights clear, get as close as you can to describing the data sources for this metric. Is there a certain person who shares it with you, a report you've seen it in, or a transactional system you know the information comes from? We'll need to find a predictable, reliable source for your key metric to get it on a regular basis.

We just looked upstream in the data flow. Now look downstream: what other teams, reports, etc. use the information you're describing? Perhaps they're already getting the data you need, or even calculating the metric (or something similar) that you're seeking. Best case: you can use what they've already developed. If not, you will have more information, and you may even have allies in your request for insights.

Now you're ready to ask the Analytics team, IT, or another business group looking at similar concepts for help in securing your metric. Describe the metric you've identified, why it matters to you, and where you think it comes from, and ask if they can help you gain access to the insight -- or help you take the next step toward it.
We hope you'll be pleasantly surprised at how much more productive the conversation is with a few critical details prepared. If you're knee deep in data detective work and could use a helping hand, let us know.
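The point-in-time vs. rolling distinction above can be sketched in a few lines. Everything here is hypothetical (the figures, months, and function names are invented for illustration), but it shows how the same underlying series answers two different metric questions:

```python
# Point-in-time vs. rolling (year-over-year) views of one metric.
# All numbers are made up for illustration.

revenue = {            # month -> revenue for that month
    "2023-06": 100_000,
    "2024-06": 125_000,
}

def point_in_time(series, month):
    """The value 'as of' a given month."""
    return series[month]

def yoy_change(series, month):
    """Year-over-year percentage change for a given month."""
    prev = f"{int(month[:4]) - 1}{month[4:]}"   # same month, prior year
    return (series[month] - series[prev]) / series[prev] * 100

print(point_in_time(revenue, "2024-06"))  # → 125000
print(yoy_change(revenue, "2024-06"))     # → 25.0
```

Knowing which of these two shapes you actually want is exactly the kind of detail that makes the conversation with IT or Analytics faster.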
-
The Importance of Cleaning Your Data 🧹

It happens every time. You're diving into a new dataset, excited to uncover insights. You start your analysis, but something doesn't look right. The numbers are off; the trends don't make sense.

Deep inhale. And... you realize the data is messy.

If you can relate to this, don't worry. It happens to everyone. And because of that, it raises a crucial point: how often do you clean your data before analysis? Do you take the time to ensure accuracy, or do you dive in headfirst and hope for the best?

I ask this because a meticulous approach to data cleaning is VALUED like gold in the data analytics field. Why? Because most people don't do it. They jump straight into analysis, and when the results don't make sense...

...well...

...they take a coffee break, or blame the dataset, or blame their tools. (They especially blame their tools if the data they are working on is messy.)

Don't be like them. Clean your data thoroughly. Verify its accuracy. Don't quit. You're the solution.

#DataCleaning #DataAnalysis #TechLife #ProductivityTips #DataDrivenDecisions
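A meticulous cleaning pass can start small. Here is a minimal, hypothetical sketch (the row format, field names, and rules are all invented for illustration) of the kind of checks worth running before any analysis:

```python
# A few basic cleaning checks: drop duplicates, missing values,
# and implausible values before analysis. Data is made up.

rows = [
    {"id": 1, "revenue": "1200"},
    {"id": 2, "revenue": ""},       # missing value
    {"id": 1, "revenue": "1200"},   # duplicate id
    {"id": 3, "revenue": "-50"},    # implausible negative revenue
]

def clean(rows):
    """Return rows that pass all three checks, with revenue as a float."""
    seen, cleaned = set(), []
    for row in rows:
        if row["id"] in seen:       # skip duplicate ids
            continue
        if row["revenue"] == "":    # skip missing values
            continue
        value = float(row["revenue"])
        if value < 0:               # skip implausible values
            continue
        seen.add(row["id"])
        cleaned.append({"id": row["id"], "revenue": value})
    return cleaned

print(clean(rows))  # → [{'id': 1, 'revenue': 1200.0}]
```

Even three rules like these catch a surprising share of the "the numbers are off" moments before they reach a chart.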
Ops & Data Nerd
4mo
Your speed to get out of the "WTF" phase is directly related to how much business context you have: both context for the particular business you work in, and the domain you do analytics for (sales, product, etc.). I similarly don't think it's easy to shortcut this. You need to learn this context on the job. This is why "entry level" data jobs don't exist, and why breaking into a data role is so difficult.