Yesterday, we introduced AI-automated thematic analysis. 👀 But what exactly is thematic analysis? Learn how user researchers apply it in this primer: https://lnkd.in/dpRuQRtd
Wondering
Software Development
London, England 1,410 followers
Unlock continuous user insights at scale with AI-powered user research.
About us
The AI-led customer research platform that helps you understand your customers.
- Website: https://www.wondering.com/
- Industry: Software Development
- Company size: 11-50 employees
- Headquarters: London, England
- Type: Privately Held
- Founded: 2020
Locations
- Primary: 86 Paul Street, London, England, GB
Updates
-
Say hello to AI-automated thematic analysis 🗂️ Wondering's AI analysis now automatically codes and conducts thematic analyses of responses from your AI-moderated user research studies. Transform raw participant responses into actionable customer insights within minutes of completing your study.
Today, we’re introducing the next step in our journey toward AI tools that help customer-centric teams automatically collect user insights at scale: AI-automated thematic analysis.

Over the past year, we’ve worked with many of you to understand how you use our AI analysis to speed up decision-making. We’ve also listened to feedback about the parts of the analysis process you couldn’t previously automate with Wondering’s AI, such as thematic analysis. As of today, Wondering’s AI automatically codes response data from your AI-moderated user interviews, prototype tests, live website tests, and surveys thematically.

Here’s how it works:

🗂️ Automatic coding of response data: Our AI identifies and highlights parts of your response data (in 50+ languages!) with thematic codes. This breaks the data into small, individual pieces of evidence that can then be further analyzed to uncover meaningful themes and insights.

🗺️ Codes organized into clusters: Once we’ve coded your response data, we group the evidence into clusters based on patterns identified by our AI. This allows you to quickly spot common trends, such as mentions of specific topics, feature requests, and more.

💡 Generating high-level findings: Finally, the AI analyzes these clusters alongside your study brief and generates high-level findings. Each finding includes a concise summary, a description, and supporting quotes that serve as evidence.

AI holds enormous potential to help research teams uncover customer insights and make faster, data-driven recommendations. We’re really excited for you to try it out!
Introducing AI-powered thematic analysis | Dec 17, 2024
wondering.com
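For readers curious what the coding → clustering → findings flow described above looks like in practice, here is a minimal, hypothetical sketch in Python. It is not Wondering's implementation: the keyword-based codebook, the Evidence dataclass, and the group-by-code clustering are illustrative stand-ins for the AI-driven steps in the post.

```python
# Hypothetical sketch of a code -> cluster -> findings pipeline.
# Names and logic are illustrative only, not Wondering's actual implementation.
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class Evidence:
    participant_id: str
    quote: str
    code: str  # thematic code assigned to this piece of evidence


def code_responses(responses: dict[str, str], codebook: dict[str, list[str]]) -> list[Evidence]:
    """Step 1: tag each response with thematic codes (keyword matching stands in for AI coding)."""
    evidence = []
    for participant_id, text in responses.items():
        for code, keywords in codebook.items():
            if any(kw in text.lower() for kw in keywords):
                evidence.append(Evidence(participant_id, text, code))
    return evidence


def cluster_evidence(evidence: list[Evidence]) -> dict[str, list[Evidence]]:
    """Step 2: group coded evidence into clusters (here, simply by shared code)."""
    clusters = defaultdict(list)
    for item in evidence:
        clusters[item.code].append(item)
    return clusters


def generate_findings(clusters: dict[str, list[Evidence]]) -> list[dict]:
    """Step 3: produce a high-level finding per cluster, with supporting quotes as evidence."""
    findings = []
    for code, items in sorted(clusters.items(), key=lambda kv: -len(kv[1])):
        findings.append({
            "summary": f"{len(items)} participant(s) mentioned '{code}'",
            "supporting_quotes": [item.quote for item in items[:3]],
        })
    return findings


if __name__ == "__main__":
    responses = {
        "p1": "The checkout flow was confusing and I got lost.",
        "p2": "I wish there was a dark mode option.",
        "p3": "Checkout took too many steps for me.",
    }
    codebook = {
        "checkout friction": ["checkout", "steps"],
        "feature request": ["wish", "option"],
    }
    for finding in generate_findings(cluster_evidence(code_responses(responses, codebook))):
        print(finding)
```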
-
Ever feel overwhelmed when choosing the right tools for prototype testing? We've got you covered with a guide to five popular tools to help you test your prototypes with real users. Here are the five prototype testing tools to consider as you step into 2025: https://lnkd.in/d4J3_FxM
5 tools for prototype testing | Dec 03, 2024
wondering.com
-
As November draws to a close, our team has been hard at work preparing for the release of the first version of our fully AI-automated thematic analysis for qualitative customer research. But don’t miss these eight other product updates that went live in your account this month 👀
As many of you already know, we’re heads down building a new version of AI-run thematic analysis, which will make it easier for you to quickly analyze the responses in your Wondering studies. We’re getting close to a first release, and you’ll get access to it over the next few weeks. Building up to this release, we’ve also shipped a heap of product updates to make it easier for you to get the insights you’re after.

Here are some other updates you’ll now be able to see in your Wondering account:

🖥️ Live Website Testing: Earlier this month, we shipped Live Website Testing! Live Website Tests allow you to show your participants any website, such as your landing pages or a competitor's website. By combining Live Website Tests with other blocks in your study, you can then ask your participants questions to better understand how they experience those websites.

🤖 Added support for Live Website Testing in the AI Study Builder: You can now generate Live Website Testing studies using the Wondering AI Study Builder.

👀 See all the responses for each block: We added block-level response tables that show responses to each question within a block, including answers to follow-up questions.

🔌 See Figma events in the participant transcript: To make it easier to understand how participants interacted with your Figma prototypes, you can now see Figma events in the transcripts for Prototype Test blocks.

💾 Export prototype test data in your CSV exports: You can now include prototype test data in your CSV exports, making it easier to dive deep into your results.

💖 A smoother participant experience: Sometimes it’s the smaller updates that are the most exciting. Based on feedback from participants on our panel, we’ve improved the UI that participants see when taking part in studies.

❤️ Usability improvements for researchers: We’ve also made numerous small improvements to the experience of navigating the Wondering researcher app.

🐛 Platform-wide stability improvements and bug fixes: We’ve made numerous under-the-hood improvements to enhance the stability and reliability of the platform, ensuring a smoother experience all around.

Anything else you'd like to see in Wondering? Let me know!
-
Live Website Testing is here! 💻 👀 Test any website, such as your landing pages or a competitor's website, with real users to learn how to improve your site's conversion, comprehension, and usability.
You can now test any website with real users on Wondering! Since we first released Wondering's AI user test moderation and analysis, we've been on a mission to bring AI-driven user research to every team we work with. Today, we're thrilled to announce the launch of our new AI-powered Live Website Testing feature in public beta. Live Website Tests allow you to show your participants any website, such as your landing pages or a competitor's website. By combining Live Website Tests with other blocks in your study, you can then ask your participants questions to better understand how they experience those websites. User testing just got faster. Check it out: https://lnkd.in/eRTJQ-SG
Introducing Live Website Testing | Nov 18, 2024
wondering.com
-
Wondering reposted this
If you're advocating for AI-powered research tools within your organization, how can you best communicate the strategic impact of investing in them? Here are three strategic outcomes that AI-powered research tools can unlock for your research practice: https://lnkd.in/ezjD6YkF
The strategic advantage of AI-powered user research | Nov 12, 2024
wondering.com
-
Our new response views are out! You can now see all the responses from your participants in easy-to-navigate data tables, and quickly drill deeper into the participant responses that stand out to you.
Today, we’re launching easier ways to view participant responses in Wondering: introducing new participant response views and data tables. 📊 As more of you launch AI-moderated studies and dive into the responses you collect, we recognized that navigating and comparing responses could be smoother. So we listened, took notes, and made changes to make your data analysis faster and more intuitive. Here’s what’s new: https://lnkd.in/eVCKRnHh
Meet the new participant response views | Oct 29, 2024
wondering.com
-
Trying to convince your stakeholders to invest in AI-powered customer research? Showing them hard data is key. If you need to get your team on board, here are eight stats that will help make your case for testing out AI-powered research methods: https://lnkd.in/ek89XmRZ
8 stats about AI-powered user research to get your stakeholders on board | Oct 29, 2024
wondering.com
-
Wondering reposted this
Curious about AI-led user testing? We caught up with the KatKin team to see how they’re using Wondering to scale their user research. Here’s what we learned: https://lnkd.in/e8MdHiU7
How KatKin uses Wondering to build better user experiences | Oct 21, 2024
wondering.com
-
Wondering reposted this
Last week, we launched AI-powered prototype testing in public beta. We've loved hearing how you're using it to test your designs and get answers to your burning usability testing questions faster than ever. But, as always, we're not stopping there. This week we've shipped 11 more product updates to make AI-moderated research easier: https://lnkd.in/dhPiVZMf
11 more product updates | Oct 16, 2024
wondering.com