🔗 🧠 #20 Testing the Right Things
Understanding when and how to test your designs

Five resources every week with actionable takeaways to make you a better designer.
Woof — what a whirlwind of a month. I didn’t intend to have such a huge lapse in content, but here we are. Thanks for bearing with me.
Ever been stuck in an endless cycle of “I like that” or “I don’t like that” feedback? Not sure how to back up your decisions or chart a clearer path forward? Well maybe you could use a little help from our friend validation.
Often when we talk about testing, we lean into A/B testing. We test two variants, and the one that “performs” better must be the solution, right? Maybe not.
Relying only on A/B testing can be a double-edged sword. Design usually involves a more complex set of decisions than a single action on a single page.
So you may be asking yourself “Well, what else can I do to test my designs?” Glad you asked — let’s take a look at how you can expand your design testing horizons.
— Jake
TODAY'S EDITION

THINKING ABOUT TESTING
Before we dive deeper into specific areas of design, let's make sure we're aligned on the basics. Design Thinking. We all know it. We all love it. Testing isn't just about validating your final design — it's a crucial stage that feeds back into every other part of the design thinking process. And the reality is you should be testing throughout the entire process, not just at the end.
THE JUICE
Testing Is Learning: The test phase isn't all pass/fail — it's about uncovering opportunities. Every problem discovered is actually a gift — valuable information that guides your next iteration. Success generally means learning something new, not getting everything right on the first try.
Test Early and Often: Don't wait until your design is polished to start testing. Testing rough prototypes early in the process is less time-consuming, less expensive, and gives you crucial feedback when you can still make significant changes.
The Feedback Loop: Testing isn't the end — it's part of an iterative cycle. User feedback often sends you back to empathize, redefine, ideate, or prototype again (or whatever your framework of choice is). Each loop refines your solution and brings you closer to solving the right problem.
Implementation Matters: Beyond usability, look at how your solution fits into people's lives. A design that passes usability tests might still fail if it doesn't integrate well with users' broader context and needs.
Observe More, Ask Less: What users do is more revealing than what they say. Focus on behavior — where they hesitate, what confuses them, where they succeed or fail — rather than direct opinions about your design.
Testing Isn't Just for Users: Include stakeholders in your testing process. Their buy-in is crucial for implementation, and watching real users interact with your design can transform their understanding of user needs.
The Right Mindset: Approach testing with curiosity, not defensiveness. The goal is to improve your solution, not defend your current design. Be ready to revisit earlier stages if testing reveals deeper issues.

WORK SMARTER NOT HARDER
So you've got a great new idea for a product feature — but do people actually want it? Concept testing helps you answer that question before investing time and resources into development. It's about validating the core value proposition with your target audience early in the process, when changes are still relatively easy and inexpensive to make.
THE JUICE
Fail Fast, Save Resources: Testing concepts early helps you identify non-starters before you invest in detailed design and development. Each hour spent on concept testing could save days or weeks of wasted effort later.
Types of Concept Tests: Different approaches serve different needs:
Monadic testing: testing one concept in isolation
Sequential monadic: testing multiple concepts one after another
Comparative testing: directly comparing alternatives
Protomonadic testing: deep dive on one concept followed by brief exposure to alternatives
The When and Where: Timing matters in concept testing. Early-stage testing validates the basic idea, while later tests can refine specific implementations. Remote testing gives you scale and diversity, while in-person testing offers depth and nuance.
Beyond Like/Dislike: Don't just ask if folks like your concept — dig deeper into whether they understand it, whether it solves their actual problems, and whether they'd choose it over existing alternatives.
Quality Questions = Quality Data: Focus on asking about specific aspects of the concept rather than vague overall impressions. "Would you use this?" is less useful than "How would this fit into your current workflow?"
Look for Patterns, Not Validation: The goal isn't to confirm what you already believe — it's to identify patterns that reveal underlying needs and pain points. Sometimes the most valuable insights come from concepts that fail.
From Insights to Action: The best concept testing doesn't just tell you what works — it tells you why. These insights can inform not just the current concept but future design decisions as well.

BEYOND “DO YOU LIKE IT”
Visual design isn't just about aesthetics (though those still matter) — it shapes how users perceive and interact with your product. Yet visual design testing often gets reduced to subjective opinions about what looks or feels "good." There's a better way: focus on how visual design impacts user understanding, expectations, and behavior.
THE JUICE
Attitudinal vs. Behavioral: Choose your testing method based on what you want to learn. Attitudinal methods gather thoughts and feelings about aesthetics and brand alignment. Behavioral methods like A/B testing and eyetracking show how design elements influence actual user behavior and task completion.
The Five-Second Rule: A 5-second test captures gut reactions to visual style before people can read copy or notice details. It's perfect for testing first impressions and overall visual impact, but don't warn participants about the time limit — you want authentic reactions.
First-Click Testing: When users have specific goals, first-click tests reveal if they can quickly find what they need. Most people spend only a few seconds scanning for actionable elements, making this ideal for testing visual hierarchy and navigation placement.
Preference With Purpose: When comparing design variations, make sure differences are significant enough for non-designers to detect. Change one element at a time (like layout while keeping colors consistent) to draw clear conclusions about what impacts users most.
The Right Questions: Move beyond "do you like it?" to structured approaches:
Open-ended explanations for discovering what matters to users
Open word choice for specific but unconstrained feedback
Closed word choice for measuring brand attribute alignment
Numerical ratings for quantifying specific qualities
Beyond First Impressions: Visual design impacts long-term user behavior. Use eyetracking to see what draws attention and A/B testing to measure how design choices affect conversion rates and task success over time.
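When you do reach for A/B testing to measure those conversion effects, a quick significance check keeps you from acting on random noise. Here's a minimal sketch of a two-proportion z-test in Python; the conversion numbers below are invented purely for illustration:

```python
from math import sqrt, erf

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: did variant B convert differently than A?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-tailed p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical results: 120/2400 conversions for A vs 156/2400 for B
z, p = ab_test_z(120, 2400, 156, 2400)
print(f"z = {z:.2f}, p = {p:.3f}")  # p below 0.05 suggests the lift isn't chance
```

A small p-value only tells you the difference is probably real, not why it happened; pair the numbers with the qualitative methods above.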

ALL CURIOUS RESEARCHERS STOP AND ANALYZE CAREFULLY
No single data point tells the whole story in usability testing. When 4 out of 5 users say they like a feature, but only one actually used it (and struggled), you're looking at misleading surface-level signals. Each piece of data — whether behavioral or attitudinal — needs to be examined through six critical lenses to separate real insights from noise.
THE JUICE
The Six Dimensions: Use this fun lil mnemonic to remember the dimensions: All Curious Researchers Stop and Analyze Carefully.
Authenticity: How natural was the comment or behavior? Was the participant trying to please you, or did they genuinely mean what they said? Pay attention to how something is said, not just what is said.
Consistency: Does this data point align with other feedback from the same participant? When someone says a task was easy but their behavior shows multiple restarts and errors, trust the behavior over the words.
Repetition: How often does this comment or behavior occur across sessions or participants? Repeated patterns reveal underlying mental models and real user tendencies.
Spontaneity: Did the feedback come naturally, or only after you prompted for it? Spontaneous reactions are generally more authentic than responses to direct questions.
Appropriateness: Does this feedback relate to your research questions, or is it just interesting but irrelevant noise? Stay focused on what you actually need to learn.
Confounds: What other factors might have influenced this data point? Consider study design, participant characteristics, and environmental factors that could skew results.
The Reality Check: When behavioral and verbal data contradict each other, prioritize what people do over what they say. Actions reveal true user experience more accurately than self-reported opinions.
Critical Context: Every data point exists within the context of your study design, participant recruitment, and research goals. Apply these six dimensions systematically to avoid being misled by surface signals.

TEST CONCEPTS, NOT JUST WORDS
Content testing isn't just about finding typos or checking if your copy sounds good. It's about understanding if people actually get and act on the information you're presenting. Are they getting the right message? Do they even know what you're asking them to do? Sometimes seemingly perfect written content completely misses the mark.
THE JUICE
Banana Testing: Replace all key actions with the word "Banana," then ask folks to suggest what each action on the page could prompt. This tells you whether your icons are helpful, whether interactive elements are perceived as clickable, and whether actions are in the right places. It can also surface which words are worth doubling down on for context. It sounds weird, but it might just work.
Content Heatmapping: Give participants a task, then ask them to highlight things that are clear or confusing using different colors. Map all highlights into a heatmap to identify patterns.
Run Moderated Sessions: People might say a page is "clear and well-organized," but when you ask specific questions, it can become obvious they have no idea what’s going on. These insights rarely surface in unmoderated sessions — you need to observe behavior and ask follow-up questions on the spot.
Test Concepts, Not Words: Don't just tweak individual words — test broader concepts and flows. Avoid speaking your content aloud during testing since that's not how people normally consume it. Ask questions and wait silently for genuine reactions.
Choose Your Method: Match your testing approach to what you want to learn:
Do users understand? → Interviews, highlighting, Cloze tests
Do we match their mental model? → Banana testing, Cloze tests
What word works best? → Card sorting, A/B testing
Why doesn't it work? → Interviews, highlighting, walkthroughs
Do we know user needs? → Competitive testing, process mapping
Beyond Individual Words: Content isn't just copy — it's voice, tone, and the entire communication experience. Test how people perceive your end-to-end experience, not just isolated pieces of text.
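The content-heatmapping tip above is simple to tally in code. A sketch, assuming each participant's session yields the indices of the sentences they highlighted as clear or confusing (the data here is invented):

```python
from collections import Counter

# Hypothetical highlight data: per participant, the sentence indices
# they marked "clear" vs "confusing" on the page under test.
sessions = [
    {"clear": [0, 1], "confusing": [3]},
    {"clear": [0],    "confusing": [3, 4]},
    {"clear": [0, 2], "confusing": [3]},
]

clear = Counter(i for s in sessions for i in s["clear"])
confusing = Counter(i for s in sessions for i in s["confusing"])

# Net score per sentence: positive leans clear, negative leans confusing.
sentences = sorted(set(clear) | set(confusing))
heatmap = {i: clear[i] - confusing[i] for i in sentences}
print(heatmap)
```

Sorting the result by score surfaces the confusion hotspots to rewrite first, which is exactly the pattern-spotting the moderated sessions then help explain.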
THANKS FOR READING — SEE YOU NEXT WEEK
In the meantime, feel free to:
Forward this email or share this link with your friends if you feel they need some links for thinks: https://www.linksforthinks.com/subscribe
Reach out with any suggestions or questions for the newsletter.
Send me some of your favorite links and takeaways.
Cheers, Jake