How do we create a competitive edge for our market research and usability projects?
Most of us now use AI in desk-based research. There's a lot of data to be mined, and AI helps us leverage that. Market researchers have an edge when using AI because we've learned how to ask and frame questions.
We're inquisitive and love digging for insights. The datasets and tools are continually improving, helping marketers and companies make better-informed, customer-centric decisions quickly. But everybody has access to AI-generated insight; my competitive edge is real behavior.
Asking the right questions
I didn’t need to learn how to use ChatGPT. Asking good questions to truly understand something is what my career has been all about. I’m also good at recognizing poor answers and know when to move on if an answer isn’t available or a question is invalid.
If everyone is using the same AI tools and datasets, the insights quickly become repetitive and predictable, lacking the originality needed to truly innovate. You see this in AI-generated packaging or ad designs, where outputs follow familiar patterns and design cues, resulting in bland, uninspired work. That’s why observing real human behavior is so valuable. It brings nuance, context, and unexpected insights that algorithmic data simply can’t replicate, giving us a genuine edge in market research and usability projects.
There are two key reasons why attention analysis is a powerful tool for delivering real value to stakeholders:
Better insights
Visible bias
Better insights
I’ve used eye trackers in qualitative research for nearly 20 years, and they consistently deliver reliable insights that drive confident decisions. What makes the method so powerful is how naturally it captures behavior: people make quick decisions and often can’t explain how or why. Reviewing footage with participants reveals those moments, like when someone gets distracted by a design element or how a brand loyalist navigates differently from someone unfamiliar with the brand. These subtle, real-time reactions offer depth that synthetic methods often miss.
In simple terms, I have always found more valuable insights by having market research participants wear eye trackers and complete a task or mission naturally than through any other research technique. Retrospective sessions also let me understand how they felt at the time.
Our world is moving faster, and human behavior is evolving just as quickly. Observing real people in action gives us fresh, relevant insights that AI often misses. Unlike static datasets, which lag behind current trends, and AI-generated content that risks feeding back into itself, real-world observation keeps us close to emotionally and mentally driven decisions — where true innovation lives.
Compared to other research techniques, I’ve found retrospective sessions in qualitative attention studies to be more reliable at delivering new ideas for innovation: more ideas in detail, more ideas in context, and more "free prizes", as Seth Godin would say. Listening to a participant give commentary on footage of their own visual behavior has always delivered some of the most valuable insights for me. We record natural behavior without interruption and then follow with a deep dive into "the why".
Visible bias
Understanding and mitigating bias in market research is critical for ensuring reliable insights and sound decision-making. Misleading influences can creep in through poorly worded questions, non-representative samples, or subconscious assumptions made by researchers. We work hard to minimize these distortions in projects — it’s a key part of conducting reliable and meaningful research.
AI learns from historical data, and while it's powerful, those datasets come with gaps and biases. The more we try to refine or segment artificial insights, the more we risk reinforcing underlying distortions — creating a loop that’s hard to break. That’s why I believe in working with AI on our terms. It’s a useful tool, but real human behavior keeps us grounded, ensuring our insights remain relevant, nuanced, and emotionally connected.
To have confidence in artificial insights, you ideally need to triangulate them with real insights.
Why context still beats algorithms
We know that task and context drive behavior. The same people can behave differently when given the same task in different contexts or the same context with different tasks. Whenever reviewing attention data, it’s imperative to understand what the person was trying to do.
We’ve got a long way to go before AI can simulate gaze patterns and emotions across different contexts and tasks. And even when it can, will I be able to get useful insights by asking a synthetic participant why?
Go beyond the synthetic
Inquisitive me will always prefer conducting research with real people. The unknown is fascinating, and behavior is usually easier to observe than predict.
While synthetic data and amazing advancements in AI can greatly help expedite many aspects of research, there simply is no substitute for capturing real, in-the-moment, in-context human behavior. And eye tracking, in its various guises, is one of the most powerful and obvious tools for achieving this.
Dive deeper into real human behavior
The search for ‘why?’
Download the guide to find out how eye tracking reveals fundamental insights into your consumer's experience as well as the truths behind shopper reasoning.
The limits of predictive eye tracking in shopper marketing
This article explores the scientific limitations of predictive eye tracking tools and argues why combining real eye tracking with psychological analysis offers superior insights for measuring in-store displays, packaging, and other shopper marketing interventions.
Empowering UX design with attention data
In this sample report, you’ll see data and insights from a range of UX studies focusing on mobile app design, web design, prototype testing, and in-store navigation.