Step outside the lab: iGaming UX research for real players
Just like any scientific work, research only matters if it holds up in front of real players. In this second piece from Pepper Partners’ cyberpsychology R&D division, founder and CEO Lev Polonuer walks us through how its eye-tracking experiments translate outside the lab and what that means for the wider learnings in this series.
In a recent article, we looked at how players around the world display different attention patterns and how our lab findings could help affiliates localise their UX design. But outside a controlled environment, without researchers watching every move, one might ask whether players behave the same way.
As one of the early adopters of cyberpsychology in iGaming, we often find ourselves fielding questions from clients who want proof and scientific rigour. It’s a normal part of introducing new research methods, and we have become used to explaining how our approach works in practice. Today, our lab has grown into a network of scientific research hubs equipped with eye-trackers, heart-rate sensors, EEG systems and other tools. We actively invite collaborations with partners who value deep behavioural analysis and demand evidence-based UX.
What questions do you hear most often?
We take feedback seriously. The main question we get is how our data links to commonly used KPIs in a modern digital business. Clients who are used to traditional BI metrics like conversion or LTV want to know whether emotional response, attention and cognitive load can really predict revenue. We show how to validate these insights through A/B tests and model the correlations ourselves for our clients.
Clients who are used to traditional BI metrics like conversion or LTV want to know whether emotional response, attention and cognitive load can really predict revenue
Lev Polonuer, Pepper Partners CEO
Another recurring concern is more ethical – people want to know where optimisation ends and manipulation begins. The iGaming industry is heavily regulated, and any talk of subconscious influence can raise eyebrows. We are firm in our stance: cyberpsychology should simplify UX, reduce friction, create transparency and support responsible play, not exploit vulnerable players.
That said, some questions need deeper study and even standalone research projects to address.
Do you only run studies commercially?
Not at all. We also conduct many internal studies to validate methods and tools already used in the field. A good example came up two years ago, when a potential client asked whether behaviour in the lab truly reflects how people play slots in a more “natural environment” like home. At first we tried to answer from the existing literature, and then reviewed past case studies, including those that yielded successful commercial results. But the problem was that all the changes we recommended were part of a global platform rebranding, making it impossible to isolate their outcomes.
We didn’t see the client’s question as a hurdle but as a fair chance to prove our competence. The Hawthorne effect has been a known concept in experimental psychology for around a century – the idea that people change their behaviour simply because they’re being observed. The discovery was accidental: researchers were testing workplace lighting and found productivity increased when lights got brighter, increased again when they were dimmed and, surprisingly, rose once more when lighting returned to its original state. It made it clear that observation alone can affect behaviour.
The Hawthorne effect has been a known concept in experimental psychology for around a century – the idea that people change their behaviour simply because they’re being observed
Lev Polonuer, Pepper Partners CEO
How did you conduct the study?
We carried out the research on a Canadian sample, though nationality wasn’t central to the aim. We set up a between-groups experiment in which participants were randomly assigned to a control or an experimental group. The control group of 172 participants played slots at home, unobserved, using a special mini‑app we preinstalled. The experimental group of 154 participants played under lab conditions with eye-tracking, after equipment calibration. Participants who required lengthy calibration or produced unreliable data were excluded from the final sample.
Each of the 326 participants received a $12 balance with a minimum $0.20 bet and played three classic slot games of ten rounds each – two three-reel and one five-reel. We chose different games to detect any game-specific variation.
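As a rough illustration only (not the lab’s actual tooling – all names and the assignment logic here are hypothetical), the session structure described above can be sketched in a few lines:

```python
import random

# Session parameters as described in the study design
BALANCE = 12.00        # starting balance per participant ($)
MIN_BET = 0.20         # minimum bet per round ($)
GAMES = ["three-reel A", "three-reel B", "five-reel"]
ROUNDS_PER_GAME = 10

def assign_group() -> str:
    """Randomly assign a participant to the control (home) or experimental (lab) group."""
    return random.choice(["control", "experimental"])

total_rounds = len(GAMES) * ROUNDS_PER_GAME      # 30 rounds per participant
min_total_stake = total_rounds * MIN_BET         # total spend if every bet is the minimum
print(total_rounds, round(min_total_stake, 2))   # → 30 6.0
```

At the minimum bet, a full 30-round session stakes $6.00 of the $12 balance, leaving headroom for larger bets – which is what makes per-round bet size an informative behavioural signal.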
Does the Hawthorne effect really influence iGaming experiments?
After thorough data cleaning, we did not find any significant difference in outcomes between the two groups. Players tracked with eye-tracking behaved almost the same as those playing at home, showing a similar level of risk-taking, with the difference between groups not exceeding 2.5%. This figure reflects several combined indicators, including decision speed when choosing bet size, the frequency of respins or restarts and the amount of money a user was willing to stake per round.
Average bet size remained stable, as did patterns of player “generosity” across all three sessions. Mean bets were $0.413 in the control group and $0.417 in the experimental group, and large bets were placed at the start of the session in both groups. The differences in game design did not lead to noticeable behavioural change. Overall, this suggests that experimental conditions do not influence player behaviour: whether someone plays at home or under observation in a lab, their strategy, pace and appetite for risk remain essentially the same.
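The group comparison described here is, in essence, a two-sample test on mean bet size. A minimal sketch, using synthetic per-player means loosely centred on the reported $0.413 and $0.417 (not the lab’s actual data) and Welch’s t-statistic for unequal variances:

```python
import math
from statistics import mean, variance

def welch_t(a: list[float], b: list[float]) -> float:
    """Welch's t-statistic for two independent samples with unequal variances."""
    se = math.sqrt(variance(a) / len(a) + variance(b) / len(b))
    return (mean(a) - mean(b)) / se

# Synthetic per-player mean bets for illustration only
home = [0.40, 0.42, 0.41, 0.43, 0.39, 0.42, 0.41, 0.42]  # control (at home)
lab  = [0.41, 0.43, 0.42, 0.40, 0.42, 0.43, 0.41, 0.42]  # experimental (lab)

t = welch_t(home, lab)
# A |t| close to zero is consistent with "no significant difference" between groups.
```

With real samples of 172 and 154 players, the same statistic (plus its p-value) is what an A/B-style validation of the observation effect would rest on.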
Furthermore, we compared the control group to participants who required multiple calibration attempts. As expected, calibration quality strongly correlated with setup time. During the experiment, 31 participants struggled with calibration and required more than two attempts. This not only reduced data accuracy but also affected behaviour: those players showed more cautious decision-making, with a 14.5% difference in aggregated risk behaviour.
This study shows that because incentives in gambling are so different, observation effects and other impacts often fade
Lev Polonuer, Pepper Partners CEO
What does this mean for our experiments?
Gambling holds attention through uncertainty, reward anticipation, emotional highs and lows, and losses. It is both commercially and scientifically interesting to study whether engagement patterns remain similar between platforms.
This study shows that because incentives in gambling are so different, observation effects and other impacts often fade. This means assumptions about player behaviour need to be tested in controlled yet natural settings to separate general psychology from what is unique to iGaming.
It also supported our earlier findings, including how gaze patterns among African users differ from those of Asian and European players. More practical UX takeaways for affiliates will follow in the next articles.
About Pepper Partners’ Cyberpsychology Lab
Founded by Lev Polonuer in 2020, Pepper Partners is a global CPA affiliate network specialising in the casino and sports betting sectors. Its Cyberpsychology Lab partners with leading universities, bringing together cognitive psychologists, psychophysiology experts, behavioural economists and MBA graduates who focus on complex digital interactions. Equipped with eye-trackers, heart rate monitors and electroencephalography (EEG) sensors, the division aims to understand users’ reactions on a neurophysiological level and offers UI recommendations for its partners. The Lab would love to hear from casinos, betting platforms, slot developers and other stakeholders interested in finding out more about future projects and potential collaborations.
Get in touch: support@pepper.partners.