Testing different eye trackers in the wild

Posted by
Ellie Willis
Published
24th February 2026

At Cineon, our goal is to build training systems that understand human cognition and adapt to it in real time. We do this through eye tracking – using gaze behaviour to infer cognitive state during training.

But generating high-quality data alone isn’t enough.

For cognitive state monitoring to be truly useful in the real world, it must also be usable, feasible, and compatible with live training environments – particularly in safety-critical sectors like aviation.

That’s why real-world testing matters.

Why being hardware agnostic is important

In controlled environments, such as fully virtual reality simulations, it’s relatively straightforward to capture precise gaze data and map it to digital objects. These settings allow us to test our cognitive models under ideal conditions. But a lot of training doesn’t happen in ideal conditions.

It happens in dynamic, hands-on and complex environments – such as a Boeing 737 cockpit – where physical controls, haptic feedback, and natural pilot behaviour are essential to training. In these contexts, VR alone isn’t always feasible or desirable. We recognise that different training tasks require varying levels of simulation to be effective. Accessibility needs, human factors, and task complexity all influence the appropriate level of immersion, and therefore the choice of hardware we use.

Over the years, we have tested a wide spectrum of extended reality and supporting technologies. Mixed reality headsets, smart glasses, and environmental eye trackers let us capture meaningful data from real-world tasks and simulations without disrupting training flow or becoming overly invasive. We evaluate these eye-tracking technologies in situ, under real operational conditions. Rather than designing our systems around a single device (such as a VR headset), we examine how each platform performs in terms of accuracy, robustness, usability, and practical constraints, and adapt our algorithms accordingly.

This approach allows Cineon’s cognitive monitoring tools to remain flexible and deployable across most platforms and environments. We are not tied to one headset, one camera, or one ecosystem. Instead, we design systems that work wherever or however training happens.

With thanks to JetSim

This work would not be possible without the generous support of Matt Briggs at JetSim.

JetSim is a flight simulator experience centre based at the Future Skills Centre at Exeter Airport, offering fully immersive Boeing 737 and Airbus A320 simulation environments that replicate real-world flying conditions.

Over the past couple of years, Matt has opened up JetSim’s simulators to Cineon, giving us access to realistic aviation training environments. His flying expertise has also been invaluable in helping us interpret results, validate assumptions, and evaluate our systems in conditions that genuinely reflect real-world training.

Access to these simulators has allowed our Labs team, led by Tilly, to evaluate multiple eye-tracking approaches directly inside the cockpit.

Three technologies tested in the cockpit

With JetSim’s support, we’ve recently tested a range of technologies in live flight simulation environments. Here’s what Tilly has found:

Mixed Reality – Varjo XR-4 FE

Varjo’s passthrough quality is exceptional, providing pilots with an almost exact view of the external environment. After minimal familiarisation, training can continue largely as normal.

The eye-tracking data is extremely high-resolution. By combining hand tracking, 3D cockpit models, and QR markers, we can precisely identify instrument interaction and contextualise gaze behaviour.
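To illustrate the idea of contextualising gaze against a 3D cockpit model, here is a minimal sketch. This is not Cineon’s actual pipeline: the instrument names, coordinates, and box layout are all invented for illustration. It casts a headset-relative gaze ray against instruments modelled as axis-aligned bounding boxes and reports the nearest one hit:

```python
# Minimal sketch: map a 3D gaze ray onto cockpit instruments modelled as
# axis-aligned bounding boxes. All names and geometry are illustrative.

def ray_hits_box(origin, direction, box_min, box_max):
    """Slab test: return distance along the ray to the box, or None on a miss."""
    t_near, t_far = 0.0, float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-9:                 # ray parallel to this pair of faces
            if o < lo or o > hi:
                return None
        else:
            t1, t2 = (lo - o) / d, (hi - o) / d
            t_near = max(t_near, min(t1, t2))
            t_far = min(t_far, max(t1, t2))
            if t_near > t_far:
                return None
    return t_near

def gaze_target(origin, direction, instruments):
    """Return the name of the nearest instrument the gaze ray intersects."""
    hits = [(t, name) for name, (lo, hi) in instruments.items()
            if (t := ray_hits_box(origin, direction, lo, hi)) is not None]
    return min(hits)[1] if hits else None

# Illustrative cockpit layout (metres, headset-relative coordinates).
instruments = {
    "PFD":      ((-0.30, -0.15, 0.55), (-0.10, 0.05, 0.60)),
    "MCP":      ((-0.20,  0.10, 0.50), ( 0.20, 0.20, 0.55)),
    "throttle": ((-0.05, -0.40, 0.30), ( 0.05, -0.20, 0.40)),
}

print(gaze_target((0.0, 0.0, 0.0), (-0.35, -0.10, 1.0), instruments))  # PFD
```

In practice the cockpit model would be registered to the headset’s coordinate frame (for example via QR markers), and hand-tracking data could gate which intersections count as interactions rather than glances.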

The trade-off: it requires a powerful PC and is less portable than other solutions.

Smart Glasses – Pupil Labs Neon

Neon glasses are highly portable and practical. They connect to a smartphone that can sit in a pocket, and they move naturally with head motion while maintaining reliable tracking.

They deliver exceptional data quality and include scene video for playback and post-session analysis. While hand tracking isn’t available and instrument labelling requires more intensive processing (such as computer vision), their portability makes them incredibly valuable – especially in environments where flexibility is critical.
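For glasses-style trackers, that extra processing step often amounts to labelling each 2D gaze sample against instrument regions detected in the scene-camera video. The sketch below assumes such a detection step (not shown) has already produced per-frame rectangles in pixel coordinates; the instrument names and numbers are illustrative, not from any real session:

```python
# Minimal sketch of post-hoc instrument labelling for glasses-style trackers:
# a computer-vision step (not shown) yields per-frame instrument rectangles
# in scene-camera pixel coordinates; each 2D gaze sample is then labelled
# by the rectangle that contains it. All names and numbers are illustrative.

def label_gaze(gaze_xy, regions):
    """Return the instrument whose rectangle contains the gaze point."""
    x, y = gaze_xy
    for name, (x0, y0, x1, y1) in regions.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return "off-instrument"

# Per-frame regions, e.g. recovered from fiducial markers or a detector.
frame_regions = {
    "airspeed": (100, 200, 260, 360),
    "altimeter": (300, 200, 460, 360),
}

samples = [(130, 250), (350, 300), (600, 50)]
labels = [label_gaze(s, frame_regions) for s in samples]
print(labels)  # ['airspeed', 'altimeter', 'off-instrument']
```

Because the scene camera moves with the head, the regions must be re-detected (or tracked) every frame – which is exactly the extra processing cost mentioned above.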

Environmental Trackers – SmartEye Microbird

SmartEye is leading the way in completely non-intrusive environmental eye tracking.

The Microbird camera can be discreetly mounted inside a cockpit to capture gaze and facial data with minimal impact on the trainee. Custom cockpit models allow us to map gaze to specific instruments. For full coverage of the cockpit, multiple cameras are required – but the result is one of the least intrusive monitoring solutions available.

Why real-world testing matters

Across all of this work, the common thread is adaptability.

By testing different technologies in real aviation environments, we gain a clear understanding of how cognitive monitoring performs under realistic constraints – limited space, natural movement, physical controls, and operational pressure.

That insight allows Cineon to design systems that:

  • Work across multiple platforms and devices
  • Adapt to varying data quality and hardware limitations
  • Integrate into real training environments without disruption
  • Reflect operational reality rather than lab assumptions

Access to technologies like these – and to environments like JetSim’s simulators – enables Cineon to build cognitive monitoring tools that are genuinely usable in the real world.

Because training systems only become transformative when they work where it matters most: inside real cockpits, with real trainees, under real conditions.

Looking to level up training, performance, or human-aware intelligence? Let’s explore how Cineon can help.