Marker Tracking & Eye Tracking

Laboratory experiments in psychology are often criticized for being far removed from reality. To achieve higher ecological validity, mobile research equipment can be used. Eye tracking glasses, for example, allow us to study gaze behavior in natural situations. The main challenge with such devices is mapping gaze coordinates onto the actual stimuli (or a reference image of a stimulus). Doing this manually is possible but very time-consuming, given the amount of data involved. There are, however, options to automate the process, namely object recognition and marker tracking.

Object recognition works well until participants start moving freely and stimuli become more complex. Marker tracking is more reliable: it uses fixed “anchors” on the stimulus surface and maps gaze data onto a reference image based on the positions of these anchors.

Typical surface-tracking markers, such as ArUco markers (Figure 1a), require good-quality video to be identified. This limits their use: participants need to move slowly and stay close to the stimulus.

For this reason, we at SR LABS Srl developed a more robust marker tracking system that works even when users move freely or view the stimulus from farther away.

Our system uses simple colored markers (Figure 1b) that can be identified even in the low-quality video typical of commercially available eye tracking glasses. They can also be tracked from distances of up to 5 meters (depending on marker size).
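To give an intuition for why colored markers tolerate low-quality video, the sketch below shows the simplest form of color-based detection: threshold an RGB frame against a per-channel color range and take the centroid of the matching pixels. This is a hypothetical, simplified illustration, not our actual detector; it assumes a single marker of the given color is visible, and a production system would typically work in HSV space and filter candidate blobs by size and shape.

```python
import numpy as np

def marker_centroid(frame, lo, hi):
    """Return the (x, y) centroid of pixels whose RGB values fall in [lo, hi].

    Simplified sketch: assumes exactly one marker of this color is in view.
    Returns None when no pixel matches the color range.
    """
    # Boolean mask: True where all three channels are inside the range.
    mask = np.all((frame >= lo) & (frame <= hi), axis=-1)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# Synthetic example: a red 10x10 patch on a black 100x100 frame.
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[20:30, 40:50] = (255, 0, 0)
center = marker_centroid(frame, (200, 0, 0), (255, 60, 60))
```

Because the decision is made per pixel over a broad color range, compression artifacts and motion blur that would destroy the fine black-and-white pattern of an ArUco marker have much less effect here.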

Figure 1: Examples of typical surface-tracking ArUco markers (a) and the markers used by our system (b)

The system recognizes the distance of the stimulus from the user, the orientation of the stimulus plane, and the position of the stimulus in each frame of the recorded video. It then uses this information to map gaze data onto a reference image. The mapping procedure is fully automatic and requires no manual intervention, making the system ideal for studies with large sample sizes.
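The mapping step can be illustrated with a planar homography: given the positions of four (or more) markers in a video frame and their known positions in the reference image, a 3x3 matrix maps any gaze point from frame coordinates into reference coordinates. The code below is a generic sketch of this plane-to-plane mapping (direct linear transform), not our system's actual implementation; the point values are made up.

```python
import numpy as np

def homography_from_points(src, dst):
    """Estimate the 3x3 homography H mapping src points to dst points (DLT).

    src, dst: lists of (x, y) pairs, at least four non-collinear correspondences.
    """
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null vector of A, i.e. the last right-singular vector.
    _, _, Vt = np.linalg.svd(np.array(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def map_gaze(H, gaze_xy):
    """Project a gaze point from frame coordinates to reference coordinates."""
    p = H @ np.array([gaze_xy[0], gaze_xy[1], 1.0])
    return p[0] / p[2], p[1] / p[2]

# Hypothetical marker centers detected in a frame, and their positions
# in the reference image (here the reference is simply twice the size).
markers_in_frame = [(0, 0), (100, 0), (100, 100), (0, 100)]
markers_in_reference = [(0, 0), (200, 0), (200, 200), (0, 200)]
H = homography_from_points(markers_in_frame, markers_in_reference)
gaze_on_reference = map_gaze(H, (50, 50))
```

Re-estimating this mapping for every frame is what lets the gaze data follow the stimulus automatically as the wearer moves around it.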

Once the system has been optimized to run in real time, it could also be used for human-computer interaction in settings where remote eye trackers cannot be employed.

If you are interested in the system and would like more information, please feel free to contact me.

Also, if you happen to be at ETRA next month, make sure to stop by our stand during the Demo session; I will present the system and show example applications and output.

Iyad Aldaqre

Data Scientist at SR LABS Srl, Milan