Abstract: This paper presents the use of eye tracking data in Magnetic Angular Rate Gravity (MARG)-sensor-based head orientation estimation. The approach presented here can be deployed in any motion measurement setup that includes MARG and eye tracking sensors (e.g., rehabilitation robotics or medical diagnostics). The challenge in these mostly indoor applications is the presence of magnetic field disturbances at the location of the MARG-sensor. In this work, eye tracking data (visual fixations) are used to enable zero orientation change updates in the MARG-sensor data fusion chain. The approach is based on a MARG-sensor data fusion filter, an online visual fixation detection algorithm, and a dynamic angular rate threshold estimation for low-latency, adaptive head motion noise parameterization. In this work we use an adaptation of Madgwick's gradient descent filter for MARG-sensor data fusion, but the approach could be used with any other data fusion process. The presented approach does not rely on additional stationary or local environmental references and is therefore self-contained. The proposed system is benchmarked against a Qualisys motion capture system, a gold standard in human motion analysis, and reduces the heading error of the MARG-sensor data fusion by up to a factor of 0.5 while magnetic disturbance is present.

Keywords: data fusion; MARG; IMU; eye tracker; self-contained; head motion measurement
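
The sketch below illustrates the core idea summarized in the abstract: when the eye tracker reports a visual fixation and the measured angular rate stays below an adaptively estimated threshold, the head is assumed quasi-static and a zero orientation change update is applied instead of the regular MARG update, suppressing heading drift from a disturbed magnetometer. This is only a minimal, hypothetical sketch; the class and parameter names (`FusionFilter`, `DynamicRateThreshold`, `alpha`, `margin`) are assumptions, and the filter update shown is a plain gyroscope integration stand-in rather than the authors' adapted Madgwick gradient descent filter.

```python
"""Illustrative sketch of a fixation-gated zero orientation change update.
All names and constants are hypothetical; the paper's system uses an
adaptation of Madgwick's gradient descent filter for the MARG update."""
import numpy as np


class DynamicRateThreshold:
    """Adaptive angular-rate threshold: track the gyro-norm noise floor with
    an exponential moving average and add a safety margin (assumed scheme)."""

    def __init__(self, alpha=0.05, margin=3.0, initial=0.02):
        self.alpha, self.margin, self.level = alpha, margin, initial

    def update(self, gyro_norm):
        self.level = (1.0 - self.alpha) * self.level + self.alpha * gyro_norm
        return self.margin * self.level


class FusionFilter:
    """Placeholder MARG fusion filter holding a unit quaternion (w, x, y, z)."""

    def __init__(self):
        self.q = np.array([1.0, 0.0, 0.0, 0.0])

    def update(self, gyr, acc, mag, dt):
        # Stand-in for a full MARG update (e.g., gradient descent correction):
        # here only the angular rate is integrated for brevity.
        w, x, y, z = self.q
        qdot = 0.5 * np.array([
            -x * gyr[0] - y * gyr[1] - z * gyr[2],
             w * gyr[0] + y * gyr[2] - z * gyr[1],
             w * gyr[1] - x * gyr[2] + z * gyr[0],
             w * gyr[2] + x * gyr[1] - y * gyr[0],
        ])
        self.q = self.q + qdot * dt
        self.q /= np.linalg.norm(self.q)
        return self.q


def fuse_step(filt, thresh, gyr, acc, mag, fixation, dt):
    """One fusion step: during a visual fixation with near-zero head rotation,
    hold the previous orientation (zero orientation change update); otherwise
    run the regular MARG fusion update."""
    gyro_norm = np.linalg.norm(gyr)
    if fixation and gyro_norm < thresh.update(gyro_norm):
        return filt.q  # orientation held: no heading drift from the magnetometer
    return filt.update(gyr, acc, mag, dt)
```

As a usage note, `fuse_step` would be called once per sensor sample with the synchronized gyroscope, accelerometer, magnetometer, and fixation-detector outputs; the gating leaves the chosen data fusion filter untouched, which is why the abstract states the approach could be combined with any other fusion process.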