This dataset was created to evaluate models for detecting the driver's current object of fixation, i.e., for finding the object the driver is looking at, when using a remote gaze tracking system. Measuring the tracking quality of the remote gaze tracking system alone does not assess the advantages and drawbacks of specific algorithmic fusion approaches. Furthermore, when estimating the driver's point of regard (PoR) and the gaze target, all algorithmic approaches share the problem that there is no ground truth for where the driver is truly looking.
For this purpose, a wearable gaze tracking device was operated in parallel to the vehicle-integrated head-eye-tracking system, serving as the source of reference data on the driver's visual attention.
The dataset contains:
remote gaze direction measurements, stereo image recordings, and object lists of several artificial and real-world scenarios, as recorded by the PRORETA 4 test vehicle,
images and point of regard as measured by the wearable eye tracking device,
labels for some sequences, as outlined in the associated paper,
raw data of the real-world drive (~5 min).
More information can be found in the Description.txt of the dataset.
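As an illustration, the sketch below shows one way the reference data could be used to score an object-of-fixation detector: per-frame object IDs detected by the remote system are compared against reference object IDs derived from the wearable device. All file names, column names, the CSV format, and the time tolerance are assumptions made for this example only; consult Description.txt for the actual data layout.

import csv

# Hypothetical file names and column layout -- see Description.txt
# of the dataset for the actual structure of the recordings.
REMOTE_FILE = "remote_detections.csv"      # assumed columns: timestamp, object_id
REFERENCE_FILE = "wearable_reference.csv"  # assumed columns: timestamp, object_id

def load_fixations(path):
    """Read (timestamp -> object_id) pairs from an assumed CSV layout."""
    fixations = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            fixations[float(row["timestamp"])] = row["object_id"]
    return fixations

def agreement_rate(predicted, reference, tol=0.05):
    """Fraction of reference fixations for which the remote system
    detected the same object within a +/- tol second time window."""
    if not predicted or not reference:
        return 0.0
    hits = 0
    for t_ref, obj_ref in reference.items():
        # Find the predicted sample closest in time to the reference sample.
        t_pred = min(predicted, key=lambda t: abs(t - t_ref))
        if abs(t_pred - t_ref) <= tol and predicted[t_pred] == obj_ref:
            hits += 1
    return hits / len(reference)

if __name__ == "__main__":
    predicted = load_fixations(REMOTE_FILE)
    reference = load_fixations(REFERENCE_FILE)
    print(f"object-of-fixation agreement: {agreement_rate(predicted, reference):.1%}")

Nearest-neighbor matching in time is only one possible association strategy; the associated paper discusses how such comparisons should be evaluated in detail.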
If you use this dataset in your research, please cite the associated publication:
Julian Schwehr, Moritz Knaust, and Volker Willert: “How to Evaluate Object-of-Fixation Detection”, IEEE Intelligent Vehicles Symposium (IV), 2019.
The paper is available at IEEE Xplore.
BibTeX:
@inproceedings{Schwehr.2019,
  author    = {Schwehr, Julian and Knaust, Moritz and Willert, Volker},
  title     = {How to Evaluate Object-of-Fixation Detection},
  booktitle = {IEEE Intelligent Vehicles Symposium (IV)},
  year      = {2019}
}
The mentioned gaze target tracking model is introduced in the paper “Multi-Hypothesis Multi-Model Driver's Gaze Target Tracking”.