dc.contributor.author  Griffith, Henry
dc.contributor.author  Katrychuk, Dmytro
dc.contributor.author  Komogortsev, Oleg
dc.date.accessioned  2019-08-30T14:42:56Z
dc.date.available  2019-08-30T14:42:56Z
dc.date.issued  2019-08-27
dc.identifier.citation  Griffith, H. K., Katrychuk, D., & Komogortsev, O. V. (2019). Assessment of shift-invariant CNN gaze mappings for PS-OG eye movement sensors. Paper accepted for publication in the 2019 OpenEDS Workshop: Eye Tracking for VR and AR at the International Conference on Computer Vision (ICCV), Seoul, South Korea.
dc.identifier.uri  https://digital.library.txstate.edu/handle/10877/8574
dc.description  This item contains supplementary material for the paper entitled "Assessment of Shift-Invariant CNN Gaze Mappings for PS-OG Eye Movement Sensors," which has been accepted in the Proceedings of the Eye Tracking for VR and AR Workshop, part of the 2019 International Conference on Computer Vision (ICCV). Full details regarding the contents of each directory and instructions for preparing the processing environment are provided in the "README.md" file within the code base directory. The up-to-date codebase is available on GitHub at the following link: https://github.com/pseudowolfvn/psog_nn/tree/etra2019  en_US
dc.description.abstract  Photosensor oculography (PS-OG) eye movement sensors offer desirable performance characteristics for integration within wireless head-mounted devices (HMDs), including low power consumption and high sampling rates. To address the known performance degradation of these sensors due to HMD shifts, various machine learning techniques have been proposed for mapping sensor outputs to gaze location. This paper advances the understanding of a recently introduced convolutional neural network designed to provide shift-invariant gaze mapping within a specified range of sensor translations. Performance is assessed for shift training examples which better reflect the distribution of values that would be generated through manual repositioning of the HMD. The network is shown to exhibit comparable accuracy for this realistic shift distribution versus a previously considered rectangular grid, thereby enhancing the feasibility of in-field initialization. In addition, this work further supports the practical viability of the proposed initialization process by demonstrating robust mapping performance versus training data scale. The ability to maintain reasonable accuracy for shifts extending beyond those introduced during training is also demonstrated.  en_US
dc.format  Text
dc.format.extent  10 pages
dc.format.medium  1 file (.pdf)
dc.publisher  IEEE Computer Society  en_US
dc.source  International Conference on Computer Vision, October 27 - November 3, 2019, Seoul, South Korea
dc.subject  Eye tracking  en_US
dc.subject  Machine learning
dc.subject  Photosensor oculography
dc.title  Assessment of Shift-Invariant CNN Gaze Mappings for PS-OG Eye Movement Sensors  en_US
txstate.documenttype  Paper
dc.description.version  This is the Author Accepted Manuscript version. Copyright © IEEE.
txstate.department  Computer Science
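
Illustrative sketch of the gaze mapping described in the abstract above: a small convolutional network that regresses a grid of photosensor readings to a 2D gaze position. The PyTorch framework, the 3x5 sensor grid, and all layer sizes below are assumptions made for illustration only; the authors' actual architecture and training pipeline are defined in the codebase linked in the description.

import torch
import torch.nn as nn

class GazeMappingCNN(nn.Module):
    """Maps a grid of PS-OG photosensor outputs to (x, y) gaze coordinates."""
    def __init__(self):
        super().__init__()
        # Assumed input: a 1-channel 3x5 grid of photosensor readings.
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        self.regressor = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 3 * 5, 64),
            nn.ReLU(),
            nn.Linear(64, 2),  # predicted (x, y) gaze position
        )

    def forward(self, x):
        return self.regressor(self.features(x))

# Usage: a batch of simulated sensor frames -> predicted gaze positions.
model = GazeMappingCNN()
sensors = torch.randn(8, 1, 3, 5)  # hypothetical batch of 8 sensor grids
gaze = model(sensors)              # shape: (8, 2)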

