SwePub
Search the SwePub database



Search: WFRF:(Pelz Jeff B.)

  • Result 1-2 of 2
1.
  • Wang, Dong, et al. (author)
  • A study of artificial eyes for the measurement of precision in eye-trackers
  • 2017
  • In: Behavior Research Methods. - : Springer Science and Business Media LLC. - 1554-3528. ; 49:3, s. 947-959
  • Journal article (peer-reviewed), abstract:
    • The precision of an eye-tracker is critical to the correct identification of eye movements and their properties. To measure a system's precision, artificial eyes (AEs) are often used to exclude eye movements from influencing the measurements. A possible issue, however, is that it is virtually impossible to construct AEs complex enough to fully represent the human eye. To examine the consequences of this limitation, we tested currently used AEs from three eye-tracker manufacturers and compared them to a more complex model, using 12 commercial eye-trackers. Because precision can be measured in various ways, we compared different metrics in the spatial domain and analyzed the power spectral densities in the frequency domain. To assess how precision measurements compare in artificial and human eyes, we also measured precision using human recordings on the same eye-trackers. Our results show that the modified eye model presented can cope with all eye-trackers tested and is a promising candidate for further development into a set of AEs with varying pupil size and pupil–iris contrast. The spectral analysis of both the AE and human data revealed that human eye data contain frequency components that likely reflect the physiological characteristics of human eye movements. We also report the effects of sample-selection methods on precision calculations. This study is part of the EMRA/COGAIN Eye Data Quality Standardization Project.
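The abstract above compares several spatial-domain precision metrics. Two metrics widely used for eye-tracker precision, RMS of sample-to-sample distances and the standard deviation of samples around their centroid, can be sketched as follows. This is a minimal illustration, not the paper's own code, and it assumes gaze coordinates have already been converted to degrees of visual angle:

```python
import numpy as np

def rms_s2s_precision(x, y):
    """RMS of sample-to-sample angular distances (deg): sensitive to
    high-frequency, sample-to-sample noise."""
    d = np.hypot(np.diff(x), np.diff(y))
    return np.sqrt(np.mean(d ** 2))

def std_precision(x, y):
    """Standard deviation of samples around their centroid (deg):
    sensitive to slow drift as well as noise."""
    return np.sqrt(np.var(x) + np.var(y))
```

Because the two metrics weight frequencies differently (RMS-S2S emphasizes fast noise, STD also captures drift), they can rank the same recordings differently, which is one reason the paper compares multiple metrics.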
2.
  • Wang, Dong, et al. (author)
  • Characterization and reconstruction of VOG noise with power spectral density analysis
  • 2016
  • In: Proceedings - ETRA 2016: 2016 ACM Symposium on Eye Tracking Research and Applications. - New York, NY, USA : ACM. - 9781450341257 ; 14, s. 217-220
  • Conference paper (peer-reviewed), abstract:
    • Characterizing noise in eye movement data is important for data analysis, as well as for comparing research results across systems. We present a method that characterizes and reconstructs the noise in eye movement data from video-oculography (VOG) systems, taking into account the uneven sampling of real recordings caused by track loss and inherent system features. The proposed method extends the Lomb-Scargle periodogram, which is used for the estimation of the power spectral density (PSD) of unevenly sampled data [Hocke and Kämpfer 2009]. We estimate the PSD of fixational eye movement data and reconstruct the noise by applying a random phase to the inverse Fourier transform, so that the reconstructed signal retains the amplitude of the original noise at each frequency. We apply this method to the EMRA/COGAIN Eye Data Quality Standardization project's dataset, which includes recordings from 11 commercially available VOG systems and a Dual Purkinje Image (DPI) eye tracker. The reconstructed noise from each VOG system was superimposed onto the DPI data, and the resulting eye movement measures from the same original behaviors were compared.
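The pipeline described in the abstract (Lomb-Scargle PSD estimate of unevenly sampled noise, then random phases applied before the inverse Fourier transform) can be sketched roughly as below. This is a simplified illustration under stated assumptions, not the authors' implementation: it evaluates the periodogram on an evenly spaced rfft frequency grid and synthesizes evenly sampled output noise, and the function and parameter names are mine:

```python
import numpy as np
from scipy.signal import lombscargle

def reconstruct_noise(t, x, n_out, fs, seed=0):
    """Sketch of PSD-matched noise reconstruction: estimate a
    Lomb-Scargle periodogram of the unevenly sampled signal x(t),
    keep the per-frequency amplitude, randomize the phases, and
    invert back to an evenly sampled time series of n_out samples."""
    rng = np.random.default_rng(seed)
    n_freq = n_out // 2
    f = np.arange(1, n_freq + 1) * fs / n_out            # Hz, rfft grid (no DC)
    pgram = lombscargle(t, x - x.mean(), 2 * np.pi * f)  # freqs in rad/s
    amp = np.sqrt(pgram)                                 # amplitude per frequency
    phase = rng.uniform(0.0, 2 * np.pi, n_freq)          # random phase per bin
    spec = np.zeros(n_freq + 1, dtype=complex)           # DC bin left at zero
    spec[1:] = amp * np.exp(1j * phase)
    return np.fft.irfft(spec, n=n_out)
```

Randomizing only the phases preserves the amplitude spectrum, so the synthesized noise matches the estimated PSD shape while producing a new realization that can be superimposed onto a cleaner reference recording, such as the DPI data mentioned above.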
Type of publication
conference paper (1)
journal article (1)
Type of content
peer-reviewed (2)
Author/Editor
Wang, Dong (2)
Pelz, Jeff B. (2)
Holmqvist, Kenneth (1)
Mulvey, Fiona (1)
Mulvey, Fiona B. (1)
University
Lund University (2)
Language
English (2)
Research subject (UKÄ/SCB)
Natural sciences (1)
Social sciences (1)
Humanities (1)

