SwePub

Result list for search: WFRF:(Hessels Roy S)

  • Results 1-29 of 29
1.
  • Abdo, A. A., et al. (author)
  • The second Fermi large area telescope catalog of gamma-ray pulsars
  • 2013
  • In: Astrophysical Journal Supplement Series. American Astronomical Society. ISSN 0067-0049, 1538-4365; 208:2, p. 17-
  • Journal article (peer-reviewed), abstract:
    • This catalog summarizes 117 high-confidence ≥0.1 GeV gamma-ray pulsar detections using three years of data acquired by the Large Area Telescope (LAT) on the Fermi satellite. Half are neutron stars discovered using LAT data through periodicity searches in gamma-ray and radio data around LAT unassociated source positions. The 117 pulsars are evenly divided into three groups: millisecond pulsars, young radio-loud pulsars, and young radio-quiet pulsars. We characterize the pulse profiles and energy spectra and derive luminosities when distance information exists. Spectral analysis of the off-peak phase intervals indicates probable pulsar wind nebula emission for four pulsars, and off-peak magnetospheric emission for several young and millisecond pulsars. We compare the gamma-ray properties with those in the radio, optical, and X-ray bands. We provide flux limits for pulsars with no observed gamma-ray emission, highlighting a small number of gamma-faint, radio-loud pulsars. The large, varied gamma-ray pulsar sample constrains emission models. Fermi's selection biases complement those of radio surveys, enhancing comparisons with predicted population distributions.
2.
  • Dunn, Matt J, et al. (author)
  • Minimal reporting guideline for research involving eye tracking (2023 edition)
  • In: Behavior Research Methods. ISSN 1554-3528.
  • Journal article (peer-reviewed), abstract:
    • A guideline is proposed that comprises the minimum items to be reported in research studies involving an eye tracker and human or non-human primate participant(s). This guideline was developed over a 3-year period using a consensus-based process via an open invitation to the international eye tracking community. This guideline will be reviewed at maximum intervals of 4 years.
3.
  • Hessels, Roy S, et al. (author)
  • Eye contact avoidance in crowds : A large wearable eye-tracking study
  • 2022
  • In: Attention, Perception & Psychophysics. Springer Science and Business Media LLC. ISSN 1943-3921, 1943-393X; 84:8, p. 2623-2640
  • Journal article (peer-reviewed), abstract:
    • Eye contact is essential for human interactions. We investigated whether humans are able to avoid eye contact while navigating crowds. At a science festival, we fitted 62 participants with a wearable eye tracker and instructed them to walk a route. Half of the participants were further instructed to avoid eye contact. We report that humans can flexibly allocate their gaze while navigating crowds and avoid eye contact primarily by orienting their head and eyes towards the floor. We discuss implications for crowd navigation and gaze behavior. In addition, we address a number of issues encountered in such field studies with regard to data quality, control of the environment, and participant adherence to instructions. We stress that methodological innovation and scientific progress are strongly interrelated.
4.
  • Hessels, Roy S., et al. (author)
  • Task-related gaze behaviour in face-to-face dyadic collaboration : Toward an interactive theory?
  • 2023
  • In: Visual Cognition. ISSN 1350-6285; 31:4, p. 291-313
  • Journal article (peer-reviewed), abstract:
    • Visual routines theory posits that vision is critical for guiding sequential actions in the world. Most studies on the link between vision and sequential action have considered individual agents, while substantial human behaviour is characterized by multi-party interaction. Here, the actions of each person may affect what the other can subsequently do. We investigated task execution and gaze allocation of 19 dyads completing a Duplo-model copying task together, while wearing the Pupil Invisible eye tracker. We varied whether all blocks were visible to both participants, and whether verbal communication was allowed. For models in which not all blocks were visible, participants seemed to coordinate their gaze: The distance between the participants' gaze positions was smaller and dyads looked longer at the model concurrently than for models in which all blocks were visible. This was most pronounced when verbal communication was allowed. We conclude that the way the collaborative task was executed depended both on whether visual information was available to both persons, and how communication took place. Modelling task structure and gaze allocation for human-human and human-robot collaboration thus requires more than the observable behaviour of either individual. We discuss whether an interactive visual routines theory ought to be pursued.
5.
6.
  • Holmqvist, Kenneth, et al. (author)
  • Eye tracking : empirical foundations for a minimal reporting guideline
  • 2023
  • In: Behavior Research Methods. Springer Science and Business Media LLC. ISSN 1554-3528; 55:1, p. 364-416
  • Journal article (peer-reviewed), abstract:
    • In this paper, we present a review of how the various aspects of any study using an eye tracker (such as the instrument, methodology, environment, participant, etc.) affect the quality of the recorded eye-tracking data and the obtained eye-movement and gaze measures. We take this review to represent the empirical foundation for reporting guidelines of any study involving an eye tracker. We compare this empirical foundation to five existing reporting guidelines and to a database of 207 published eye-tracking studies. We find that reporting guidelines vary substantially and do not match with actual reporting practices. We end by deriving a minimal, flexible reporting guideline based on empirical research (Section "An empirically based minimal reporting guideline").
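The two data quality measures at the heart of this guideline, accuracy (mean angular offset from a fixated target) and precision (RMS of sample-to-sample distances), can be computed directly from gaze samples. The following Python sketch shows one common way to do so for samples already expressed in degrees; the function names and the toy data are illustrative, not taken from the guideline itself.

```python
import math

def accuracy_deg(gaze, target):
    """Accuracy: mean Euclidean offset (deg) between gaze samples and a known target."""
    return sum(math.dist(g, target) for g in gaze) / len(gaze)

def rms_s2s_deg(gaze):
    """Precision: root-mean-square of successive sample-to-sample distances (deg)."""
    d2 = [math.dist(gaze[i - 1], gaze[i]) ** 2 for i in range(1, len(gaze))]
    return math.sqrt(sum(d2) / len(d2))

# Toy gaze samples (deg) recorded while fixating a target at (0, 0):
gaze = [(0.1, 0.0), (0.1, 0.0), (0.2, 0.0)]
print(accuracy_deg(gaze, (0.0, 0.0)))  # mean offset, about 0.13 deg
print(rms_s2s_deg(gaze))               # RMS-S2S, about 0.07 deg
```

Both measures are reported in degrees of visual angle, which is why converting raw pixel coordinates to degrees is a prerequisite for reporting them.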
7.
  • Hooge, Ignace T C, et al. (author)
  • How robust are wearable eye trackers to slow and fast head and body movements?
  • 2023
  • In: Behavior Research Methods. Springer Science and Business Media LLC. ISSN 1554-3528; 55:8
  • Journal article (peer-reviewed), abstract:
    • How well can modern wearable eye trackers cope with head and body movement? To investigate this question, we asked four participants to stand still, walk, skip, and jump while fixating a static physical target in space. We did this for six different eye trackers. All the eye trackers were capable of recording gaze during the most dynamic episodes (skipping and jumping). The accuracy became worse as movement got wilder. During skipping and jumping, the biggest error was 5.8°. However, most errors were smaller than 3°. We discuss the implications of decreased accuracy in the context of different research scenarios.
8.
  • Niehorster, Diederick C, et al. (author)
  • GlassesValidator : A data quality tool for eye tracking glasses
  • In: Behavior Research Methods. ISSN 1554-3528.
  • Journal article (peer-reviewed), abstract:
    • According to the proposal for a minimum reporting guideline for an eye tracking study by Holmqvist et al. (2022), the accuracy (in degrees) of eye tracking data should be reported. Currently, there is no easy way to determine accuracy for wearable eye tracking recordings. To enable determining the accuracy quickly and easily, we have produced a simple validation procedure using a printable poster and accompanying Python software. We tested the poster and procedure with 61 participants using one wearable eye tracker. In addition, the software was tested with six different wearable eye trackers. We found that the validation procedure can be administered within a minute per participant and provides measures of accuracy and precision. Calculating the eye-tracking data quality measures can be done offline on a simple computer and requires no advanced computer skills.
9.
  • Niehorster, Diederick C., et al. (author)
  • GlassesViewer : Open-source software for viewing and analyzing data from the Tobii Pro Glasses 2 eye tracker
  • 2020
  • In: Behavior Research Methods. Springer Science and Business Media LLC. ISSN 1554-3528.
  • Journal article (peer-reviewed), abstract:
    • We present GlassesViewer, open-source software for viewing and analyzing eye-tracking data of the Tobii Pro Glasses 2 head-mounted eye tracker as well as the scene and eye videos and other data streams (pupil size, gyroscope, accelerometer, and TTL input) that this headset can record. The software provides the following functionality written in MATLAB: (1) a graphical interface for navigating the study- and recording structure produced by the Tobii Glasses 2; (2) functionality to unpack, parse, and synchronize the various data and video streams comprising a Glasses 2 recording; and (3) a graphical interface for viewing the Glasses 2’s gaze direction, pupil size, gyroscope and accelerometer time-series data, along with the recorded scene and eye camera videos. In this latter interface, segments of data can furthermore be labeled through user-provided event classification algorithms or by means of manual annotation. Lastly, the toolbox provides integration with the GazeCode tool by Benjamins et al. (2018), enabling a completely open-source workflow for analyzing Tobii Pro Glasses 2 recordings.
10.
  • De Kloe, Yentl J.R., et al. (author)
  • Replacing eye trackers in ongoing studies : A comparison of eye-tracking data quality between the Tobii Pro TX300 and the Tobii Pro Spectrum
  • 2021
  • In: Infancy. Wiley. ISSN 1532-7078, 1525-0008.
  • Journal article (peer-reviewed), abstract:
    • The Tobii Pro TX300 is a popular eye tracker in developmental eye-tracking research, yet it is no longer manufactured. If a TX300 breaks down, it may have to be replaced. The data quality of the replacement eye tracker may differ from that of the TX300, which may affect the experimental outcome measures. This is problematic for longitudinal and multi-site studies, and for researchers replacing eye trackers between studies. We, therefore, ask how the TX300 and its successor, the Tobii Pro Spectrum, compare in terms of eye-tracking data quality. Data quality—operationalized through precision, accuracy, and data loss—was compared between eye trackers for three age groups (around 5-months, 10-months, and 3-years). Precision was better for all gaze position signals obtained with the Spectrum in comparison to the TX300. Accuracy of the Spectrum was higher for the 5-month-old and 10-month-old children. For the three-year-old children, accuracy was similar across both eye trackers. Gaze position signals from the Spectrum exhibited lower proportions of data loss, and the duration of the data loss periods tended to be shorter. In conclusion, the Spectrum produces gaze position signals with higher data quality, especially for the younger infants. Implications for data analysis are discussed.
11.
12.
  • Hessels, Roy S., et al. (author)
  • Is the eye-movement field confused about fixations and saccades? : A survey among 124 researchers
  • 2018
  • In: Royal Society Open Science. The Royal Society. ISSN 2054-5703; 5:8, p. 1-23
  • Journal article (peer-reviewed), abstract:
    • Eye movements have been extensively studied in a wide range of research fields. While new methods such as mobile eye tracking and eye tracking in virtual/augmented realities are emerging quickly, the eye-movement terminology has scarcely been revised. We assert that this may cause confusion about two of the main concepts: fixations and saccades. In this study, we assessed the definitions of fixations and saccades held in the eye-movement field, by surveying 124 eye-movement researchers. These eye-movement researchers held a variety of definitions of fixations and saccades, of which the breadth seems even wider than what is reported in the literature. Moreover, these definitions did not seem to be related to researcher background or experience. We urge researchers to make their definitions more explicit by specifying all the relevant components of the eye movement under investigation: (i) the oculomotor component: e.g. whether the eye moves slow or fast; (ii) the functional component: what purposes does the eye movement (or lack thereof) serve; (iii) the coordinate system used: relative to what does the eye move; (iv) the computational definition: how is the event represented in the eye-tracker signal. This should enable eye-movement researchers from different fields to have a discussion without misunderstandings.
13.
14.
  • Hessels, Roy S., et al. (author)
  • Noise-robust fixation detection in eye movement data : Identification by two-means clustering (I2MC)
  • 2017
  • In: Behavior Research Methods. Springer Science and Business Media LLC. ISSN 1554-3528; 49:5, p. 1802-1823
  • Journal article (peer-reviewed), abstract:
    • Eye-tracking research in infants and older children has gained a lot of momentum over the last decades. Although eye-tracking research in these participant groups has become easier with the advance of the remote eye-tracker, this often comes at the cost of poorer data quality than in research with well-trained adults (Hessels, Andersson, Hooge, Nyström, & Kemner Infancy, 20, 601-633, 2015; Wass, Forssman, & Leppänen Infancy, 19, 427-460, 2014). Current fixation detection algorithms are not built for data from infants and young children. As a result, some researchers have even turned to hand correction of fixation detections (Saez de Urabain, Johnson, & Smith Behavior Research Methods, 47, 53-72, 2015). Here we introduce a fixation detection algorithm-identification by two-means clustering (I2MC)-built specifically for data across a wide range of noise levels and when periods of data loss may occur. We evaluated the I2MC algorithm against seven state-of-the-art event detection algorithms, and report that the I2MC algorithm's output is the most robust to high noise and data loss levels. The algorithm is automatic, works offline, and is suitable for eye-tracking data recorded with remote or tower-mounted eye-trackers using static stimuli. In addition to application of the I2MC algorithm in eye-tracking research with infants, school children, and certain patient groups, the I2MC algorithm also may be useful when the noise and data loss levels are markedly different between trials, participants, or time points (e.g., longitudinal research).
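The clustering idea behind I2MC can be illustrated with a toy sketch: within a moving window, gaze positions are clustered into two groups, and a window whose two cluster means lie far apart signals a saccade between two fixations. Below is a minimal 1-D version in Python. It illustrates only the two-means step; the published I2MC algorithm additionally downsamples the signal, aggregates clustering weights across many overlapping windows, and handles periods of data loss.

```python
def two_means_1d(x, iters=20):
    """Cluster a 1-D window of gaze positions into two groups (toy 2-means)."""
    c0, c1 = min(x), max(x)  # initialize the two centroids at the extremes
    for _ in range(iters):
        a = [v for v in x if abs(v - c0) <= abs(v - c1)]  # samples nearer c0
        b = [v for v in x if abs(v - c0) > abs(v - c1)]   # samples nearer c1
        if not a or not b:
            break
        c0, c1 = sum(a) / len(a), sum(b) / len(b)  # update centroids
    return [0 if abs(v - c0) <= abs(v - c1) else 1 for v in x]

# A window spanning a saccade: samples near 0 deg, then near 5 deg.
print(two_means_1d([0.0, 0.1, 0.05, 5.0, 5.1, 4.9]))  # [0, 0, 0, 1, 1, 1]
```

The label transition marks the candidate fixation boundary; because the split is driven by cluster means rather than a fixed velocity threshold, it tolerates the high noise levels typical of infant data.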
15.
  • Hooge, Ignace, et al. (author)
  • Is human classification by experienced untrained observers a gold standard in fixation detection?
  • 2018
  • In: Behavior Research Methods. Springer Science and Business Media LLC. ISSN 1554-3528; 50:5, p. 1864-1881
  • Journal article (peer-reviewed), abstract:
    • Despite early reports and the contemporary consensus on microsaccades as purely binocular phenomena, recent work has proposed not only the existence of monocular microsaccades, but also that they serve functional purposes. We take a critical look at the detection of monocular microsaccades from a signal perspective, using raw data and a state-of-the-art, video-based eye tracker. In agreement with previous work, monocular detections were present in all participants using a standard microsaccade detection algorithm. However, a closer look at the raw data invalidates the vast majority of monocular detections. These results again raise the question of the existence of monocular microsaccades, as well as the need for improved methods to study small eye movements recorded with video-based eye trackers.
16.
  • Hooge, Ignace T.C., et al. (author)
  • Do pupil-based binocular video eye trackers reliably measure vergence?
  • 2019
  • In: Vision Research. Elsevier BV. ISSN 0042-6989; 156, p. 1-9
  • Journal article (peer-reviewed), abstract:
    • A binocular eye tracker needs to be accurate to enable the determination of vergence, distance to the binocular fixation point and fixation disparity. These measures are useful in e.g. the research fields of visual perception, binocular control in reading and attention in 3D. Are binocular pupil-based video eye trackers accurate enough to produce meaningful binocular measures? Recent research revealed potentially large idiosyncratic systematic errors due to pupil-size changes. With a top of the line eye tracker (SR Research EyeLink 1000 plus), we investigated whether the pupil-size artefact in the separate eyes may cause the eye tracker to report apparent vergence when the eyeballs do not rotate. Participants were asked to fixate a target at a distance of 77 cm for 160 s. We evoked pupil-size changes by varying the light intensity. With increasing pupil size, horizontal vergence reported by the eye tracker decreased in most subjects, up to two degrees. However, this was not due to a rotation of the eyeballs, as identified from the absence of systematic movement in the corneal reflection (CR) signals. From this, we conclude that binocular pupil-CR or pupil-only video eye trackers using the dark pupil technique are not accurate enough to be used to determine vergence, distance to the binocular fixation point and fixation disparity.
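The geometry behind these vergence measurements is simple: for both eyes fixating a point straight ahead at distance d, the vergence angle is 2·atan(IPD/2 / d), where IPD is the interocular distance. A quick Python check (the 6.3 cm IPD is an assumed typical value, not stated in the abstract; 77 cm is the viewing distance from the study) shows why an apparent drift of up to two degrees is severe:

```python
import math

def vergence_deg(ipd_cm, distance_cm):
    """Vergence angle (deg) when both eyes fixate a point straight ahead."""
    return 2 * math.degrees(math.atan((ipd_cm / 2) / distance_cm))

# Assumed typical interocular distance of 6.3 cm at the study's 77 cm distance:
v = vergence_deg(6.3, 77.0)
print(round(v, 2))  # about 4.69 deg
```

An artefactual change of two degrees is thus a large fraction of the true vergence angle at this distance, which is why the pupil-size artefact rules out meaningful vergence estimates from these trackers.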
17.
  • Hooge, Ignace T. C., et al. (author)
  • Fixation classification: how to merge and select fixation candidates
  • 2022
  • In: Behavior Research Methods. Springer Science and Business Media LLC. ISSN 1554-3528; 54:6, p. 2765-2776
  • Journal article (peer-reviewed), abstract:
    • Eye trackers are applied in many research fields (e.g., cognitive science, medicine, marketing research). To give meaning to the eye-tracking data, researchers have a broad choice of classification methods to extract various behaviors (e.g., saccade, blink, fixation) from the gaze signal. There is extensive literature about the different classification algorithms. Surprisingly, not much is known about the effect of fixation and saccade selection rules that are usually (implicitly) applied. We want to answer the following question: What is the impact of the selection-rule parameters (minimal saccade amplitude and minimal fixation duration) on the distribution of fixation durations? To answer this question, we used eye-tracking data with high and low quality and seven different classification algorithms. We conclude that selection rules play an important role in merging and selecting fixation candidates. For eye-tracking data with good-to-moderate precision (RMSD < 0.5°), the classification algorithm of choice does not matter too much as long as it is sensitive enough and is followed by a rule that selects saccades with amplitudes larger than 1.0° and a rule that selects fixations with duration longer than 60 ms. Because of the importance of selection, researchers should always report whether they performed selection and the values of their parameters.
18.
19.
  • Hooge, Ignace T C, et al. (author)
  • Large eye-head gaze shifts measured with a wearable eye tracker and an industrial camera
  • In: Behavior Research Methods. ISSN 1554-3528.
  • Journal article (peer-reviewed), abstract:
    • We built a novel setup to record large gaze shifts (up to 140°). The setup consists of a wearable eye tracker and a high-speed camera with fiducial marker technology to track the head. We tested our setup by replicating findings from the classic eye-head gaze shift literature. We conclude that our new inexpensive setup is good enough to investigate the dynamics of large eye-head gaze shifts. This novel setup could be used for future research on large eye-head gaze shifts, but also for research on gaze during, e.g., human interaction. We further discuss reference frames and terminology in head-free eye tracking. Despite a transition from head-fixed eye tracking to head-free gaze tracking, researchers still use head-fixed eye movement terminology when discussing world-fixed gaze phenomena. We propose to use more specific terminology for world-fixed phenomena, including gaze fixation, gaze pursuit, and gaze saccade.
20.
  • Hooge, Ignace T C, et al. (author)
  • The pupil-size artefact (PSA) across time, viewing direction, and different eye trackers
  • 2021
  • In: Behavior Research Methods. Springer Science and Business Media LLC. ISSN 1554-3528; 53:5, p. 1986-2006
  • Journal article (peer-reviewed), abstract:
    • The pupil size artefact (PSA) is the gaze deviation reported by an eye tracker during pupil size changes if the eye does not rotate. In the present study, we ask three questions: 1) how stable is the PSA over time, 2) does the PSA depend on properties of the eye-tracker setup, and 3) does the PSA depend on the participants' viewing direction? We found that the PSA is very stable over time for periods as long as 1 year, but may differ between participants. When comparing the magnitude of the PSA between eye trackers, we found the magnitude of the obtained PSA to be related to the direction of the eye-tracker-camera axis, suggesting that the angle between the participants' viewing direction and the camera axis affects the PSA. We then investigated the PSA as a function of the participants' viewing direction. The PSA was non-zero for viewing direction 0° and depended on the viewing direction. These findings corroborate the suggestion by Choe et al. (Vision Research 118(6755):48-59, 2016), that the PSA can be described by an idiosyncratic and a viewing direction-dependent component. Based on a simulation, we cannot claim that the viewing direction-dependent component of the PSA is caused by the optics of the cornea.
21.
  • Niehorster, Diederick C., et al. (author)
  • The impact of slippage on the data quality of head-worn eye trackers
  • 2020
  • In: Behavior Research Methods. Springer Science and Business Media LLC. ISSN 1554-3528.
  • Journal article (peer-reviewed), abstract:
    • Mobile head-worn eye trackers allow researchers to record eye-movement data as participants freely move around and interact with their surroundings. However, participant behavior may cause the eye tracker to slip on the participant’s head, potentially strongly affecting data quality. To investigate how this eye-tracker slippage affects data quality, we designed experiments in which participants mimic behaviors that can cause a mobile eye tracker to move. Specifically, we investigated data quality when participants speak, make facial expressions, and move the eye tracker. Four head-worn eye-tracking setups were used: (i) Tobii Pro Glasses 2 in 50 Hz mode, (ii) SMI Eye Tracking Glasses 2.0 60 Hz, (iii) Pupil-Labs’ Pupil in 3D mode, and (iv) Pupil-Labs’ Pupil with the Grip gaze estimation algorithm as implemented in the EyeRecToo software. Our results show that whereas gaze estimates of the Tobii and Grip remained stable when the eye tracker moved, the other systems exhibited significant errors (0.8–3.1° increase in gaze deviation over baseline) even for the small amounts of glasses movement that occurred during the speech and facial expressions tasks. We conclude that some of the tested eye-tracking setups may not be suitable for investigating gaze behavior when high accuracy is required, such as during face-to-face interaction scenarios. We recommend that users of mobile head-worn eye trackers perform similar tests with their setups to become aware of their characteristics. This will enable researchers to design experiments that are robust to the limitations of their particular eye-tracking setup.
22.
23.
  • Niehorster, Diederick C, et al. (author)
  • What to expect from your remote eye-tracker when participants are unrestrained
  • 2018
  • In: Behavior Research Methods. Springer Science and Business Media LLC. ISSN 1554-3528; 50:1, p. 213-227
  • Journal article (peer-reviewed), abstract:
    • The marketing materials of remote eye-trackers suggest that data quality is invariant to the position and orientation of the participant as long as the eyes of the participant are within the eye-tracker's headbox, the area where tracking is possible. As such, remote eye-trackers are marketed as allowing the reliable recording of gaze from participant groups that cannot be restrained, such as infants, schoolchildren and patients with muscular or brain disorders. Practical experience and previous research, however, tell us that eye-tracking data quality, e.g. the accuracy of the recorded gaze position and the amount of data loss, deteriorates (compared to well-trained participants in chinrests) when the participant is unrestrained and assumes a non-optimal pose in front of the eye-tracker. How then can researchers working with unrestrained participants choose an eye-tracker? Here we investigated the performance of five popular remote eye-trackers from EyeTribe, SMI, SR Research, and Tobii in a series of tasks where participants took on non-optimal poses. We report that the tested systems varied in the amount of data loss and systematic offsets observed during our tasks. The EyeLink and EyeTribe in particular had large problems. Furthermore, the Tobii eye-trackers reported data for two eyes when only one eye was visible to the eye-tracker. This study provides practical insight into how popular remote eye-trackers perform when recording from unrestrained participants. It furthermore provides a testing method for evaluating whether a tracker is suitable for studying a certain target population, one that manufacturers can also use during the development of new eye-trackers.
24.
  • Nyström, Marcus, et al. (author)
  • The amplitude of small eye movements can be accurately estimated with video-based eye trackers
  • 2023
  • In: Behavior Research Methods. Springer Science and Business Media LLC. ISSN 1554-3528; 55:2, p. 657-669
  • Journal article (peer-reviewed), abstract:
    • Estimating the gaze direction with a digital video-based pupil and corneal reflection (P-CR) eye tracker is challenging partly since a video camera is limited in terms of spatial and temporal resolution, and because the captured eye images contain noise. Through computer simulation, we evaluated the localization accuracy of pupil and CR centers in the eye image for small eye rotations (≪ 1 deg). Results highlight how inaccuracies in center localization are related to 1) how many pixels the pupil and CR span in the eye camera image, 2) the method to compute the center of the pupil and CRs, and 3) the level of image noise. Our results provide a possible explanation for why the amplitude of small saccades may not be accurately estimated by many currently used video-based eye trackers. We conclude that eye movements with arbitrarily small amplitudes can be accurately estimated using the P-CR eye-tracking principle given that the level of image noise is low and the pupil and CR span enough pixels in the eye camera, or if localization of the CR is based on the intensity values in the eye image instead of a binary representation.
25.
  • Nyström, Marcus, et al. (author)
  • What is a blink? : Classifying and characterizing blinks in eye openness signals
  • In: Behavior Research Methods. ISSN 1554-3528.
  • Journal article (peer-reviewed), abstract:
    • Blinks, the closing and opening of the eyelids, are used in a wide array of fields where human function and behavior are studied. In data from video-based eye trackers, blink rate and duration are often estimated from the pupil-size signal. However, blinks and their parameters can be estimated only indirectly from this signal, since it does not explicitly contain information about the eyelid position. We ask whether blinks detected from an eye openness signal that estimates the distance between the eyelids (EO blinks) are comparable to blinks detected with a traditional algorithm using the pupil-size signal (PS blinks) and how robust blink detection is when data quality is low. In terms of rate, there was an almost-perfect overlap between EO and PS blinks (F1 score: 0.98) when the head was in the center of the eye tracker's tracking range where data quality was high, and a high overlap (F1 score: 0.94) when the head was at the edge of the tracking range where data quality was worse. When there was a difference in blink rate between EO and PS blinks, it was mainly due to data loss in the pupil-size signal. Blink durations were about 60 ms longer in EO blinks compared to PS blinks. Moreover, the dynamics of EO blinks were similar to results from previous literature. We conclude that the eye openness signal together with our proposed blink detection algorithm provides an advantageous method to detect and describe blinks in greater detail.
26.
  • Valtakari, Niilo V., et al. (author)
  • A field test of computer-vision-based gaze estimation in psychology
  • 2024
  • In: Behavior Research Methods. Springer. ISSN 1554-351X, 1554-3528; 56:3, p. 1900-1915
  • Journal article (peer-reviewed), abstract:
    • Computer-vision-based gaze estimation refers to techniques that estimate gaze direction directly from video recordings of the eyes or face without the need for an eye tracker. Although many such methods exist, their validation is often found in the technical literature (e.g., computer science conference papers). We aimed to (1) identify which computer-vision-based gaze estimation methods are usable by the average researcher in fields such as psychology or education, and (2) evaluate these methods. We searched for methods that do not require calibration and have clear documentation. Two toolkits, OpenFace and OpenGaze, were found to fulfill these criteria. First, we present an experiment where adult participants fixated on nine stimulus points on a computer screen. We filmed their face with a camera and processed the recorded videos with OpenFace and OpenGaze. We conclude that OpenGaze is accurate and precise enough to be used in screen-based experiments with stimuli separated by at least 11 degrees of gaze angle. OpenFace was not sufficiently accurate for such situations but can potentially be used in sparser environments. We then examined whether OpenFace could be used with horizontally separated stimuli in a sparse environment with infant participants. We compared dwell measures based on OpenFace estimates to the same measures based on manual coding. We conclude that OpenFace gaze estimates may potentially be used with measures such as relative total dwell time to sparse, horizontally separated areas of interest, but should not be used to draw conclusions about measures such as dwell duration.
27.
  • Valtakari, Niilo V., et al. (author)
  • Eye tracking in human interaction : Possibilities and limitations
  • 2021
  • In: Behavior Research Methods. Springer Nature. ISSN 1554-351X, 1554-3528; 53:4, p. 1592-1608
  • Journal article (peer-reviewed), abstract:
    • There is a long history of interest in looking behavior during human interaction. With the advance of (wearable) video-based eye trackers, it has become possible to measure gaze during many different interactions. We outline the different types of eye-tracking setups that currently exist to investigate gaze during interaction. The setups differ mainly with regard to the nature of the eye-tracking signal (head- or world-centered) and the freedom of movement allowed for the participants. These features place constraints on the research questions that can be answered about human interaction. We end with a decision tree to help researchers judge the appropriateness of specific setups.
28.
  • Valtakari, Niilo V., et al. (author)
  • Eye Tracking in Human Interaction : Possibilities and Limitations
  • 2020
  • In: Companion Publication of the 2020 International Conference on Multimodal Interaction (ICMI '20 Companion). New York, NY, USA: Association for Computing Machinery (ACM). ISBN 9781450380027; p. 508-508
  • Conference paper (peer-reviewed)
29.
  • Viktorsson, Charlotte, et al. (author)
  • Stable eye versus mouth preference in a live speech-processing task
  • 2023
  • In: Scientific Reports. Springer. ISSN 2045-2322; 13:1
  • Journal article (peer-reviewed), abstract:
    • Looking at the mouth region is thought to be a useful strategy for speech-perception tasks. The tendency to look at the eyes versus the mouth of another person during speech processing has thus far mainly been studied using screen-based paradigms. In this study, we estimated the eye-mouth-index (EMI) of 38 adult participants in a live setting. Participants were seated across the table from an experimenter, who read sentences out loud for the participant to remember in both a familiar (English) and unfamiliar (Finnish) language. No statistically significant difference in the EMI between the familiar and the unfamiliar languages was observed. Total relative looking time at the mouth also did not predict the number of correctly identified sentences. Instead, we found that the EMI was higher during an instruction phase than during the speech-processing task. Moreover, we observed high intra-individual correlations in the EMI across the languages and different phases of the experiment. We conclude that there are stable individual differences in looking at the eyes versus the mouth of another person. Furthermore, this behavior appears to be flexible and dependent on the requirements of the situation (speech processing or not).