SwePub

Results list for the search "WFRF:(Kasneci Enkelejda)"

Search: WFRF:(Kasneci Enkelejda)

  • Results 1-7 of 7
1.
  • Byrne, Sean Anthony, et al. (author)
  • From Lenses to Living Rooms : A Policy Brief on Eye Tracking in XR Before the Impending Boom
  • 2024
  • In: 2024 IEEE International Conference on Artificial Intelligence and eXtended and Virtual Reality, AIxVR 2024. ISBN 9798350372021, pp. 90-96
  • Conference paper (peer-reviewed), abstract:
    • As tech giants such as Apple and Meta invest heavily in Virtual and Augmented Reality (VR/AR) technologies, often collectively termed Extended Reality (XR) devices, a significant societal concern emerges: The use of eye-tracking technology within these devices. Gaze data holds immense value, revealing insights into user attention, health, and cognitive states. This raises substantial concerns over privacy and fairness, with potential risks of targeted ads, unauthorized surveillance, and data re-purposing. As the impact of eye tracking is due to transition from the lab to the broader public, this paper underscores these pivotal issues in a digestible manner to the general audience. To this end, we first outline the eye-tracking data collection process and its potential for user insights. Second, we introduce perspectives from the domain of privacy, emphasizing its significance as a pivotal measure to guard against the improper use of eye-tracking data. Third, we provide a set of guidelines created by researchers actively working within this space. These recommendations are designed to guide policymakers and the general public toward establishing informed, equitable, and privacy-centric standards surrounding these devices.
2.
  • Byrne, Sean Anthony, et al. (author)
  • Precise localization of corneal reflections in eye images using deep learning trained on synthetic data
  • In: Behavior Research Methods. ISSN 1554-3528.
  • Journal article (peer-reviewed), abstract:
    • We present a deep learning method for accurately localizing the center of a single corneal reflection (CR) in an eye image. Unlike previous approaches, we use a convolutional neural network (CNN) that was trained solely using synthetic data. Using only synthetic data has the benefit of completely sidestepping the time-consuming process of manual annotation that is required for supervised training on real eye images. To systematically evaluate the accuracy of our method, we first tested it on images with synthetic CRs placed on different backgrounds and embedded in varying levels of noise. Second, we tested the method on two datasets consisting of high-quality videos captured from real eyes. Our method outperformed state-of-the-art algorithmic methods on real eye images with a 3-41.5% reduction in terms of spatial precision across data sets, and performed on par with state-of-the-art on synthetic images in terms of spatial accuracy. We conclude that our method provides a precise method for CR center localization and provides a solution to the data availability problem, which is one of the important common roadblocks in the development of deep learning models for gaze estimation. Due to the superior CR center localization and ease of application, our method has the potential to improve the accuracy and precision of CR-based eye trackers.
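A minimal sketch of the evaluation idea in the abstract above: because a synthetic corneal reflection is rendered at a known position, localization error can be measured exactly. The Gaussian spot model, image size, noise level, and the naive argmax detector below are illustrative assumptions, not the authors' pipeline.

import numpy as np

def synthetic_cr_image(center, size=(240, 320), sigma=2.0, noise_std=0.05, rng=None):
    """Render a Gaussian bright spot (a stand-in for a corneal reflection)
    at a known (x, y) center and add Gaussian pixel noise."""
    rng = np.random.default_rng() if rng is None else rng
    ys, xs = np.mgrid[0:size[0], 0:size[1]]
    cx, cy = center
    spot = np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2 * sigma ** 2))
    return np.clip(spot + rng.normal(0.0, noise_std, size), 0.0, 1.0)

def localization_error(pred_center, true_center):
    """Euclidean distance in pixels between predicted and true CR centers."""
    return float(np.hypot(pred_center[0] - true_center[0],
                          pred_center[1] - true_center[1]))

# Evaluate a naive argmax-based detector on one noisy synthetic image.
rng = np.random.default_rng(0)
true_center = (160.3, 120.7)                        # known by construction
img = synthetic_cr_image(true_center, noise_std=0.1, rng=rng)
y, x = np.unravel_index(np.argmax(img), img.shape)  # pixel-level estimate
print(localization_error((x, y), true_center))      # error in pixels

A learned detector would slot in where the argmax estimate is computed here; sweeping noise_std and the background reproduces the kind of comparison the abstract describes.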
3.
  • Holmqvist, Kenneth, et al. (author)
  • Eye tracking : empirical foundations for a minimal reporting guideline
  • 2023
  • In: Behavior Research Methods. Springer Science and Business Media LLC. ISSN 1554-3528. 55(1), pp. 364-416
  • Journal article (peer-reviewed), abstract:
    • In this paper, we present a review of how the various aspects of any study using an eye tracker (such as the instrument, methodology, environment, participant, etc.) affect the quality of the recorded eye-tracking data and the obtained eye-movement and gaze measures. We take this review to represent the empirical foundation for reporting guidelines of any study involving an eye tracker. We compare this empirical foundation to five existing reporting guidelines and to a database of 207 published eye-tracking studies. We find that reporting guidelines vary substantially and do not match with actual reporting practices. We end by deriving a minimal, flexible reporting guideline based on empirical research (Section "An empirically based minimal reporting guideline").
4.
  • Maquiling, Virmarie, et al. (author)
  • V-ir-Net : A Novel Neural Network for Pupil and Corneal Reflection Detection trained on Simulated Light Distributions
  • 2023
  • In: MobileHCI '23 Companion: Proceedings of the 25th International Conference on Mobile Human-Computer Interaction. ISBN 9781450399241, pp. 1-7
  • Conference paper (peer-reviewed), abstract:
    • Deep learning has shown promise for gaze estimation in Virtual Reality (VR) and other head-mounted applications, but such models are hard to train due to lack of available data. Here we introduce a novel method to train neural networks for gaze estimation using synthetic images that model the light distributions captured in a P-CR setup. We tested our model on a dataset of real eye images from a VR setup, achieving 76% accuracy which is close to the state-of-the-art model which was trained on the dataset itself. The localization error for CRs was 1.56 pixels and 2.02 pixels for the pupil, which is on par with state-of-the-art. Our approach allowed inference on the whole dataset without sacrificing data for model training. Our method provides a cost-efficient and lightweight training alternative, eliminating the need for hand-labeled data. It offers flexible customization, e.g. adapting to different illuminator configurations, with minimal code changes.
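As a rough illustration of what "synthetic images that model the light distributions in a P-CR setup" can look like, the toy renderer below draws a dark circular pupil on a uniform background plus one Gaussian bright spot per illuminator. The intensity levels, spot shape, and two-illuminator layout are assumptions for illustration, not the paper's simulation model.

import numpy as np

def render_eye_image(pupil_center, pupil_radius, cr_centers,
                     size=(240, 320), cr_sigma=2.0, bg_level=0.55,
                     pupil_level=0.05, noise_std=0.02, rng=None):
    """Toy eye-image model: uniform background, dark circular pupil,
    and one bright Gaussian corneal reflection per illuminator."""
    rng = np.random.default_rng() if rng is None else rng
    ys, xs = np.mgrid[0:size[0], 0:size[1]]
    img = np.full(size, bg_level)
    px, py = pupil_center
    img[(xs - px) ** 2 + (ys - py) ** 2 <= pupil_radius ** 2] = pupil_level
    for cx, cy in cr_centers:  # one reflection per illuminator
        img += np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2 * cr_sigma ** 2))
    return np.clip(img + rng.normal(0.0, noise_std, size), 0.0, 1.0)

# A hypothetical two-illuminator configuration; changing cr_centers mimics
# adapting the generator to a different illuminator layout.
img = render_eye_image(pupil_center=(160, 120), pupil_radius=25,
                       cr_centers=[(150, 110), (172, 112)])
print(img.shape, float(img.min()), float(img.max()))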
5.
  • Navarro, Diego (author)
  • Biofeedback Interaction : Applying Physiological Methods to Entertainment Video Games
  • 2020
  • Licentiate thesis (other academic/artistic), abstract:
    • Biofeedback interaction offers interesting opportunities for video games since it allows player physiological information to be used in novel interaction techniques. Despite several contributions in the area, biofeedback interaction faces a set of challenges relating to its design and implementation. First, it has mainly been used as a method to replace more traditional interaction devices, such as gamepads, mice or keyboards. Also, few of the previous interaction techniques have made essential use of physiological data: exploring possibilities that could only be developed by involving physiological inputs. This dissertation explores how different physiological methods, such as electroencephalography, eye tracking, electrocardiography, electrodermal activity, or electromyography, could be used in the design and development of natural user interaction techniques that might be applied to entertainment video games, highlighting technical details for the appropriate use of physiological signals. The research also discusses interaction design principles from a human-computer interaction perspective, evaluates several novel biofeedback interaction techniques with a set of user studies, and proposes ethical considerations for the appropriate exposure to virtual reality and physiological sensor technology. Results show that the use of biofeedback inputs in novel interaction techniques varies in complexity and functionality depending on the type of measurements used. They also show that biofeedback interaction can positively affect player experience, since it allows games and virtual reality applications to synchronize with player physiology, making playing games a personalized experience. Results also highlighted that biofeedback interaction can significantly affect player performance, with this effect influenced by the interaction complexity and the reliability of the sensor technology used.
6.
  • Niehorster, Diederick C., et al. (author)
  • The impact of slippage on the data quality of head-worn eye trackers
  • 2020
  • In: Behavior Research Methods. Springer Science and Business Media LLC. ISSN 1554-3528.
  • Journal article (peer-reviewed), abstract:
    • Mobile head-worn eye trackers allow researchers to record eye-movement data as participants freely move around and interact with their surroundings. However, participant behavior may cause the eye tracker to slip on the participant’s head, potentially strongly affecting data quality. To investigate how this eye-tracker slippage affects data quality, we designed experiments in which participants mimic behaviors that can cause a mobile eye tracker to move. Specifically, we investigated data quality when participants speak, make facial expressions, and move the eye tracker. Four head-worn eye-tracking setups were used: (i) Tobii Pro Glasses 2 in 50 Hz mode, (ii) SMI Eye Tracking Glasses 2.0 60 Hz, (iii) Pupil-Labs’ Pupil in 3D mode, and (iv) Pupil-Labs’ Pupil with the Grip gaze estimation algorithm as implemented in the EyeRecToo software. Our results show that whereas gaze estimates of the Tobii and Grip remained stable when the eye tracker moved, the other systems exhibited significant errors (0.8–3.1° increase in gaze deviation over baseline) even for the small amounts of glasses movement that occurred during the speech and facial expressions tasks. We conclude that some of the tested eye-tracking setups may not be suitable for investigating gaze behavior when high accuracy is required, such as during face-to-face interaction scenarios. We recommend that users of mobile head-worn eye trackers perform similar tests with their setups to become aware of their characteristics. This will enable researchers to design experiments that are robust to the limitations of their particular eye-tracking setup.
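The "increase in gaze deviation over baseline" reported above can be made concrete: gaze accuracy is the angle between the estimated gaze direction and the direction to a known target, and the slippage effect is the change in that angle relative to a baseline recording. The vectors and numbers below are invented for illustration and are not the study's data.

import numpy as np

def angular_deviation_deg(gaze_dirs, target_dirs):
    """Per-sample angle in degrees between gaze and target direction vectors.
    Both inputs are (N, 3) arrays; vectors need not be pre-normalized."""
    g = gaze_dirs / np.linalg.norm(gaze_dirs, axis=1, keepdims=True)
    t = target_dirs / np.linalg.norm(target_dirs, axis=1, keepdims=True)
    cos = np.clip(np.sum(g * t, axis=1), -1.0, 1.0)
    return np.degrees(np.arccos(cos))

# Hypothetical samples: a baseline fixation vs. after the glasses are nudged.
targets  = np.array([[0.00, 0.00, 1.0], [0.00, 0.00, 1.0]])
baseline = angular_deviation_deg(np.array([[0.01, 0.00, 1.0],
                                           [0.00, 0.01, 1.0]]), targets)
moved    = angular_deviation_deg(np.array([[0.05, 0.02, 1.0],
                                           [0.04, 0.03, 1.0]]), targets)
print("increase over baseline (deg):", moved.mean() - baseline.mean())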
7.
  • Santini, Thiago, et al. (author)
  • Get a grip : Slippage-robust and glint-free gaze estimation for real-time pervasive head-mounted eye tracking
  • 2019
  • In: ETRA 2019: Proceedings of the 2019 ACM Symposium on Eye Tracking Research and Applications. New York, NY, USA: ACM. ISBN 9781450367097
  • Conference paper (peer-reviewed), abstract:
    • A key assumption conventionally made by flexible head-mounted eye-tracking systems is often invalid: The eye center does not remain stationary w.r.t. the eye camera due to slippage. For instance, eye-tracker slippage might happen due to head acceleration or explicit adjustments by the user. As a result, gaze estimation accuracy can be significantly reduced. In this work, we propose Grip, a novel gaze estimation method capable of instantaneously compensating for eye-tracker slippage without additional hardware requirements such as glints or stereo eye camera setups. Grip was evaluated using previously collected data from a large scale unconstrained pervasive eye-tracking study. Our results indicate significant slippage compensation potential, decreasing average participant median angular offset by more than 43% w.r.t. a non-slippage-robust gaze estimation method. A reference implementation of Grip was integrated into EyeRecToo, an open-source hardware-agnostic eye-tracking software, thus making it readily accessible for multiple eye trackers (Available at: www.ti.uni-tuebingen.de/perception).
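The headline metric above, the average across participants of each participant's median angular offset and its relative reduction, is simple to compute once per-sample offsets are available. All numbers below are invented for illustration; only the form of the metric follows the abstract.

import numpy as np

def avg_participant_median(offsets_per_participant):
    """Average, across participants, of each participant's median angular offset."""
    return float(np.mean([np.median(o) for o in offsets_per_participant]))

# Hypothetical per-sample angular offsets (degrees) for three participants,
# under a non-slippage-robust baseline and a slippage-robust method.
baseline = [np.array([2.1, 2.4, 3.0]), np.array([1.8, 2.2]), np.array([2.9, 3.3, 2.7])]
robust   = [np.array([1.1, 1.3, 1.5]), np.array([1.0, 1.2]), np.array([1.6, 1.4, 1.7])]

b, r = avg_participant_median(baseline), avg_participant_median(robust)
print(f"relative reduction: {100 * (b - r) / b:.1f}%")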

 