SwePub
Search the SwePub database

Results for the search "WFRF:(Niehorster Diederick)"

  • Results 1-50 of 88
1.
  • Alce, Günter, et al. (authors)
  • Design and Evaluation of Three User Interfaces for Detecting Unmanned Aerial Vehicles Using Virtual Reality
  • 2022
  • In: Virtual Reality and Mixed Reality - 19th EuroXR International Conference, EuroXR 2022, Proceedings. - Cham : Springer International Publishing. - 0302-9743 .- 1611-3349. - 9783031162336 ; 13484 LNCS, pp. 36-49
  • Conference paper (peer-reviewed) abstract
    • Regulations restrict UAVs to fly only within direct view of the pilot, limiting their ability to support critical societal functions. One potential way to move beyond this limitation is to place a 360-degree camera on the vehicle and use its feed to provide operators with a view equivalent to being on the vehicle. This necessitates a cockpit user interface (UI) that, among other things, highlights flying objects so that collisions with them can be avoided. In this paper, virtual reality (VR) was used to build a prototype of such a system and to evaluate three UIs designed to facilitate detecting aerial objects. Conclusions are drawn regarding which UI features support detection performance and a positive user experience.
  •  
2.
  • Alce, Günter, et al. (authors)
  • Using augmented reality to train flow patterns for pilot students - An explorative study
  • 2020
  • In: Augmented Reality, Virtual Reality, and Computer Graphics - 7th International Conference, AVR 2020, Proceedings. - Cham : Springer International Publishing. - 1611-3349 .- 0302-9743. - 9783030584641 - 9783030584658 ; 12242, pp. 215-231
  • Conference paper (peer-reviewed) abstract
    • Today, just as in the early days of flying, much emphasis is placed on pilot students’ flight training before they fly a real commercial aircraft. Early in their education, pilot students must, for example, learn different operating procedures known as flow patterns using very basic tools, such as exhaustive manuals and a so-called paper tiger. In this paper, we present a first design of a virtual and interactive paper tiger using augmented reality (AR) and evaluate the developed prototype. We evaluated the prototype with twenty-seven pilot students at the Lund University School of Aviation (LUSA) to explore the possibilities and technical advantages that AR can offer, in particular for the procedure performed before takeoff. The prototype yielded positive results for perceived workload and for remembering the flow pattern. The main contribution of this paper is to provide knowledge about the value of using AR for training pilot students.
  •  
3.
  • Bertilsson, Johan, et al. (authors)
  • Stress Levels Escalate When Repeatedly Performing Tasks Involving Threats
  • 2019
  • In: Frontiers in Psychology. - : Frontiers Media SA. - 1664-1078. ; 10, p. 1562
  • Journal article (peer-reviewed) abstract
    • Police work may include performing repeated tasks under the influence of psychological stress, which can affect perceptual, cognitive and motor performance. However, it is largely unknown how repeatedly performing stressful tasks affects police officers physically in terms of heart rate and pupil diameter, two biomarkers commonly used to assess psychological stress. Heart rate and pupil diameter were measured in 12 male police officers performing a sequence of four stressful tasks, each lasting between 20 and 130 s. The participants were first placed in a dimly illuminated anteroom before being allowed to enter a brightly lit room where a scenario was played out. After each task, the participants returned to the anteroom for about 30 s before performing the next task. Performing a repeated sequence of stressful tasks caused a significant increase in heart rate (p = 0.005). The heart rate began to increase even before entering the scenario room and was significantly higher just after starting the task than just before (p < 0.001). This pattern was more marked during the first tasks (p < 0.001). Issuance of a verbal "abort" command terminating the tasks led to a significant increase in heart rate (p = 0.002), especially during the first tasks (p = 0.002). The pupil diameter changed significantly during the repeated tasks during all phases, but in a complex pattern in which it reached a minimum during task 2 followed by an increase during tasks 3 and 4 (p ≤ 0.020). During the initial tasks, the pupil size increased significantly (p = 0.014). The results suggest that being repeatedly exposed to stressful tasks can in itself produce an escalation of psychological stress, even prior to exposure to the task. However, the characteristics of both the heart rate and pupil diameter responses were complex; the findings thus highlight the importance of studying the effects and dynamics of different stress-generating factors. Monitoring heart rate was found useful for screening for stress responses, and thus for indicating if and when rotation of deployed personnel is necessary to avoid sustained high stress exposure.
  •  
4.
  • Bertilsson, Johan, et al. (authors)
  • Towards systematic and objective evaluation of police officer performance in stressful situations
  • 2020
  • In: Police Practice and Research. - : Informa UK Limited. - 1561-4263 .- 1477-271X. ; 21:6, pp. 655-669
  • Journal article (peer-reviewed) abstract
    • To ensure a continuous high standard of police units, it is critical to recruit people who perform well in stressful situations. Today, this selection process includes performing a large series of tests, which still may not objectively reveal a person’s capacity to handle a life-threatening situation when subjected to high levels of stress. To obtain more systematic and objective data, 12 police officers were exposed to six scenarios with varying levels of threat while their heart rate and pupil size were monitored. The scenarios were filmed and six expert evaluators assessed the performance of the police officers according to seven predefined criteria. Four of the scenarios included addressing a moderate threat level task and the scenarios were executed in a rapid sequence. Two further scenarios included a familiar firearm drill performed during high and low threat situations. The results showed that there was a large agreement between the experts in how they judged the performance of the police officers (p < 0.001). Performance increased significantly over tasks in four of the seven evaluation criteria (p ≤ 0.037). There was also a significant effect of pupil size (p = 0.004), but not heart rate, when comparing the different sequential scenarios. Moreover, a high level of threat considerably impaired the motor performance of the police officers during the firearms drill (p = 0.002). Finally, the pupil seemed to systematically dilate more when a threat appeared immediately than with a delay in the scenarios (p = 0.007). We conclude that systematic and quantitative judgments from experts provide valuable and reliable information about the performance of participants in realistic and stressful policing scenarios. Furthermore, objective physiological measures of heart rate and pupil size may help to explain and understand why performance sometimes deteriorates.
  •  
5.
  • Byrne, Sean Anthony, et al. (authors)
  • From Lenses to Living Rooms : A Policy Brief on Eye Tracking in XR Before the Impending Boom
  • 2024
  • In: 2024 IEEE International Conference on Artificial Intelligence and eXtended and Virtual Reality, AIxVR 2024. - 9798350372021 ; pp. 90-96
  • Conference paper (peer-reviewed) abstract
    • As tech giants such as Apple and Meta invest heavily in Virtual and Augmented Reality (VR/AR) technologies, often collectively termed Extended Reality (XR) devices, a significant societal concern emerges: the use of eye-tracking technology within these devices. Gaze data holds immense value, revealing insights into user attention, health, and cognitive states. This raises substantial concerns over privacy and fairness, with potential risks of targeted ads, unauthorized surveillance, and data re-purposing. As eye tracking transitions from the lab to the broader public, this paper underscores these pivotal issues in a manner digestible to a general audience. To this end, we first outline the eye-tracking data collection process and its potential for user insights. Second, we introduce perspectives from the domain of privacy, emphasizing its significance as a pivotal measure to guard against the improper use of eye-tracking data. Third, we provide a set of guidelines created by researchers actively working within this space. These recommendations are designed to guide policymakers and the general public toward establishing informed, equitable, and privacy-centric standards surrounding these devices.
  •  
6.
  • Byrne, Sean Anthony, et al. (authors)
  • Precise localization of corneal reflections in eye images using deep learning trained on synthetic data
  • In: Behavior Research Methods. - 1554-3528.
  • Journal article (peer-reviewed) abstract
    • We present a deep learning method for accurately localizing the center of a single corneal reflection (CR) in an eye image. Unlike previous approaches, we use a convolutional neural network (CNN) that was trained solely on synthetic data. Using only synthetic data has the benefit of completely sidestepping the time-consuming manual annotation that is required for supervised training on real eye images. To systematically evaluate the accuracy of our method, we first tested it on images with synthetic CRs placed on different backgrounds and embedded in varying levels of noise. Second, we tested the method on two datasets consisting of high-quality videos captured from real eyes. Our method outperformed state-of-the-art algorithmic methods on real eye images, with a 3-41.5% reduction in terms of spatial precision across datasets, and performed on par with the state of the art on synthetic images in terms of spatial accuracy. We conclude that our method provides precise CR center localization and offers a solution to the data availability problem, one of the important roadblocks in the development of deep learning models for gaze estimation. Due to its superior CR center localization and ease of application, our method has the potential to improve the accuracy and precision of CR-based eye trackers.
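The synthetic-training strategy described in this abstract relies on rendering images in which the true CR center is known by construction, so no manual annotation is needed. A toy illustration of such a generator is sketched below; the function name, parameters, and rendering model are our own assumptions, not the authors' pipeline:

```python
import numpy as np

def synthetic_cr_image(size=64, cx=30.3, cy=25.7, sigma=2.0,
                       noise_sd=0.05, seed=None):
    """Render a toy eye-camera image with one synthetic corneal
    reflection: a small Gaussian bright spot at sub-pixel position
    (cx, cy) on a noisy background. The known center (cx, cy) can
    serve directly as a training label."""
    rng = np.random.default_rng(seed)
    yy, xx = np.mgrid[0:size, 0:size]
    # Gaussian bright spot modeling the CR, scaled to avoid clipping
    cr = np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / (2 * sigma ** 2))
    img = 0.2 + 0.7 * cr + rng.normal(0.0, noise_sd, (size, size))
    return np.clip(img, 0.0, 1.0), (cx, cy)
```

Because the label is the exact sub-pixel center used for rendering, a network trained on many such images can in principle learn sub-pixel localization without any hand-labeled real data.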
  •  
8.
  • De Kloe, Yentl J.R., et al. (authors)
  • Replacing eye trackers in ongoing studies : A comparison of eye‐tracking data quality between the Tobii Pro TX300 and the Tobii Pro Spectrum
  • 2021
  • In: Infancy. - : Wiley. - 1532-7078 .- 1525-0008.
  • Journal article (peer-reviewed) abstract
    • The Tobii Pro TX300 is a popular eye tracker in developmental eye-tracking research, yet it is no longer manufactured. If a TX300 breaks down, it may have to be replaced. The data quality of the replacement eye tracker may differ from that of the TX300, which may affect the experimental outcome measures. This is problematic for longitudinal and multi-site studies, and for researchers replacing eye trackers between studies. We, therefore, ask how the TX300 and its successor, the Tobii Pro Spectrum, compare in terms of eye-tracking data quality. Data quality—operationalized through precision, accuracy, and data loss—was compared between eye trackers for three age groups (around 5-months, 10-months, and 3-years). Precision was better for all gaze position signals obtained with the Spectrum in comparison to the TX300. Accuracy of the Spectrum was higher for the 5-month-old and 10-month-old children. For the three-year-old children, accuracy was similar across both eye trackers. Gaze position signals from the Spectrum exhibited lower proportions of data loss, and the duration of the data loss periods tended to be shorter. In conclusion, the Spectrum produces gaze position signals with higher data quality, especially for the younger infants. Implications for data analysis are discussed.
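Precision, accuracy, and data loss, as used in this abstract, are standard operationalizations of eye-tracking data quality. The sketch below shows one common way to compute them (RMS sample-to-sample precision, mean angular offset for accuracy); the function name and data layout are illustrative assumptions, not code from the paper:

```python
import numpy as np

def data_quality(gaze_x, gaze_y, target_x, target_y):
    """Compute common eye-tracking data-quality measures for one
    validation target. Gaze samples are in degrees of visual angle;
    lost samples are NaN."""
    gx = np.asarray(gaze_x, dtype=float)
    gy = np.asarray(gaze_y, dtype=float)
    lost = np.isnan(gx) | np.isnan(gy)
    data_loss = lost.mean()              # proportion of lost samples
    vx, vy = gx[~lost], gy[~lost]
    # Accuracy: mean angular offset between gaze and target position.
    accuracy = np.mean(np.hypot(vx - target_x, vy - target_y))
    # Precision: RMS of sample-to-sample distances (RMS-S2S).
    precision = np.sqrt(np.mean(np.diff(vx) ** 2 + np.diff(vy) ** 2))
    return accuracy, precision, data_loss
```

For a perfectly steady recording offset 1° from the target with one lost sample in four, this returns accuracy 1.0°, precision 0°, and 25% data loss.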
  •  
9.
  • Dunn, Matt J, et al. (authors)
  • Minimal reporting guideline for research involving eye tracking (2023 edition)
  • In: Behavior Research Methods. - 1554-3528.
  • Journal article (peer-reviewed) abstract
    • A guideline is proposed that comprises the minimum items to be reported in research studies involving an eye tracker and human or non-human primate participant(s). This guideline was developed over a 3-year period using a consensus-based process via an open invitation to the international eye tracking community. This guideline will be reviewed at maximum intervals of 4 years.
  •  
10.
  • Dutceac Segesten, Anamaria, et al. (authors)
  • The Cueing Power of Comments on Social Media : How Disagreement in Facebook Comments Affects User Engagement with News
  • 2022
  • In: Information, Communication & Society. - 1369-118X. ; 25:8, pp. 1115-1134
  • Journal article (peer-reviewed) abstract
    • Previous research demonstrates that conflict framing in news articles can influence individuals’ attention, selection, and distribution of news. However, no study has examined whether the valence of social media comment fields can trigger similar effects for news engagement on Facebook. In this mixed-methods study, we combine eye tracking with surveys, and conduct an experiment in which participants (n = 96) were exposed to 20 Facebook news posts from the Swedish tabloid Aftonbladet. Under each post, we presented participants with a pair of real (but anonymized) Facebook comments that were either in agreement or disagreement with one another. We then examined how this manipulation influenced participants’ visual attention to comment fields, their self-reported likelihood to click on the post to read the full story, and their self-reported likelihood to share the news post to their Facebook network. Our results show that comments in disagreement increased users’ visual attention to comments, decreased their likelihood to share the post, and had no effect on their likelihood to read the news article associated with the post. Thus, the presence of disagreement in comments does cue news engagement on Facebook, but the effect is not uniform across different news engagement behaviors. Moreover, engagement with hard versus soft news topics also varied. Disagreement in comments to Facebook posts about soft news topics (Entertainment, Society, and Sports) increased users’ attention to the comments field. In contrast, comment disagreement for hard news topics (Economy and Politics) reduced users’ attention to the comment field, as well as their self-reported likelihood to read the post.
  •  
11.
  • Ekström, Axel G., et al. (authors)
  • Self-imposed Filter Bubbles : Selective Attention and Exposure in Online Search
  • 2022
  • In: Computers in Human Behavior Reports. - : Elsevier BV. - 2451-9588. ; 7
  • Journal article (peer-reviewed) abstract
    • It is commonly assumed that algorithmic curation of search results creates filter bubbles, where users’ beliefs are continually reinforced and opposing views are suppressed. However, empirical evidence has failed to support this hypothesis. Instead, it has been suggested that filter bubbles may result from individuals engaging selectively with information in search engine results pages. However, this “self-imposed filter bubble hypothesis” has remained empirically untested. In this study, we find support for the hypothesis using eye-tracking technology and link selection data. We presented partisan participants (n = 48) with sets of simulated Google Search results, controlling for the ideological leaning of each link. Participants spent more time viewing own-side links than other links (p = .037). In our sample, participants who identified as right-wing exhibited a greater such bias than those who identified as left-wing (p < .001). In addition, we found that both liberals and conservatives tended to select own-side links (p < .001). Finally, there was a significant effect of trust, such that links associated with less trusted sources were attended less and selected less often by liberals and conservatives alike (p < .001). Our study challenges the efficacy of policies that aim to combat filter bubbles by presenting users with an ideologically diverse set of search results.
  •  
13.
  • Emhardt, Selina N., et al. (authors)
  • What is my teacher talking about? Effects of displaying the teacher’s gaze and mouse cursor cues in video lectures on students’ learning
  • 2022
  • In: Journal of Cognitive Psychology. - : Informa UK Limited. - 2044-5911 .- 2044-592X. ; 34:7, pp. 846-864
  • Journal article (peer-reviewed) abstract
    • Eye movement modelling examples (EMME) are instructional videos that display a teacher’s eye movements as “gaze cursor” (e.g. a moving dot) superimposed on the learning task. This study investigated if previous findings on the beneficial effects of EMME would extend to online lecture videos and compared the effects of displaying the teacher’s gaze cursor with displaying the more traditional mouse cursor as a tool to guide learners’ attention. Novices (N = 124) studied a pre-recorded video lecture on how to model business processes in a 2 (mouse cursor absent/present) × 2 (gaze cursor absent/present) between-subjects design. Unexpectedly, we did not find significant effects of the presence of gaze or mouse cursors on mental effort and learning. However, participants who watched videos with the gaze cursor found it easier to follow the teacher. Overall, participants responded positively to the gaze cursor, especially when the mouse cursor was not displayed in the video.
  •  
14.
  • Fransson, Per-Anders, et al. (authors)
  • Exploring the Effects of Deep Brain Stimulation and Vision on Tremor in Parkinson's Disease : Benefits from Objective Methods
  • 2020
  • In: Journal of NeuroEngineering and Rehabilitation. - : Springer Science and Business Media LLC. - 1743-0003. ; 17:1
  • Journal article (peer-reviewed) abstract
    • BACKGROUND: Tremor is a cardinal symptom of Parkinson's disease (PD) that may cause severe disability. As such, objective methods to determine the exact characteristics of the tremor may improve the evaluation of therapy. This methodology study aims to validate the utility of two objective technical methods of recording Parkinsonian tremor and to evaluate their ability to determine the effects of Deep Brain Stimulation (DBS) of the subthalamic nucleus and of vision. METHODS: We studied 10 patients with idiopathic PD who were responsive to L-Dopa and had more than 1 year of bilateral subthalamic nucleus stimulation. The patients did not have to display visible tremor to be included in the study. Tremor was recorded with two objective methods: a force platform and a three-dimensional (3D) motion capture system that tracked movements in four key proximal sections of the body (knee, hip, shoulder and head). The patients were assessed after an overnight withdrawal of anti-PD medications with DBS ON and OFF, and with eyes open and closed during unperturbed and perturbed stance with randomized calf vibration, using a randomized test order design. RESULTS: Tremor was detected with the Unified Parkinson's Disease Rating Scale (UPDRS) in 6 of 10 patients, but only distally (hands and feet) with DBS OFF. With the force platform and the 3D motion capture system, tremor was detected in 6 of 10 and 7 of 10 patients respectively, mostly with DBS OFF but also with DBS ON in some patients. The 3D motion capture system revealed that more than one body section was usually affected by tremor and that the tremor amplitude was non-uniform, but the frequency almost identical, across sites. DBS reduced tremor amplitude non-uniformly across the body. Visual input mostly reduced tremor amplitude with DBS ON. CONCLUSIONS: Technical recording methods offer objective and sensitive detection of tremor and provide detailed characteristics such as peak amplitude, frequency and distribution pattern, and thus information that can guide the optimization of treatments. Both methods detected the effects of DBS and visual input, but the 3D motion system was more versatile in that it could detail the presence and properties of tremor at individual body sections.
  •  
15.
  • Hallberg, Andreas, 1985, et al. (authors)
  • Parsing written language with non-standard grammar : An eye-tracking study of case marking in Arabic
  • 2020
  • In: Reading and Writing. - : Springer Science and Business Media LLC. - 0922-4777 .- 1573-0905.
  • Journal article (peer-reviewed) abstract
    • In Arabic, morphologically marked case is a feature exclusive to the variety of Standard Arabic, with no parallel in the spoken varieties, and it is orthographically marked only on some word classes in specific grammatical situations. In this study we test the hypothesis that readers of Arabic do not parse sentences for case and that orthographically marked case can therefore be removed with no effect on reading. Twenty-nine participants read sentences in which one of the two most frequent types of orthographically marked case was either retained or omitted, while their eye movements were monitored. The removal of case marking from subjects in the sound masculine plural declension (changing the suffix ‑ūn ـون to ‑īn ـين) had no negative effect on gaze duration, regressions out, or go-past time. The removal of case marking from direct objects in the triptote declension (omitting the suffix -an ـاً) did, however, result in an increase in these measures. These results indicate that only some forms of case marking are required in the grammar used by readers for parsing written text.
  •  
16.
  • Hessels, Roy S, et al. (authors)
  • Eye contact avoidance in crowds : A large wearable eye-tracking study
  • 2022
  • In: Attention, Perception & Psychophysics. - : Springer Science and Business Media LLC. - 1943-3921 .- 1943-393X. ; 84:8, pp. 2623-2640
  • Journal article (peer-reviewed) abstract
    • Eye contact is essential for human interactions. We investigated whether humans are able to avoid eye contact while navigating crowds. At a science festival, we fitted 62 participants with a wearable eye tracker and instructed them to walk a route. Half of the participants were further instructed to avoid eye contact. We report that humans can flexibly allocate their gaze while navigating crowds and avoid eye contact primarily by orienting their head and eyes towards the floor. We discuss implications for crowd navigation and gaze behavior. In addition, we address a number of issues encountered in such field studies with regard to data quality, control of the environment, and participant adherence to instructions. We stress that methodological innovation and scientific progress are strongly interrelated.
  •  
18.
  • Hessels, Roy S., et al. (authors)
  • Is the eye-movement field confused about fixations and saccades? : A survey among 124 researchers
  • 2018
  • In: Royal Society Open Science. - : The Royal Society. - 2054-5703. ; 5:8, pp. 1-23
  • Journal article (peer-reviewed) abstract
    • Eye movements have been extensively studied in a wide range of research fields. While new methods such as mobile eye tracking and eye tracking in virtual/augmented realities are emerging quickly, the eye-movement terminology has scarcely been revised. We assert that this may cause confusion about two of the main concepts: fixations and saccades. In this study, we assessed the definitions of fixations and saccades held in the eye-movement field, by surveying 124 eye-movement researchers. These eye-movement researchers held a variety of definitions of fixations and saccades, of which the breadth seems even wider than what is reported in the literature. Moreover, these definitions did not seem to be related to researcher background or experience. We urge researchers to make their definitions more explicit by specifying all the relevant components of the eye movement under investigation: (i) the oculomotor component: e.g. whether the eye moves slow or fast; (ii) the functional component: what purposes does the eye movement (or lack thereof) serve; (iii) the coordinate system used: relative to what does the eye move; (iv) the computational definition: how is the event represented in the eye-tracker signal. This should enable eye-movement researchers from different fields to have a discussion without misunderstandings.
  •  
20.
  • Hessels, Roy S., et al. (authors)
  • Noise-robust fixation detection in eye movement data : Identification by two-means clustering (I2MC)
  • 2017
  • In: Behavior Research Methods. - : Springer Science and Business Media LLC. - 1554-3528. ; 49:5, pp. 1802-1823
  • Journal article (peer-reviewed) abstract
    • Eye-tracking research in infants and older children has gained a lot of momentum over the last decades. Although eye-tracking research in these participant groups has become easier with the advance of the remote eye-tracker, this often comes at the cost of poorer data quality than in research with well-trained adults (Hessels, Andersson, Hooge, Nyström, & Kemner Infancy, 20, 601-633, 2015; Wass, Forssman, & Leppänen Infancy, 19, 427-460, 2014). Current fixation detection algorithms are not built for data from infants and young children. As a result, some researchers have even turned to hand correction of fixation detections (Saez de Urabain, Johnson, & Smith Behavior Research Methods, 47, 53-72, 2015). Here we introduce a fixation detection algorithm-identification by two-means clustering (I2MC)-built specifically for data across a wide range of noise levels and when periods of data loss may occur. We evaluated the I2MC algorithm against seven state-of-the-art event detection algorithms, and report that the I2MC algorithm's output is the most robust to high noise and data loss levels. The algorithm is automatic, works offline, and is suitable for eye-tracking data recorded with remote or tower-mounted eye-trackers using static stimuli. In addition to application of the I2MC algorithm in eye-tracking research with infants, school children, and certain patient groups, the I2MC algorithm also may be useful when the noise and data loss levels are markedly different between trials, participants, or time points (e.g., longitudinal research).
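The two-means clustering idea behind I2MC can be illustrated with a much-simplified sketch: slide a window over the gaze signal, split each window's samples into two clusters, and give samples where cluster membership switches a higher weight, so that peaks in the averaged weight suggest saccades and the intervals between peaks suggest fixations. The window/step sizes and the crude k-means below are invented for illustration; the published algorithm adds interpolation, multiple downsampled scales, and merging steps not shown here:

```python
import numpy as np

def two_means_weights(x, y, window=40, step=6):
    """Simplified two-means clustering weight over a gaze signal.
    For each window, cluster the (x, y) samples into two groups and
    accumulate weight at samples where the cluster label switches."""
    pts = np.column_stack([x, y]).astype(float)
    weights = np.zeros(len(pts))
    counts = np.zeros(len(pts))
    for start in range(0, len(pts) - window + 1, step):
        seg = pts[start:start + window]
        # crude 2-means: initialize with first/last sample, iterate
        c = np.array([seg[0], seg[-1]])
        for _ in range(10):
            lab = np.argmin(((seg[:, None] - c[None]) ** 2).sum(-1), axis=1)
            for k in (0, 1):
                if np.any(lab == k):
                    c[k] = seg[lab == k].mean(0)
        switches = np.flatnonzero(np.diff(lab) != 0)
        if len(switches):
            # spread one unit of weight over this window's switches
            weights[start + switches + 1] += 1.0 / len(switches)
        counts[start:start + window] += 1
    return weights / np.maximum(counts, 1)
```

On a signal containing a single position jump (a saccade between two fixations), the averaged weight peaks at the jump regardless of moderate noise, which is what makes the approach attractive for noisy infant data.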
  •  
21.
  • Hessels, Roy S., et al. (authors)
  • Task-related gaze behaviour in face-to-face dyadic collaboration : Toward an interactive theory?
  • 2023
  • In: Visual Cognition. - 1350-6285. ; 31:4, pp. 291-313
  • Journal article (peer-reviewed) abstract
    • Visual routines theory posits that vision is critical for guiding sequential actions in the world. Most studies on the link between vision and sequential action have considered individual agents, while substantial human behaviour is characterized by multi-party interaction. Here, the actions of each person may affect what the other can subsequently do. We investigated task execution and gaze allocation of 19 dyads completing a Duplo-model copying task together, while wearing the Pupil Invisible eye tracker. We varied whether all blocks were visible to both participants, and whether verbal communication was allowed. For models in which not all blocks were visible, participants seemed to coordinate their gaze: The distance between the participants' gaze positions was smaller and dyads looked longer at the model concurrently than for models in which all blocks were visible. This was most pronounced when verbal communication was allowed. We conclude that the way the collaborative task was executed depended both on whether visual information was available to both persons, and how communication took place. Modelling task structure and gaze allocation for human-human and human-robot collaboration thus requires more than the observable behaviour of either individual. We discuss whether an interactive visual routines theory ought to be pursued.
  •  
23.
  • Holmqvist, Kenneth, et al. (authors)
  • Data quality in eye trackers: Signal resolution
  • 2018
  • In: Journal of Eye Movement Research. - : University of Bern. - 1995-8692.
  • Conference paper (peer-reviewed) abstract
    • This document contains the abstracts for the 2018 Scandinavian Workshop on Applied Eye Tracking (SWAET 2018), which was held at Copenhagen Business School, Denmark, 23 to 24 August, 2018.
  •  
24.
  • Holmqvist, Kenneth, et al. (authors)
  • Eye tracking : empirical foundations for a minimal reporting guideline
  • 2023
  • In: Behavior Research Methods. - : Springer Science and Business Media LLC. - 1554-3528. ; 55:1, pp. 364-416
  • Journal article (peer-reviewed) abstract
    • In this paper, we present a review of how the various aspects of any study using an eye tracker (such as the instrument, methodology, environment, participant, etc.) affect the quality of the recorded eye-tracking data and the obtained eye-movement and gaze measures. We take this review to represent the empirical foundation for reporting guidelines of any study involving an eye tracker. We compare this empirical foundation to five existing reporting guidelines and to a database of 207 published eye-tracking studies. We find that reporting guidelines vary substantially and do not match with actual reporting practices. We end by deriving a minimal, flexible reporting guideline based on empirical research (Section "An empirically based minimal reporting guideline").
  •  
25.
  • Hooge, Ignace, et al. (authors)
  • Is human classification by experienced untrained observers a gold standard in fixation detection?
  • 2018
  • In: Behavior Research Methods. - : Springer Science and Business Media LLC. - 1554-3528. ; 50:5, pp. 1864-1881
  • Journal article (peer-reviewed) abstract
    • Despite early reports and the contemporary consensus on microsaccades as purely binocular phenomena, recent work has proposed not only the existence of monocular microsaccades, but also that they serve functional purposes. We take a critical look at the detection of monocular microsaccades from a signal perspective, using raw data and a state-of-the-art, video-based eye tracker. In agreement with previous work, monocular detections were present in all participants using a standard microsaccade detection algorithm. However, a closer look at the raw data invalidates the vast majority of monocular detections. These results again raise the question of the existence of monocular microsaccades, as well as the need for improved methods to study small eye movements recorded with video-based eye trackers.
  •  
26.
  • Hooge, Ignace T. C., et al. (authors)
  • Fixation classification: how to merge and select fixation candidates
  • 2022
  • In: Behavior Research Methods. - : Springer Science and Business Media LLC. - 1554-3528. ; 54:6, pp. 2765-2776
  • Journal article (peer-reviewed) abstract
    • Eye trackers are applied in many research fields (e.g., cognitive science, medicine, marketing research). To give meaning to the eye-tracking data, researchers have a broad choice of classification methods to extract various behaviors (e.g., saccade, blink, fixation) from the gaze signal. There is extensive literature about the different classification algorithms. Surprisingly, not much is known about the effect of fixation and saccade selection rules that are usually (implicitly) applied. We want to answer the following question: What is the impact of the selection-rule parameters (minimal saccade amplitude and minimal fixation duration) on the distribution of fixation durations? To answer this question, we used eye-tracking data with high and low quality and seven different classification algorithms. We conclude that selection rules play an important role in merging and selecting fixation candidates. For eye-tracking data with good-to-moderate precision (RMSD < 0.5°), the classification algorithm of choice does not matter too much as long as it is sensitive enough and is followed by a rule that selects saccades with amplitudes larger than 1.0° and a rule that selects fixations with duration longer than 60 ms. Because of the importance of selection, researchers should always report whether they performed selection and the values of their parameters.
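A minimal sketch of the merge-and-select rules the abstract recommends: the 1.0-degree and 60-ms thresholds come from the abstract, but the candidate-tuple layout, function name, and merging-by-averaging are our own illustrative assumptions, not the paper's implementation:

```python
import math

def merge_and_select(candidates, min_amplitude=1.0, min_duration=0.060):
    """Post-process fixation candidates: merge consecutive candidates
    separated by a saccade smaller than `min_amplitude` (degrees),
    then discard candidates shorter than `min_duration` (seconds).
    Each candidate is a tuple (start_s, end_s, x_deg, y_deg)."""
    merged = []
    for cand in candidates:
        if merged:
            s, e, x, y = merged[-1]
            # amplitude of the saccade between consecutive candidates
            if math.hypot(cand[2] - x, cand[3] - y) < min_amplitude:
                # merge: extend the previous candidate, average position
                merged[-1] = (s, cand[1],
                              (x + cand[2]) / 2, (y + cand[3]) / 2)
                continue
        merged.append(cand)
    # select: keep only candidates of sufficient duration
    return [c for c in merged if c[1] - c[0] >= min_duration]
```

For example, two candidates 0.3° apart are merged into one fixation, while a 3-sample flicker shorter than 60 ms is dropped, which is why the reported fixation-duration distribution depends on these parameters.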
  •  
27.
  • Hooge, Ignace T C, et al. (authors)
  • How robust are wearable eye trackers to slow and fast head and body movements?
  • 2023
  • In: Behavior Research Methods. - Springer Science and Business Media LLC. - 1554-3528. ; 55:8
  • Journal article (peer-reviewed) abstract
    • How well can modern wearable eye trackers cope with head and body movement? To investigate this question, we asked four participants to stand still, walk, skip, and jump while fixating a static physical target in space. We did this for six different eye trackers. All the eye trackers were capable of recording gaze during the most dynamic episodes (skipping and jumping). The accuracy became worse as movement got wilder. During skipping and jumping, the biggest error was 5.8∘. However, most errors were smaller than 3∘. We discuss the implications of decreased accuracy in the context of different research scenarios.
  •  
28.
  •  
29.
  • Hooge, Ignace T C, et al. (authors)
  • Large eye-head gaze shifts measured with a wearable eye tracker and an industrial camera
  • In: Behavior Research Methods. - 1554-3528.
  • Journal article (peer-reviewed) abstract
    • We built a novel setup to record large gaze shifts (up to 140∘). The setup consists of a wearable eye tracker and a high-speed camera with fiducial marker technology to track the head. We tested our setup by replicating findings from the classic eye-head gaze shift literature. We conclude that our new inexpensive setup is good enough to investigate the dynamics of large eye-head gaze shifts. This novel setup could be used for future research on large eye-head gaze shifts, but also for research on gaze during, e.g., human interaction. We further discuss reference frames and terminology in head-free eye tracking. Despite a transition from head-fixed eye tracking to head-free gaze tracking, researchers still use head-fixed eye movement terminology when discussing world-fixed gaze phenomena. We propose to use more specific terminology for world-fixed phenomena, including gaze fixation, gaze pursuit, and gaze saccade.
  •  
30.
  • Hooge, Ignace T C, et al. (authors)
  • The pupil-size artefact (PSA) across time, viewing direction, and different eye trackers
  • 2021
  • In: Behavior Research Methods. - Springer Science and Business Media LLC. - 1554-3528. ; 53:5, pp. 1986-2006
  • Journal article (peer-reviewed) abstract
    • The pupil size artefact (PSA) is the gaze deviation reported by an eye tracker during pupil size changes if the eye does not rotate. In the present study, we ask three questions: (1) how stable is the PSA over time, (2) does the PSA depend on properties of the eye-tracker setup, and (3) does the PSA depend on the participants' viewing direction? We found that the PSA is very stable over time for periods as long as 1 year, but may differ between participants. When comparing the magnitude of the PSA between eye trackers, we found the magnitude of the obtained PSA to be related to the direction of the eye-tracker-camera axis, suggesting that the angle between the participants' viewing direction and the camera axis affects the PSA. We then investigated the PSA as a function of the participants' viewing direction. The PSA was non-zero for viewing direction 0∘ and depended on the viewing direction. These findings corroborate the suggestion by Choe et al. (Vision Research 118(6755):48-59, 2016) that the PSA can be described by an idiosyncratic and a viewing direction-dependent component. Based on a simulation, we cannot claim that the viewing direction-dependent component of the PSA is caused by the optics of the cornea.
  •  
31.
  • Kadish, David, et al. (authors)
  • Towards Situation Awareness and Attention Guidance in a Multiplayer Environment using Augmented Reality and Carcassonne
  • 2022
  • In: CHI PLAY 2022 - Extended Abstracts of the 2022 Annual Symposium on Computer-Human Interaction in Play. - New York, NY, USA : ACM. - 9781450392112 ; pp. 133-139
  • Conference paper (peer-reviewed) abstract
    • Augmented reality (AR) games are a rich environment for researching and testing computational systems that provide subtle user guidance and training. In particular, computer systems that aim to augment a user's situation awareness benefit from the range of sensors and computing power available in AR headsets. The main focus of this work-in-progress paper is the introduction of the individualized Situation Awareness-based Attention Guidance (SAAG) system, used to increase humans' situation awareness, and of an augmented-reality version of the board game Carcassonne for validation and evaluation of SAAG. Furthermore, we present our initial work in developing the SAAG pipeline, the generation of game state encodings, the development and training of a game AI, and the design of situation modeling and eye-tracking processes.
  •  
32.
  • Kok, Ellen M., et al. (authors)
  • The effects of gaze-display feedback on medical students’ self-monitoring and learning in radiology
  • 2024
  • In: Advances in Health Sciences Education. - 1382-4996.
  • Journal article (peer-reviewed) abstract
    • Self-monitoring is essential for effectively regulating learning, but difficult in visual diagnostic tasks such as radiograph interpretation. Eye-tracking technology can visualize viewing behavior in gaze displays, thereby providing information about visual search and decision-making. We hypothesized that individually adaptive gaze-display feedback improves posttest performance and self-monitoring of medical students who learn to detect nodules in radiographs. We investigated the effects of: (1) Search displays, showing which part of the image was searched by the participant; and (2) Decision displays, showing which parts of the image received prolonged attention in 78 medical students. After a pretest and instruction, participants practiced identifying nodules in 16 cases under search-display, decision-display, or no feedback conditions (n = 26 per condition). A 10-case posttest, without feedback, was administered to assess learning outcomes. After each case, participants provided self-monitoring and confidence judgments. Afterward, participants reported on self-efficacy, perceived competence, feedback use, and perceived usefulness of the feedback. Bayesian analyses showed no benefits of gaze displays for post-test performance, monitoring accuracy (absolute difference between participants’ estimated and their actual test performance), completeness of viewing behavior, self-efficacy, and perceived competence. Participants receiving search-displays reported greater feedback utilization than participants receiving decision-displays, and also found the feedback more useful when the gaze data displayed was precise and accurate. As the completeness of search was not related to posttest performance, search displays might not have been sufficiently informative to improve self-monitoring. Information from decision displays was rarely used to inform self-monitoring. Further research should address if and when gaze displays can support learning.
  •  
33.
  • Kuang, Peng, et al. (authors)
  • Applying Machine Learning to Gaze Data in Software Development: a Mapping Study
  • 2023
  • In: The 11th International Workshop on Eye Movements in Programming.
  • Conference paper (peer-reviewed) abstract
    • Eye tracking has been used as part of software engineering and computer science research for a long time, and during this time new techniques for machine learning (ML) have emerged. Some of those techniques are applicable to the analysis of eye-tracking data, and to some extent have been applied. However, there is no structured summary available on which ML techniques are used for analysis in different types of eye-tracking research studies. In this paper, our objective is to summarize the research literature with respect to the application of ML techniques to gaze data in the field of software engineering. To this end, we have conducted a systematic mapping study, where research articles are identified through a search in academic databases and analyzed qualitatively. After identifying 10 relevant articles, we found that the most common software development activity studied so far with eye-tracking and ML is program comprehension, and Support Vector Machines and Decision Trees are the most commonly used ML techniques. We further report on limitations and challenges reported in the literature and opportunities for future work.
  •  
34.
  • Kuang, Peng, et al. (authors)
  • Toward Gaze-assisted Developer Tools
  • 2023
  • In: Proceedings of the 45th IEEE/ACM International Conference on Software Engineering: New Ideas and Emerging Results (ICSE-NIER). - 9798350300390 ; pp. 49-54
  • Conference paper (peer-reviewed) abstract
    • Many crucial activities in software development are linked to gaze and can potentially benefit from gaze-assisted developer tools. However, despite the maturity of eye trackers and the potential for such tools, we see very few studies of practitioners. Here, we present a systematic mapping study to examine recent developments in the field with a focus on the experimental setup of eye-tracking studies in software engineering research. We identify two gaps regarding studies of practitioners in realistic settings and three challenges in existing experimental setups. We present six recommendations for how to steer the research community toward gaze-assisted developer tools that can benefit practitioners.
  •  
35.
  • Maquiling, Virmarie, et al. (authors)
  • V-ir-Net : A Novel Neural Network for Pupil and Corneal Reflection Detection trained on Simulated Light Distributions
  • 2023
  • In: MobileHCI '23 Companion : Proceedings of the 25th International Conference on Mobile Human-Computer Interaction. - 9781450399241 ; pp. 1-7
  • Conference paper (peer-reviewed) abstract
    • Deep learning has shown promise for gaze estimation in Virtual Reality (VR) and other head-mounted applications, but such models are hard to train due to a lack of available data. Here we introduce a novel method to train neural networks for gaze estimation using synthetic images that model the light distributions captured in a P-CR setup. We tested our model on a dataset of real eye images from a VR setup, achieving 76% accuracy, which is close to the state-of-the-art model that was trained on the dataset itself. The localization error was 1.56 pixels for CRs and 2.02 pixels for the pupil, which is on par with the state of the art. Our approach allowed inference on the whole dataset without sacrificing data for model training. Our method provides a cost-efficient and lightweight training alternative, eliminating the need for hand-labeled data. It offers flexible customization, e.g., adapting to different illuminator configurations, with minimal code changes.
  •  
36.
  •  
37.
  • McCabe, Alan, et al. (authors)
  • Influencing Code Reading Through Beacons: an Eye-Tracking Study
  • 2023
  • In: Proceedings of the 34th Annual Workshop of the Psychology of Programming Interest Group.
  • Conference paper (peer-reviewed) abstract
    • When interacting with other humans, we attempt to develop a shared understanding using various means. One such method is through our eyes: when someone is looking at something, we understand that their attention is focused on that object. In this work, we present the results of an eye-tracking study built upon the Progger tool, in which we used additional code highlighting in an attempt to influence the gaze behaviour of a human programmer, thereby focusing their attention. We found that though it is possible to draw attention towards areas of particular interest to the compiler, this has no apparent effect upon performance when confronted with a bug-finding code comprehension task. We conclude that although this strategy may be of use in the future when attempting to humanise the process of programming, further research is required to establish the efficacy of such interventions.
  •  
38.
  • Niehorster, Diederick C., et al. (authors)
  • Characterizing gaze position signals and synthesizing noise during fixations in eye-tracking data
  • 2020
  • In: Behavior Research Methods. - Springer Science and Business Media LLC. - 1554-351X .- 1554-3528. ; 52:6, pp. 2515-2534
  • Journal article (peer-reviewed) abstract
    • The magnitude of variation in the gaze position signals recorded by an eye tracker, also known as its precision, is an important aspect of an eye tracker’s data quality. However, data quality of eye-tracking signals is still poorly understood. In this paper, we therefore investigate the following: (1) How do the various available measures characterizing eye-tracking data during fixation relate to each other? (2) How are they influenced by signal type? (3) What type of noise should be used to augment eye-tracking data when evaluating eye-movement analysis methods? To support our analysis, this paper presents new measures to characterize signal type and signal magnitude based on RMS-S2S and STD, two established measures of precision. Simulations are performed to investigate how each of these measures depends on the number of gaze position samples over which they are calculated, and to reveal how RMS-S2S and STD relate to each other and to measures characterizing the temporal spectrum composition of the recorded gaze position signal. Further empirical investigations were performed using gaze position data recorded with five eye trackers from human and artificial eyes. We found that although the examined eye trackers produce gaze position signals with different characteristics, the relations between precision measures derived from simulations are borne out by the data. We furthermore conclude that data with a range of signal type values should be used to assess the robustness of eye-movement analysis methods. We present a method for generating artificial eye-tracker noise of any signal type and magnitude.
  •  
39.
  • Niehorster, Diederick C, et al. (authors)
  • Concurrent manual tracking enhances pursuit eye movements
  • 2015
  • In: Journal of Eye Movement Research. - University of Bern. - 1995-8692. ; 8:4, p. 31
  • Conference paper (peer-reviewed) abstract
    • This document contains all abstracts of the 18th European Conference on Eye Movements, August 16-21, 2015, in Vienna, Austria.
  •  
40.
  •  
41.
  •  
42.
  • Niehorster, Diederick C, et al. (authors)
  • GlassesValidator : A data quality tool for eye tracking glasses
  • In: Behavior Research Methods. - 1554-3528.
  • Journal article (peer-reviewed) abstract
    • According to the proposal for a minimum reporting guideline for an eye tracking study by Holmqvist et al. (2022), the accuracy (in degrees) of eye tracking data should be reported. Currently, there is no easy way to determine accuracy for wearable eye tracking recordings. To enable determining the accuracy quickly and easily, we have produced a simple validation procedure using a printable poster and accompanying Python software. We tested the poster and procedure with 61 participants using one wearable eye tracker. In addition, the software was tested with six different wearable eye trackers. We found that the validation procedure can be administered within a minute per participant and provides measures of accuracy and precision. Calculating the eye-tracking data quality measures can be done offline on a simple computer and requires no advanced computer skills.
  •  
43.
  • Niehorster, Diederick C., et al. (authors)
  • GlassesViewer : Open-source software for viewing and analyzing data from the Tobii Pro Glasses 2 eye tracker
  • 2020
  • In: Behavior Research Methods. - Springer Science and Business Media LLC. - 1554-3528.
  • Journal article (peer-reviewed) abstract
    • We present GlassesViewer, open-source software for viewing and analyzing eye-tracking data of the Tobii Pro Glasses 2 head-mounted eye tracker as well as the scene and eye videos and other data streams (pupil size, gyroscope, accelerometer, and TTL input) that this headset can record. The software provides the following functionality written in MATLAB: (1) a graphical interface for navigating the study- and recording structure produced by the Tobii Glasses 2; (2) functionality to unpack, parse, and synchronize the various data and video streams comprising a Glasses 2 recording; and (3) a graphical interface for viewing the Glasses 2’s gaze direction, pupil size, gyroscope and accelerometer time-series data, along with the recorded scene and eye camera videos. In this latter interface, segments of data can furthermore be labeled through user-provided event classification algorithms or by means of manual annotation. Lastly, the toolbox provides integration with the GazeCode tool by Benjamins et al. (2018), enabling a completely open-source workflow for analyzing Tobii Pro Glasses 2 recordings.
  •  
44.
  • Niehorster, Diederick C, et al. (authors)
  • Is apparent fixational drift in eye-tracking data due to filters or eyeball rotation?
  • 2021
  • In: Behavior Research Methods. - Springer Science and Business Media LLC. - 1554-3528. ; 53:1, pp. 311-324
  • Journal article (peer-reviewed) abstract
    • Eye trackers are sometimes used to study the miniature eye movements such as drift that occur while observers fixate a static location on a screen. Specifically, analysis of such eye-tracking data can be performed by examining the temporal spectrum composition of the recorded gaze position signal, making it possible to assess its color. However, not only rotations of the eyeball but also filters in the eye tracker may affect the signal's spectral color. Here, we therefore ask whether colored, as opposed to white, signal dynamics in eye-tracking recordings reflect fixational eye movements, or whether they are instead largely due to filters. We recorded gaze position data with five eye trackers from four pairs of human eyes performing fixation sequences, and also from artificial eyes. We examined the spectral color of the gaze position signals produced by the eye trackers, both with their filters switched on, and for unfiltered data. We found that while filtered data recorded from both human and artificial eyes were colored for all eye trackers, for most eye trackers the signal was white when examining both unfiltered human and unfiltered artificial eye data. These results suggest that color in the eye-movement recordings was due to filters for all eye trackers except the most precise eye tracker, where it may partly reflect fixational eye movements. As such, researchers studying fixational eye movements should be careful to examine the properties of the filters in their eye tracker to ensure they are studying eyeball rotation and not filter properties.
  •  
45.
  • Niehorster, Diederick C, et al. (authors)
  • Microsaccade detection using pupil and corneal reflection signals
  • 2018
  • In: Journal of Eye Movement Research. - University of Bern. - 1995-8692.
  • Conference paper (peer-reviewed) abstract
    • This document contains the abstracts for the 2018 Scandinavian Workshop on Applied Eye Tracking (SWAET 2018), which was held at Copenhagen Business School, Denmark, 23-24 August 2018.
  •  
46.
  • Niehorster, Diederick C, et al. (authors)
  • Microsaccade detection using pupil and corneal reflection signals
  • 2018
  • In: Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications (ETRA '18). - New York, NY, USA : ACM. - 9781450357067
  • Conference paper (peer-reviewed) abstract
    • In contemporary research, microsaccade detection is typically performed using the calibrated gaze-velocity signal acquired from a video-based eye tracker. To generate this signal, the pupil and corneal reflection (CR) signals are subtracted from each other and a differentiation filter is applied, both of which may prevent small microsaccades from being detected due to signal distortion and noise amplification. We propose a new algorithm where microsaccades are detected directly from the uncalibrated pupil and CR signals. It is based on detrending followed by windowed correlation between the pupil and CR signals. The proposed algorithm outperforms the most commonly used algorithm in the field (Engbert & Kliegl, 2003), in particular for small-amplitude microsaccades that are difficult to see in the velocity signal even with the naked eye. We argue that it is advantageous to consider the most basic output of the eye tracker, i.e., the pupil and CR signals, when detecting small microsaccades.
  •  
47.
  • Niehorster, Diederick C., et al. (authors)
  • No evidence of conditioning of pupillary constriction despite overtraining
  • 2022
  • In: PeerJ. - PeerJ. - 2167-8359. ; 10
  • Journal article (peer-reviewed) abstract
    • Eyeblink conditioning is the most popular paradigm for studying classical conditioning in humans. But the fact that eyelids are under voluntary control means it is ultimately impossible to ascertain whether a blink response is ‘conditioned’ or a timed ‘voluntary’ blink response. In contrast, the pupillary response is an autonomic response, not under voluntary control. By conditioning the pupillary response, one might avoid potential volition-related confounds. Several attempts have been made to condition the pupillary constriction and dilation responses, with the earliest published attempts dating back to the beginning of the 20th century. While a few early studies reported successful conditioning of pupillary constriction, later studies have failed to replicate this. The apparatus for recording pupil size, the type of stimuli used and the interval between the stimuli has varied in previous attempts—which may explain the inconsistent results. Moreover, measuring the pupil size used to be cumbersome compared with today when an eyetracker can continuously measure pupil size non-invasively. Here we used an eyetracker to test whether it is possible to condition the autonomic pupillary constriction response by pairing a tone (CS) and a light (US) with a 1s CS-US interval. Unlike in previous studies, our subjects went through multiple training sessions to ensure that any potential lack of conditioning would not be due to too little training. A total of 10 participants went through 2–12 conditioning sessions, each lasting approximately 20 min. One training session consisted of 75 paired, tone + light, trials and 25 randomly interspersed CS alone trials. The eyetracker (Tobii Pro Nano), continuously measured participants’ pupil size. To test statistically whether conditioning of the pupillary response occurred, we compared the pupil size after the tone on the first session and the last session. The results showed a complete lack of evidence of conditioning. Though the pupil size varied slightly between participants, the size did not change as a result of the training—irrespective of the number of training sessions. The data replicate previous findings that pupillary constriction does not show conditioning. We conclude that it is not possible to condition pupillary constriction—at least not by pairing a tone and a light. One hypothesis is that when pupillary conditioning has been observed in previous studies, it has been mediated by conditioning of an emotional response.
  •  
48.
  • Niehorster, Diederick C (author)
  • Optic Flow : A History
  • 2021
  • In: i-Perception. - SAGE Publications. - 2041-6695. ; 12:6
  • Journal article (peer-reviewed) abstract
    • The concept of optic flow, a global pattern of visual motion that is both caused by and signals self-motion, is canonically ascribed to James Gibson's 1950 book "The Perception of the Visual World." There have, however, been several other developments of this concept, chiefly by Gwilym Grindley and Edward Calvert. Based on rarely referenced scientific literature and archival research, this article describes the development of the concept of optic flow by the aforementioned authors and several others. The article furthermore presents the available evidence for interactions between these authors, focusing on whether parts of Gibson's proposal were derived from the work of Grindley or Calvert. While Grindley's work may have made Gibson aware of the geometrical facts of optic flow, Gibson's work is not derivative of Grindley's. It is furthermore shown that Gibson only learned of Calvert's work in 1956, almost a decade after Gibson first published his proposal. In conclusion, the development of the concept of optic flow presents an intriguing example of convergent thought in the progress of science.
  •  
49.
  • Niehorster, Diederick C, et al. (authors)
  • Searching with and against each other
  • 2017
  • In: Journal of Eye Movement Research. - 1995-8692. ; 10:6, p. 146
  • Conference paper (peer-reviewed)
  •  
50.
  • Niehorster, Diederick C, et al. (authors)
  • Searching with and against each other : Spatiotemporal coordination of visual search behavior in collaborative and competitive settings
  • 2019
  • In: Attention, Perception & Psychophysics. - Springer Science and Business Media LLC. - 1943-3921 .- 1943-393X. ; 81:3, pp. 666-683
  • Journal article (peer-reviewed) abstract
    • Although in real life people frequently perform visual search together, in lab experiments this social dimension is typically left out. Here, we investigate individual, collaborative and competitive visual search with visualization of search partners' gaze. Participants were instructed to search a grid of Gabor patches while being eye tracked. For collaboration and competition, searchers were shown in real time at which element the paired searcher was looking. To promote collaboration or competition, points were rewarded or deducted for correct or incorrect answers. Early in collaboration trials, searchers rarely fixated the same elements. Reaction times of couples were roughly halved compared with individual search, although error rates did not increase. This indicates searchers formed an efficient collaboration strategy. Overlap, the proportion of dwells that landed on hexagons that the other searcher had already looked at, was lower than expected from simulated overlap of two searchers who are blind to the behavior of their partner. The proportion of overlapping dwells correlated positively with ratings of the quality of collaboration. During competition, overlap increased earlier in time, indicating that competitors divided space less efficiently. Analysis of the entropy of the dwell locations and scan paths revealed that in the competition condition, a less fixed looking pattern was exhibited than in the collaboration and individual search conditions. We conclude that participants can efficiently search together when provided only with information about their partner's gaze position by dividing up the search space. Competing search exhibited more random gaze patterns, potentially reflecting increased interaction between searchers in this condition.
  •  
Publication type
journal article (48)
conference paper (40)
Content type
peer-reviewed (88)
Author/editor
Niehorster, Diederic ... (86)
Nyström, Marcus (31)
Hessels, Roy S. (24)
Holmqvist, Kenneth (22)
Hooge, Ignace T. C. (22)
Andersson, Richard (12)
Zemblys, Raimondas (9)
Hooge, Ignace (7)
Jarodzka, Halszka (7)
Benjamins, Jeroen S. (7)
Söderberg, Emma (7)
Kasneci, Enkelejda (6)
Cornelissen, Tim (5)
Church, Luke (4)
Kemner, Chantal (4)
Oliva, Manuel (4)
Fransson, Per-Anders (3)
Magnusson, Måns (3)
Byrne, Sean Anthony (3)
van Gog, Tamara (3)
Höst, Martin (2)
Granér, Simon (2)
Alce, Günter (2)
Smoker, Anthony (2)
Olsson, Erik J (2)
Börstler, Jürgen, 19 ... (2)
Bertilsson, Johan (2)
Dahl, Mats (2)
Blignaut, Pieter (2)
Maquiling, Virmarie (2)
McCabe, Alan (2)
Rydenfält, Christofe ... (2)
De Kloe, Yentl J.R. (2)
Dunn, Matt J (2)
Alexander, Robert G (2)
Ettinger, Ulrich (2)
Lee, Helena (2)
Martinez-Conde, Susa ... (2)
Otero-Millan, Jorge (2)
Räihä, Kari-Jouko (2)
Holleman, Gijs A (2)
Valtakari, Niilo V (2)
Kok, Ellen M. (2)
Park, Soon Young (2)
Kuang, Peng (2)
Santini, Thiago (2)
Cornelissen, Tim H W (2)
Niehorster, Diederic ... (2)
Špakov, Oleg (2)
Istance, Howell (2)
University
Lunds universitet (86)
Blekinge Tekniska Högskola (2)
Göteborgs universitet (1)
Kungliga Tekniska Högskolan (1)
Uppsala universitet (1)
Högskolan Väst (1)
Linköpings universitet (1)
Malmö universitet (1)
Karolinska Institutet (1)
Language
English (88)
Research subject (UKÄ/SCB)
Social Sciences (66)
Natural Sciences (28)
Engineering and Technology (6)
Medicine and Health Sciences (5)
Humanities (5)

Year
