SwePub

Search results for "WFRF:(Väljamäe Alexander 1978)"

  • Results 1-26 of 26
1.
  • Larsson, Pontus, 1974, et al. (author)
  • Auditory-visual perception of room size in virtual environments
  • 2007
  • In: Proceedings of the 19th International Congress on Acoustics, September 2-7, 2007, Madrid, Spain. - ISBN 8487985122 ; pp. PPA-03-001
  • Conference paper (other academic/artistic). Abstract:
    • It is generally believed that the effectiveness of Virtual Environments (VEs) relies on their ability to faithfully reproduce the multisensory experience of the physical world. An important aspect of this experience is the perception of size and distance. In e.g. an architectural application it is of course of great interest that the user gets a correct impression of room size. However, for visual perception in VEs it is not yet fully understood which system parameters control perceived room size. Some investigations on auditory distance perception have been carried out, but there is also an obvious lack of research concerning auditory room size perception. In addition, it is far from understood how audition and vision interact when sensing an indoor environment. The current paper reports an experiment aimed at exploring aspects of auditory, visual and auditory-visual room size perception in VEs. In line with previous research, it is found that people in general seem to underestimate room size when exposed to a visual VE. It is also shown that there seems to be a tendency to overestimate room size in auditory VEs, and finally that the combination of auditory and visual stimuli allows for a more accurate room size perception.
2.
  • Riecke, B. E., et al. (author)
  • Moving Sounds Enhance the Visually-Induced Self-Motion Illusion (Circular Vection) in Virtual Reality
  • 2009
  • In: ACM Transactions on Applied Perception. - Association for Computing Machinery (ACM). - ISSN 1544-3558, 1544-3965 ; 6:2, article no. 7
  • Journal article (peer-reviewed). Abstract:
    • While rotating visual and auditory stimuli have long been known to elicit self-motion illusions ("circular vection"), audiovisual interactions have hardly been investigated. Here, two experiments investigated whether visually induced circular vection can be enhanced by concurrently rotating auditory cues that match visual landmarks (e.g., a fountain sound). Participants sat behind a curved projection screen displaying rotating panoramic renderings of a market place. Apart from a no-sound condition, headphone-based auditory stimuli consisted of mono sound, ambient sound, or low-/high-spatial resolution auralizations using generic head-related transfer functions (HRTFs). While merely adding nonrotating (mono or ambient) sound showed no effects, moving sound stimuli facilitated both vection and presence in the virtual environment. This spatialization benefit was maximal for a medium (20 degrees x 15 degrees) FOV, reduced for a larger (54 degrees x 45 degrees) FOV and unexpectedly absent for the smallest (10 degrees x 7.5 degrees) FOV. Increasing auralization spatial fidelity (from low, comparable to five-channel home theatre systems, to high, 5 degrees resolution) provided no further benefit, suggesting a ceiling effect. In conclusion, both self-motion perception and presence can benefit from adding moving auditory stimuli. This has important implications both for multimodal cue integration theories and the applied challenge of building affordable yet effective motion simulators.
3.
  • Tajadura, Ana, 1979, et al. (author)
  • Affective multimodal displays: Acoustic spectra modulates perception of auditory-tactile signals
  • 2008
  • In: International Conference on Auditory Display, ICAD08, Paris, 2008.
  • Conference paper (peer-reviewed). Abstract:
    • Emotional events may interrupt ongoing cognitive processes and automatically grab attention, modulating subsequent perceptual processes. Hence, emotion-eliciting stimuli might effectively be used in warning applications, where a fast and accurate response from users is required. In addition, conveying information through an optimum multisensory combination can lead to a further enhancement of user responses. In the present study we investigated the emotional response to sounds differing in their acoustic spectra, and their influence on speeded detection of auditory-somatosensory stimuli. Higher sound frequencies resulted in an increase in emotional arousal. We suggest that emotional processes might be responsible for the different auditory-somatosensory integration patterns observed for low and high frequency sounds. The presented results might have important implications for the design of auditory and multisensory warning interfaces.
4.
  • Tajadura, Ana, 1979, et al. (author)
  • Auditory–somatosensory multisensory interactions are spatially modulated by stimulated body surface and acoustic spectra
  • 2009
  • In: Neuropsychologia. - Elsevier BV. - ISSN 0028-3932 ; 47:1, pp. 195-203
  • Journal article (peer-reviewed). Abstract:
    • Previous research has provided inconsistent results regarding the spatial modulation of auditory–somatosensory interactions. The present study reports three experiments designed to investigate the nature of these interactions in the space close to the head. Human participants made speeded detection responses to unimodal auditory, somatosensory, or simultaneous auditory–somatosensory stimuli. In Experiment 1, electrocutaneous stimuli were presented to either earlobe, while auditory stimuli were presented from the same versus opposite sides, and from one of two distances (20 vs. 70 cm) from the participant's head. The results demonstrated a spatial modulation of auditory–somatosensory interactions when auditory stimuli were presented from close to the head. In Experiment 2, electrocutaneous stimuli were delivered to the hands, which were placed either close to or far from the head, while the auditory stimuli were again presented at one of two distances. The results revealed that the spatial modulation observed in Experiment 1 was specific to the particular body part stimulated (head) rather than to the region of space (i.e. around the head) where the stimuli were presented. The results of Experiment 3 demonstrate that sounds that contain high-frequency components are particularly effective in eliciting this auditory–somatosensory spatial effect. Taken together, these findings help to resolve inconsistencies in the previous literature and suggest that auditory–somatosensory multisensory integration is modulated by the stimulated body surface and acoustic spectra of the stimuli presented.
5.
  • Tajadura, Ana, 1979, et al. (author)
  • Embodied auditory perception: The emotional impact of approaching and receding sound sources.
  • 2010
  • In: Emotion. - ISSN 1528-3542 ; 10:2, pp. 216-229
  • Journal article (peer-reviewed). Abstract:
    • Research has shown the existence of perceptual and neural bias toward sounds perceived as sources approaching versus receding from a listener. It has been suggested that a greater biological salience of approaching auditory sources may account for these effects. In addition, these effects may hold only for those sources critical for our survival. In the present study, we support these hypotheses by quantifying the emotional responses to different sounds with changing intensity patterns. In 2 experiments, participants were exposed to artificial and natural sounds simulating approaching or receding sources. The auditory-induced emotional effect was reflected in the performance of participants in an emotion-related behavioral task, their self-reported emotional experience, and their physiology (electrodermal activity and facial electromyography). The results of this study suggest that approaching unpleasant sound sources evoke more intense emotional responses in listeners than receding ones, whereas such an effect of perceived sound motion does not exist for pleasant or neutral sound sources. The emotional significance attributed to the sound source itself, the loudness of the sound, and loudness change duration seem to be relevant factors in this disparity.
6.
  • Tajadura, Ana, 1979, et al. (author)
  • Emotional bias for the perception of rising tones
  • 2008
  • In: 7th European Conference on Noise Control 2008, EURONOISE 2008, Paris, France, 29 June - 4 July 2008. - ISSN 2226-5147 ; pp. 1597-1602
  • Conference paper (peer-reviewed). Abstract:
    • Sounds with rising or falling intensity are often perceived as approaching or receding sound sources, respectively. Research has shown the existence of biases, both at perceptual and neural levels, in detecting and responding to approaching versus receding sounds. It has been suggested that a greater biological salience of approaching sounds might account for these effects. In the present study we investigated whether this asymmetry could also be explained by emotional theories. Participants were exposed to pairs of stimuli formed by an approaching or a receding sound, followed by a neutral, negative or positive photograph. They were required to make a speeded three-alternative forced choice (3AFC) task regarding how they felt when looking at the photographs. Reaction times (RTs) to this task and self-reported emotional ratings for the sounds were collected. In addition, participants' electrodermal activity and facial electromyography were measured as they listened to the sounds. Participants performed faster in the 3AFC task when photographs were preceded by approaching sounds, especially for photographs with negative content. Both the intensity range and slope of the sounds had a significant effect on RTs. Taken together, these results suggest that approaching sounds have a greater emotional power than receding ones.
7.
  •  
8.
  • Tajadura, Ana, 1979, et al. (author)
  • Self-representation in mediated environments: the experience of emotions modulated by auditory-vibrotactile heartbeat
  • 2008
  • In: CyberPsychology and Behavior ; 11:1, pp. 33-38
  • Journal article (peer-reviewed). Abstract:
    • In 1890, William James hypothesized that emotions are our perception of physiological changes. Many different theories of emotion have emerged since then, but it has been demonstrated that a specifically induced physiological state can influence an individual’s emotional responses to stimuli. In the present study, auditory and/or vibrotactile heartbeat stimuli were presented to participants (N = 24), and the stimuli’s effect on participants’ physiological state and subsequent emotional attitude to affective pictures was measured. In particular, we aimed to investigate the effect of the perceived distance to stimuli on emotional experience. Distant versus close sound reproduction conditions (loudspeakers vs. headphones) were used to identify whether an “embodied” experience can occur in which participants would associate the external heartbeat sound with their own. Vibrotactile stimulation of an experimental chair and footrest was added to magnify the experience. Participants’ peripheral heartbeat signals, self-reported valence (pleasantness) and arousal (activation) ratings for the pictures, and memory performance scores were collected. Heartbeat sounds significantly affected participants’ heartbeat, the emotional judgments of pictures, and their recall. The effect of distance to stimuli was observed in the significant interaction between the spatial location of the heartbeat sound and the vibrotactile stimulation, which was mainly caused by the auditory-vibrotactile interaction in the loudspeakers condition. This interaction might suggest that vibrations transform the far-sound condition (sound via loudspeakers) into a close-stimulation condition and support the hypothesis that close sounds are more affective than distant ones. These findings have implications for the design and evaluation of mediated environments.
9.
  • Tajadura, Ana, 1979, et al. (author)
  • Spatial modulation of auditory-somatosensory interactions: effects of stimulated body surface and acoustic spectra
  • 2008
  • In: International Multisensory Research Forum, IMRF08, Hamburg, 2008.
  • Conference paper (other academic/artistic). Abstract:
    • Recent research on auditory-somatosensory interactions has shown contradictory results regarding spatial modulation. In the present study we report on three experiments on auditory-somatosensory interactions in the region close to the head. Participants made speeded simple detection responses to single auditory, somatosensory or double simultaneous auditory-somatosensory stimuli. In Experiment 1, electrocutaneous stimuli were presented to either earlobe, while auditory stimuli were presented from the same versus different sides, and from one of two distances (20 or 70 cm) from the participant’s head. The results demonstrated a spatial modulation of auditory-somatosensory interactions, especially when auditory stimuli were presented from close to the head. Experiment 2, with electrocutaneous stimuli delivered to the hands (placed either close to, or far from, the participants’ head), suggests that the spatial modulation is dependent on the particular body part stimulated (head) rather than on the region of space (around the head) where the stimuli are presented. Experiment 3 showed that this auditory-somatosensory spatial effect occurs primarily for sounds containing high-frequency components. Taken together, these results suggest that auditory-somatosensory multisensory integration might be facilitated by stimuli occurring at the same location, and that this integration is modulated by the stimulated body surface and acoustic spectra.
10.
  • Tajadura, Ana, 1979, et al. (author)
  • When Room Size Matters: Acoustic Influences on Emotional Responses to Sounds
  • 2010
  • In: Emotion. - American Psychological Association (APA). - ISSN 1528-3542, 1931-1516 ; 10:3, pp. 416-422
  • Journal article (peer-reviewed). Abstract:
    • When people hear a sound (a "sound object" or a "sound event") the perceived auditory space around them might modulate their emotional responses to it. Spaces can affect both the acoustic properties of the sound event itself and may also impose boundaries to the actions one can take with respect to this event. Virtual acoustic rooms of different sizes were used in a subjective and psychophysiological experiment that evaluated the influence of the auditory space perception on emotional responses to various sound sources. Participants (N = 20) were exposed to acoustic spaces with sound source positions and room acoustic properties varying across the experimental conditions. The results suggest that, overall, small rooms were considered more pleasant, calmer, and safer than big rooms, although this effect of size seems to disappear when listening to threatening sound sources. Sounds heard behind the listeners tended to be more arousing, and elicited larger physiological changes than sources in front of the listeners. These effects were more pronounced for natural, compared to artificial, sound sources, as confirmed by subjective and physiological measures.
11.
  • Tajadura, Ana, 1979, et al. (author)
  • Whole-body vibration influence sound localization in the median plane.
  • 2007
  • In: 10th Annual International Workshop on Presence, Barcelona, Spain, October 2007.
  • Conference paper (peer-reviewed). Abstract:
    • The perceived location of events occurring in a mediated environment modulates the users’ understanding and involvement in these events. Previous research has shown that when spatially discrepant information is available at various sensory channels, the perceived location of unisensory events might be altered. Tactile “capture” of audition has been reported for lateral sounds. The present study investigates whether auditory localization on the median plane could be altered by concurrent whole-body vibration. Sounds were presented at the front or the back of participants, in isolation or together with vibrations. Subjects made a three alternative forced choice regarding their perceived location of sound (“front”, “back” or “center”). Results indicate that vibrations synchronous with sound affected subjects’ sound localization, significantly reducing the accuracy on front sound localization in favor of “back” and “center” responses. This research might have implications for the design of multimodal environments, especially for those aiming at creating a sense of presence or inducing affective experiences in users.
12.
  • Tajadura, Ana, 1979, et al. (author)
  • Whole-body vibration influences on sound localization in the median plane
  • 2010
  • In: Proceedings of the Institution of Mechanical Engineers, Part D: Journal of Automobile Engineering. - ISSN 0954-4070, 2041-2991 ; 224:10, pp. 1311-1320
  • Journal article (peer-reviewed). Abstract:
    • The present study investigates human multisensory perception of sound and vibration, highlighting its potential impact on the design of novel user interfaces, including those used in the automobile industry. Specifically, the present study investigates whether front-back sound localization could be altered by concurrent whole-body vibration. Previous research has shown that, when auditory and tactile stimuli are presented synchronously but from different positions, the perceived location of the auditory event is shifted towards the location of the tactile stimulus. Here, sounds were presented at the front or the back of participants, in isolation, or together with vibrations. Participants made a three-alternative forced choice regarding their perceived location of the sounds. Results indicate that front-back sound localization was affected by the presence of concurrent vibrations, which biased the localization of front sounds towards the participants' rear space. Since the perceived location of events modulates the perceivers' understanding of and involvement in these events, the possibility of manipulating the location of sound events using vibrations has potential for the design of multisensory interfaces such as those included in automotive applications, where there is a strong need to capture drivers' attention, provide navigational information, and reduce sensory load.
13.
  • Väljamäe, Alexander, 1978, et al. (author)
  • AUDIO-VISUAL INTERACTIONS IN DYNAMIC SCENES: IMPLICATIONS FOR MULTISENSORY COMPRESSION
  • 2007
  • In: Proceedings of ICA'07, Madrid.
  • Conference paper (peer-reviewed). Abstract:
    • New media technologies enrich human capabilities for the generation, transmission and representation of audio-visual content. Knowledge about human sensory and cognitive processing is critical to advancing these technologies. Traditionally, media technologies have adopted a unimodal view of human perception using, for example, separate compression modules for audio and visual data streams. Drawing on neuroscience advances that have revealed the strongly multisensory nature of human perception, we suggest that audio-visual content processing might benefit from adopting a multimodal approach tailored to the rules of human multisensory processing. While visual dominance in the spatial domain is well known for static scenes, as in the famous “ventriloquism effect”, more complex interactions emerge for dynamic audio-visual stimuli. In this paper we first review some studies on the “dynamic ventriloquism” effect, where visual motion captures the perceived direction of auditory motion. Second, we show how rhythmic sound patterns fill in temporally degraded visual motion, based on the recently discovered “auditory-induced visual flash illusion”. Finally, we discuss the implications of these findings for multisensory processing and compression techniques.
14.
  • Väljamäe, Alexander, 1978 (author)
  • Auditorily-induced illusory self-motion: A review
  • 2009
  • In: Brain Research Reviews. - Elsevier BV. - ISSN 0165-0173 ; 61:2, s. 240-255
  • Journal article (peer-reviewed). Abstract:
    • The aim of this paper is to provide a first review of studies related to auditorily-induced self-motion (vection). These studies have been scarce and scattered over the years and over several research communities including clinical audiology, multisensory perception of self-motion and its neural correlates, ergonomics, and virtual reality. The reviewed studies provide evidence that auditorily-induced vection has behavioral, physiological and neural correlates. Although the sound contribution to self-motion perception appears to be weaker than the visual modality, specific acoustic cues appear to be instrumental for a number of domains including posture prosthesis, navigation in unusual gravitoinertial environments (in the air, in space, or underwater), non-visual navigation, and multisensory integration during self-motion. A number of open research questions are highlighted, opening avenues for more active and systematic studies in this area.
15.
  •  
16.
  • Väljamäe, Alexander, 1978, et al. (author)
  • Auditory Presence, Individualized Head-Related Transfer Functions, and Illusory Ego-Motion in Virtual Environments
  • 2004
  • In: Proc. of the 7th Annual Workshop on Presence.
  • Conference paper (peer-reviewed). Abstract:
    • It is likely that experiences of presence and self-motion elicited by binaurally simulated and reproduced rotating sound fields can be degraded by artifacts caused by the use of generic Head-Related Transfer Functions (HRTFs). In this paper, an HRTF measurement system which allows for fast data collection is discussed. Furthermore, the effects of generic vs. individualized HRTFs were investigated in an experiment. Results show a significant increase in presence ratings for individualized binaural stimuli compared to responses to stimuli processed with generic HRTFs. Additionally, differences in the intensity and convincingness of illusory self-rotation ratings were found for sub-groups of subjects, formed on the basis of subjects' localization performance with the given HRTF catalogues.
17.
  • Väljamäe, Alexander, 1978, et al. (author)
  • Binaural bone-conducted sound in virtual environments
  • 2008
  • In: Acoustical Science and Technology. - ISSN 1346-3969, 1347-5177 ; 29:2, pp. 149-155
  • Journal article (peer-reviewed). Abstract:
    • Virtual and augmented reality applications provide us with increasingly compelling illusory worlds by combinations of different sensory cues. Although spatial sound technologies are often used in such applications, headphone based sound reproduction can create an undesired “mediation awareness” for an end-user. An alternative can be provided by bone-conducted sound technologies, traditionally used in hearing aid applications. Recent studies with bilaterally fitted bone-conduction transducers suggest that binaural sound cues can be rendered using this technology. In this paper we used binaural bone-conducted sound reproduction for enhancing a multi-modal self-motion simulator prototype. Similar to previous results from headphone based reproduction, the present study shows that the addition of moving sound images to visual stimuli significantly increases vection and spatial presence responses. These results provide empirical evidence that convincing auditory scenes can be created using spatial bone-conducted sound and HRTFs of at least 60° horizontal resolution. The present research demonstrates the feasibility of using binaural bone-conducted sound in mediated environments.
18.
  • Väljamäe, Alexander, 1978, et al. (author)
  • Filling-in visual motion with sounds
  • 2008
  • In: Acta Psychologica. - Elsevier BV. - ISSN 0001-6918 ; 129:2, pp. 249-254
  • Journal article (peer-reviewed). Abstract:
    • Information about the motion of objects can be extracted by multiple sensory modalities, and, as a consequence, object motion perception typically involves the integration of multi-sensory information. Often, in naturalistic settings, the flow of such information can be rather discontinuous (e.g. a cat racing through the furniture in a cluttered room is partly seen and partly heard). This study addressed audiovisual interactions in the perception of time-sampled object motion by measuring adaptation aftereffects. We found significant auditory after-effects following adaptation to unisensory auditory and visual motion in depth, sampled at 12.5 Hz. The visually induced (cross-modal) auditory motion after-effect was eliminated if visual adaptors flashed at half of the rate (6.25 Hz). Remarkably, the addition of the high-rate acoustic flutter (12.5 Hz) to this ineffective, sparsely time-sampled, visual adaptor restored the auditory after-effect to a level comparable to what was seen with high-rate bimodal adaptors (flashes and beeps). Our results suggest that this auditory-induced reinstatement of the motion after-effect from the poor visual signals resulted from the occurrence of sound-induced illusory flashes. This effect was found to be dependent both on the directional congruency between modalities and on the rate of auditory flutter. The auditory filling-in of time-sampled visual motion supports the feasibility of using reduced frame rate visual content in multisensory broadcasting and virtual reality applications.
19.
  • Väljamäe, Alexander, 1978, et al. (author)
  • Handheld Experiences: Using Audio To Enhance the Illusion of Self-Motion
  • 2008
  • In: IEEE MultiMedia ; 15:4, pp. 68-75
  • Journal article (peer-reviewed). Abstract:
    • Handheld multimedia devices could benefit from multisensory technologies. The authors discuss audio, visual, and tactile cues designed to maximize presence and the illusion of self-motion.
20.
  • Väljamäe, Alexander, 1978, et al. (author)
  • Perceptual Optimization of Audio-visual Media: Moved by sound.
  • 2007
  • In: Narration and Spectatorship in Moving Images. - Cambridge Scholars Publishing. (Anderson, B. and Anderson, J., eds.)
  • Book chapter (other academic/artistic). Abstract:
    • Virtual Reality (VR) research is gradually shifting focus from pictorial to perceptual realism, where the optimization of media synthesis and reproduction technologies is based on end-users’ subjective or objective responses. In this paper our work on multisensory perceptual optimization in motion simulators is presented. Spatial presence and illusory self-motion ratings were used to determine and evaluate the most instrumental acoustic cues in audio-visual or purely auditory virtual environments. Results show how sound can enhance users’ experience or, alternatively, compensate for a reduced visual representation. In addition, we present a pilot study in the cinema investigating the effects of minimized visual content on spatial presence and emotional responses. In conclusion, we discuss how similar experimental methodologies can advance the understanding of traditional audio-visual media perception mechanisms and test new multisensory media forms with a reduced cognitive load.
21.
  • Väljamäe, Alexander, 1978 (author)
  • Self-motion and Presence in the Perceptual Optimization of a Multisensory Virtual Reality Environment
  • 2005
  • Licentiate thesis (other academic/artistic). Abstract:
    • Determining the perceptually optimal resolution of multisensory rendering might help to foster the development of cost-effective, highly immersive multi-modal displays for mediated environments (e.g. virtual and augmented reality). The required sensory depth of stimulation can be quantified using human centered methodologies where end user experiences serve as a basis for uni- and cross-modal optimization of the sensory inputs. In the psychophysical studies presented in this thesis, self-reported presence and illusory self-motion (vection) indicated salience of auditory and multisensory cues in the design of perceptually optimized motion simulators. The contribution of auditory cues to illusory self-motion has been largely neglected until very recently, and papers A and B present studies on purely auditory induced vection (AIV). Paper A shows that rotating auditory scenes synthesized using individualized Head-Related Transfer Functions (HRTFs) are more instrumental for presence compared to generic binaural synthesis. The study on translational AIV in paper B shows that an inconsistent auditory scene might significantly decrease self-motion responses. Papers C and D demonstrate that bi-sensory stimulations increase presence and self-motion ratings as expected. In paper C additional vibrotactile stimulation increased translational AIV and presence ratings, especially for the stimuli containing the auditory-tactile engine metaphor. Paper D extended paper A results for rotational AIV, showing that the spatial resolution of rotating auditory scenes can be greatly reduced when combined with visual input. This thesis shows that sound plays an important role in illusory self-motion perception and should be carefully used in multi-modal motion simulators. The presented findings suggest that a minimum set of acoustic cues can be sufficient for eliciting a self-motion sensation, especially if other modalities are involved. However, perceptual consistency of the created auditory and multimodal scenes should be assured in the design of the next generation of motion simulators.
22.
  • Väljamäe, Alexander, 1978 (author)
  • Sound for Multisensory Motion Simulators
  • 2007
  • Doctoral thesis (other academic/artistic). Abstract:
    • Interaction in a virtual reality environment often implies situations of illusory self-motion, as, for example, in flight or driving scenarios. Striving for pictorial realism, currently available motion simulators often exhibit relatively poor sound design. However, a substantial body of research has now conclusively shown that human perception is multisensory in nature. It is, therefore, logical to assume that acoustic information should contribute to the perception of illusory self-motion (vection). The presented studies used an iterative synthesis-evaluation loop in which participants’ vection and presence (the sense of “being there”) responses guided the search for the most salient auditory cues and their multisensory combinations. Paper A provides a first integrative review of the studies related to auditory-induced illusory vection, which have been scattered over time and across research disciplines. Paper B explores optimal combinations of perceptual cues between vision (field-of-view) and audition (spatial resolution) when presenting a rotating environment. Paper C examines cognitive factors in purely auditory or auditory-vibrotactile induced circular vection. In Paper D the specific influence of an audio-vibrotactile engine sound metaphor on linear vection responses is evaluated. The idea of using the engine sound to represent self-motion, or its multisensory counterparts, is further addressed in Paper E, where participants’ imagery vividness scores are also considered. The results from Papers B-E serve as a basis for the design of a transportable, multimodal motion simulator prototype. In Paper F the feasibility of inducing vection by means of binaural bone-conducted sound is tested using this prototype. Paper G outlines a perceptually optimized, multisensory design which can be used in future motion simulators and discusses its possible implications for the entertainment industries.
To conclude, sound is an important but often neglected component in multisensory self-motion simulations, providing both perceptual and cognitive cues. Hence, it might be beneficial to think in terms of the amodal categories of unitary space, time, objects, and events rather than to optimize vection cues in different modalities separately. The presented results have implications for various research areas, including multisensory integration of self-motion cues, posture prosthesis, navigation in unusual gravitoinertial environments, and applications for the visually impaired.
23.
  • Väljamäe, Alexander, 1978, et al. (author)
  • Sound representing self-motion in virtual environments enhances linear vection
  • 2008
  • In: Presence: Teleoperators and Virtual Environments. MIT Press. ISSN 1531-3263, 1054-7460; 17:1, pp. 43-56
  • Journal article (peer-reviewed) abstract
    • Sound is an important, but often neglected, component for creating a self-motion illusion (vection) in Virtual Reality applications, for example, motion simulators. Apart from auditory motion cues, sound can provide contextual information representing self-motion in a virtual environment. In two experiments we investigated the benefits of hearing an engine sound when presenting auditory (Experiment 1) or auditory-vibrotactile (Experiment 2) virtual environments inducing linear vection. The addition of the engine sound to the auditory scene significantly enhanced subjective ratings of vection intensity in Experiment 1, and vection onset times but not subjective ratings in Experiment 2. Further analysis using individual imagery vividness scores showed that this disparity between vection measures was created by participants with higher kinesthetic imagery. On the other hand, for participants with lower kinesthetic imagery scores, the engine sound enhanced vection sensation in both experiments. A high correlation with participants' kinesthetic imagery vividness scores suggests the influence of a first-person perspective in the perception of the engine sound. We hypothesize that self-motion sounds (e.g., the sound of footsteps, engine sound) represent a specific type of acoustic body-centered feedback in virtual environments. Therefore, the results may contribute to a better understanding of the role of self-representation sounds (sonic self-avatars) in virtual and augmented environments.
24.
  • Väljamäe, Alexander, 1978, et al. (author)
  • Spatial sound in auditory vision substitution systems
  • 2006
  • In: Audio Engineering Society - 120th Convention Spring Preprints. ISBN 9781604235975; pp. 486-493
  • Conference paper (peer-reviewed) abstract
    • Current auditory vision sensory substitution (AVSS) systems might be improved by the direct mapping of an image onto a matrix of concurrently active sound sources in a virtual acoustic space. This mapping might be similar to the existing techniques for tactile substitution of vision, where point arrays are successfully used. This paper gives an overview of the current auditory displays used to sonify 2D visual information and discusses the feasibility of new perceptually motivated AVSS methods encompassing spatial sound.
25.
  • Väljamäe, Alexander, 1978, et al. (author)
  • Travelling without moving: Auditory scene cues for translational self-motion
  • 2005
  • In: Proceedings of the International Conference on Auditory Display.
  • Journal article (peer-reviewed) abstract
    • Creating a sense of illusory self-motion is crucial for many Virtual Reality applications, and the auditory modality is an essential, but often neglected, component of such stimulations. In this paper, perceptual optimization of auditory-induced, translational self-motion (vection) simulation is studied using binaurally synthesized and reproduced sound fields. The results suggest that auditory scene consistency and ecological validity make a minimum set of acoustic cues sufficient for eliciting auditory-induced vection. Specifically, it was found that a focused attention task and sound objects' motion characteristics (approaching or receding) play an important role in self-motion perception. In addition, stronger sensations for auditory-induced self-translation than for previously investigated self-rotation also suggest a strong ecological validity bias, as translation is the most common movement direction.
26.
  • Results 1-26 of 26

