SwePub
Search the SwePub database


Search results for "WFRF:(Väljamäe Alexander 1978) srt2:(2007)"


  • Results 1-5 of 5
1.
  • Larsson, Pontus, 1974, et al. (author)
  • Auditory-visual perception of room size in virtual environments
  • 2007
  • In: Proceedings of the 19th International Congress on Acoustics, September 2-7, 2007, Madrid, Spain. ISBN 8487985122, p. PPA-03-001.
  • Conference paper (other academic/artistic), abstract:
    • It is generally believed that the effectiveness of Virtual Environments (VEs) relies on their ability to faithfully reproduce the multisensory experience of the physical world. An important aspect of this experience is the perception of size and distance. In an architectural application, for example, it is of course of great interest that the user gets a correct impression of room size. However, for visual perception in VEs it is not yet fully understood which system parameters control perceived room size. Some investigations of auditory distance perception have been carried out, but there is also an obvious lack of research concerning auditory room size perception. In addition, it is far from understood how audition and vision interact when sensing an indoor environment. The current paper presents an experiment aimed at exploring aspects of auditory, visual and auditory-visual room size perception in VEs. In line with previous research, it is found that people in general seem to underestimate room size when exposed to a visual VE. It is also shown that there seems to be a tendency to overestimate room size in auditory VEs, and finally that the combination of auditory and visual stimuli allows for a more accurate room size perception.
2.
  • Tajadura, Ana, 1979, et al. (author)
  • Whole-body vibration influence sound localization in the median plane.
  • 2007
  • In: 10th Annual International Workshop on Presence, Barcelona, Spain, October 2007.
  • Conference paper (peer-reviewed), abstract:
    • The perceived location of events occurring in a mediated environment modulates the users’ understanding of and involvement in these events. Previous research has shown that when spatially discrepant information is available to different sensory channels, the perceived location of unisensory events can be altered. Tactile “capture” of audition has been reported for lateral sounds. The present study investigates whether auditory localization in the median plane can be altered by concurrent whole-body vibration. Sounds were presented at the front or the back of participants, in isolation or together with vibrations. Subjects made a three-alternative forced choice regarding the perceived location of the sound (“front”, “back” or “center”). Results indicate that vibrations synchronous with sound affected subjects’ sound localization, significantly reducing the accuracy of front sound localization in favor of “back” and “center” responses. This research may have implications for the design of multimodal environments, especially those aiming at creating a sense of presence or inducing affective experiences in users.
3.
  • Väljamäe, Alexander, 1978, et al. (author)
  • Audio-Visual Interactions in Dynamic Scenes: Implications for Multisensory Compression
  • 2007
  • In: Proceedings of ICA'07, Madrid.
  • Conference paper (peer-reviewed), abstract:
    • New media technologies enrich human capabilities for the generation, transmission and representation of audio-visual content. Knowledge about human sensory and cognitive processing is critical to advances in these technologies. Traditionally, media technologies have adopted a unimodal view of human perception, using, for example, separate compression modules for audio and visual data streams. Drawing on neuroscience advances that have revealed the strongly multisensory nature of human perception, we suggest that audio-visual content processing might benefit from adopting a multimodal approach that is tailored to the rules of human multisensory processing. While visual dominance in the spatial domain is well known for static scenes, as in the famous “ventriloquism effect”, more complex interactions emerge for dynamic audio-visual stimuli. In this paper we first review some studies of the “dynamic ventriloquism” effect, where visual motion captures the perceived direction of auditory motion. Second, we show how rhythmic sound patterns fill in temporally degraded visual motion, based on the recently discovered “auditory-induced visual flash illusion”. Finally, we discuss the implications of these findings for multisensory processing and compression techniques.
4.
  • Väljamäe, Alexander, 1978, et al. (author)
  • Perceptual Optimization of Audio-visual Media: Moved by sound.
  • 2007
  • In: Anderson, B. and Anderson, J. (eds.), Narration and Spectatorship Moving Images. Cambridge Scholars Publishing.
  • Book chapter (other academic/artistic), abstract:
    • Virtual Reality (VR) research is gradually shifting focus from pictorial to perceptual realism, where the optimization of media synthesis and reproduction technologies is based on end-users’ subjective or objective responses. In this paper, our work on multisensory perceptual optimization in motion simulators is presented. Spatial presence and illusory self-motion ratings were used to determine and evaluate the most instrumental acoustic cues in audio-visual or purely auditory virtual environments. Results show how sound can enhance users’ experience or, alternatively, compensate for a reduced visual representation. In addition, we present a pilot study in a cinema investigating the effects of minimized visual content on spatial presence and emotional responses. In conclusion, we discuss how similar experimental methodologies can advance the understanding of traditional audio-visual media perception mechanisms and test new multisensory media forms with a reduced cognitive load.
5.
  • Väljamäe, Alexander, 1978 (author)
  • Sound for Multisensory Motion Simulators
  • 2007
  • Doctoral thesis (other academic/artistic), abstract:
    • Interaction in a virtual reality environment often implies situations of illusory self-motion, as, for example, in flight or driving scenarios. Striving for pictorial realism, currently available motion simulators often exhibit relatively poor sound design. However, a substantial body of research has now conclusively shown that human perception is multisensory in nature. It is therefore logical to assume that acoustic information should contribute to the perception of illusory self-motion (vection). The presented studies used an iterative synthesis-evaluation loop in which participants’ vection and presence (the sense of “being there”) responses guided the search for the most salient auditory cues and their multisensory combinations. Paper A provides a first integrative review of the studies related to auditory-induced illusory vection, which have been scattered over time and across research disciplines. Paper B explores optimal combinations of perceptual cues between vision (field of view) and audition (spatial resolution) when presenting a rotating environment. Paper C examines cognitive factors in purely auditory or auditory-vibrotactile induced circular vection. In Paper D, the specific influence of an audio-vibrotactile engine sound metaphor on linear vection responses is evaluated. The idea of using an engine sound to represent self-motion, or its multisensory counterparts, is further addressed in Paper E, where participants’ imagery vividness scores are also considered. The results from Papers B-E serve as a basis for the design of a transportable, multimodal motion simulator prototype. In Paper F, the feasibility of inducing vection by means of binaural bone-conducted sound is tested using this prototype. Paper G outlines a perceptually optimized, multisensory design that can be used in future motion simulators and discusses its possible implications for the entertainment industry.
To conclude, sound is an important but often neglected component in multisensory self-motion simulations, providing both perceptual and cognitive cues. Hence, it might be beneficial to think in terms of the amodal categories of unitary space, time, objects and events rather than to optimize vection cues in each modality separately. The presented results have implications for various research areas, including multisensory integration of self-motion cues, posture prosthesis, navigation in unusual gravitoinertial environments and applications for the visually impaired.