SwePub

Result list for the search "hsv:(NATURVETENSKAP) hsv:(Data och informationsvetenskap) hsv:(Annan data och informationsvetenskap) ;pers:(Bresin Roberto 1963)"

Search: hsv:(NATURVETENSKAP) hsv:(Data och informationsvetenskap) hsv:(Annan data och informationsvetenskap) > Bresin Roberto 1963

  • Results 1-10 of 18
1.
  • Frid, Emma, et al. (author)
  • Perception of Mechanical Sounds Inherent to Expressive Gestures of a NAO Robot - Implications for Movement Sonification of Humanoids
  • 2018
  • In: Proceedings of the 15th Sound and Music Computing Conference. - Limassol, Cyprus. - 9789963697304
  • Conference paper (peer-reviewed), abstract:
    • In this paper we present a pilot study carried out within the project SONAO. The SONAO project aims to compensate for limitations in robot communicative channels with an increased clarity of Non-Verbal Communication (NVC) through expressive gestures and non-verbal sounds. More specifically, the purpose of the project is to use movement sonification of expressive robot gestures to improve Human-Robot Interaction (HRI). The pilot study described in this paper focuses on mechanical robot sounds, i.e. sounds that have not been specifically designed for HRI but are inherent to robot movement. Results indicated a low correspondence between perceptual ratings of mechanical robot sounds and emotions communicated through gestures. In general, the mechanical sounds themselves appeared not to carry much emotional information compared to video stimuli of expressive gestures. However, some mechanical sounds did communicate certain emotions, e.g. frustration. In general, the sounds appeared to communicate arousal more effectively than valence. We discuss potential issues and possibilities for the sonification of expressive robot gestures and the role of mechanical sounds in such a context. Emphasis is put on the need to mask or alter sounds inherent to robot movement, using for example blended sonification.
2.
  • Frid, Emma, 1988-, et al. (author)
  • Perceptual Evaluation of Blended Sonification of Mechanical Robot Sounds Produced by Emotionally Expressive Gestures : Augmenting Consequential Sounds to Improve Non-verbal Robot Communication
  • 2021
  • In: International Journal of Social Robotics. - Springer Nature. - ISSN 1875-4791, E-ISSN 1875-4805.
  • Journal article (peer-reviewed), abstract:
    • This paper presents two experiments focusing on perception of mechanical sounds produced by expressive robot movement and blended sonifications thereof. In the first experiment, 31 participants evaluated emotions conveyed by robot sounds through free-form text descriptions. The sounds were inherently produced by the movements of a NAO robot and were not specifically designed for communicative purposes. Results suggested no strong coupling between the emotional expression of gestures and how sounds inherent to these movements were perceived by listeners; joyful gestures did not necessarily result in joyful sounds. A word that reoccurred in text descriptions of all sounds, regardless of the nature of the expressive gesture, was “stress”. In the second experiment, blended sonification was used to enhance and further clarify the emotional expression of the robot sounds evaluated in the first experiment. Analysis of quantitative ratings of 30 participants revealed that the blended sonification successfully contributed to enhancement of the emotional message for sound models designed to convey frustration and joy. Our findings suggest that blended sonification guided by perceptual research on emotion in speech and music can successfully improve communication of emotions through robot sounds in auditory-only conditions.
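The blended-sonification idea described in the abstract above can be illustrated very simply: keep the consequential (mechanical) robot sound and add a synthesized layer on top of it instead of replacing it. The Python sketch below is a minimal illustration under stated assumptions (a rising pitch contour for a joy-like, high-arousal expression and a falling one for a frustration-like expression); the `blend` helper, its parameters and the mapping are hypothetical and do not reproduce the sound models used in the paper.

```python
# Minimal sketch of blended sonification, assuming NumPy arrays of mono audio.
# The mapping (rising contour = joy-like, falling contour = frustration-like)
# is an illustrative assumption, not the paper's sound design.
import numpy as np

def blend(mechanical, fs, base_freq=440.0, rising=True, layer_gain=0.3):
    """Mix a recorded consequential robot sound with a synthesized pitch sweep."""
    n = len(mechanical)
    t = np.arange(n) / fs
    octaves = t / t[-1]                                  # 0 .. 1 over the clip
    freq = base_freq * (2.0 ** octaves if rising else 0.5 ** octaves)
    phase = 2 * np.pi * np.cumsum(freq) / fs             # integrate instantaneous frequency
    layer = np.sin(phase)
    mix = mechanical + layer_gain * layer                # augment, do not replace
    return mix / np.max(np.abs(mix))                     # normalize to avoid clipping

# Hypothetical usage, with noise standing in for a recorded servo sound.
fs = 44100
servo = 0.1 * np.random.randn(2 * fs)
joy_like = blend(servo, fs, rising=True)
frustration_like = blend(servo, fs, base_freq=196.0, rising=False)
```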
3.
  • Latupeirissa, Adrian Benigno, et al. (author)
  • Exploring emotion perception in sonic HRI
  • 2020
  • In: 17th Sound and Music Computing Conference. - Torino: Zenodo. - pp. 434-441
  • Conference paper (peer-reviewed), abstract:
    • Despite the fact that sounds produced by robots can affect the interaction with humans, sound design is often an overlooked aspect in Human-Robot Interaction (HRI). This paper explores how different sets of sounds designed for expressive robot gestures of a humanoid Pepper robot can influence the perception of emotional intentions. In the pilot study presented in this paper, participants were asked to rate different stimuli in terms of perceived affective states. The stimuli were audio-only, audio-video and video-only, and contained either Pepper's original servomotor noises, sawtooth waves, or more complex designed sounds. The preliminary results show a preference for the more complex sounds, confirming the need for further exploration in sonic HRI.
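For context on the stimulus types mentioned above, a sawtooth tone is straightforward to synthesize; the sketch below generates one additively with NumPy. The frequency, duration and harmonic count are arbitrary illustrative values and are not the parameters used in the study.

```python
# Illustrative additive sawtooth generator (assumed parameters, not the study's stimuli).
import numpy as np

def sawtooth(freq, dur, fs=44100, n_harmonics=20, amp=0.2):
    """Approximate a sawtooth as a sum of harmonics with 1/k amplitudes."""
    t = np.arange(int(fs * dur)) / fs
    sig = sum(np.sin(2 * np.pi * freq * k * t) / k for k in range(1, n_harmonics + 1))
    return amp * sig / np.max(np.abs(sig))

tone = sawtooth(freq=200.0, dur=1.5)  # hypothetical stimulus; write out with e.g. soundfile
```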
4.
  • Frid, Emma, et al. (author)
  • An Exploratory Study On The Effect Of Auditory Feedback On Gaze Behavior In a Virtual Throwing Task With and Without Haptic Feedback
  • 2017
  • In: Proceedings of the 14th Sound and Music Computing Conference. - Espoo, Finland: Aalto University. - 9789526037295. - pp. 242-249
  • Conference paper (peer-reviewed), abstract:
    • This paper presents findings from an exploratory study on the effect of auditory feedback on gaze behavior. A total of 20 participants took part in an experiment where the task was to throw a virtual ball into a goal under different conditions: visual only, audiovisual, visuohaptic and audiovisuohaptic. Two different sound models were compared in the audio conditions. Analysis of eye tracking metrics indicated large inter-subject variability; differences between subjects were greater than differences between feedback conditions. No significant effect of condition could be observed, but clusters of similar behaviors were identified. Some participants' gaze behaviors appeared to have been affected by the presence of auditory feedback, but the effect of sound model was not consistent across subjects. We discuss individual behaviors and illustrate gaze behavior through sonification of gaze trajectories. Findings from this study raise intriguing questions that motivate future large-scale studies on the effect of auditory feedback on gaze behavior.
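As a concrete, if simplistic, picture of what sonifying a gaze trajectory can mean, the sketch below maps horizontal gaze position to stereo panning and vertical position to pitch. This is a generic parameter-mapping sonification written only for illustration; the coordinate normalization, mapping ranges and helper name are assumptions and do not reproduce the sound models used in the study.

```python
# Minimal parameter-mapping sonification of a gaze trajectory (illustrative only).
import numpy as np

def sonify_gaze(gaze, fs=44100, dur_per_sample=0.01):
    """gaze: (N, 2) array of gaze coordinates normalized to [0, 1]."""
    chunks = []
    t = np.arange(int(fs * dur_per_sample)) / fs
    for x, y in gaze:
        freq = 220.0 * 2.0 ** (2.0 * y)            # vertical position -> pitch (two octaves)
        tone = 0.2 * np.sin(2 * np.pi * freq * t)
        left, right = tone * (1.0 - x), tone * x   # horizontal position -> simple pan
        chunks.append(np.column_stack([left, right]))
    return np.concatenate(chunks)                  # (samples, 2) stereo buffer

# Hypothetical gaze data: a slow sweep from bottom-left to top-right of the screen.
gaze = np.column_stack([np.linspace(0, 1, 200), np.linspace(0, 1, 200)])
audio = sonify_gaze(gaze)  # write out with e.g. soundfile.write('gaze.wav', audio, 44100)
```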
5.
  •  
6.
  •  
7.
  • Bresin, Roberto, 1963-, et al. (author)
  • Looking for the soundscape of the future : preliminary results applying the design fiction method
  • 2020
  • In: Sound and Music Computing Conference 2020.
  • Conference paper (peer-reviewed), abstract:
    • The work presented in this paper is a preliminary study in a larger project that aims to design the sound of the future through our understanding of the soundscapes of the present, and through methods of documentary filmmaking, sound computing and HCI. This work is part of a project that will complement and run parallel to Erik Gandini's research project "The Future through the Present", which explores how a documentary narrative can create a projection into the future and develop a cinematic documentary aesthetics that releases documentary film from the constraints of dealing with the present or the past. The point of departure is our relationship to labour at a time when Robotics, VR/AR and AI applied to Big Data outweigh and augment our physical and cognitive capabilities, with automation expected to replace humans on a large scale within most professional fields. From an existential perspective this poses the question: what will we do when we don't have to work? It also challenges us to formulate a new idea of work beyond its historical role. If the concept of work ethics changes, how would that redefine soundscapes? Will new sounds develop? Will sounds from the past resurface? In the context of this paper we try to tackle these questions by first applying the Design Fiction method. In a workshop, twenty-three participants predicted both positive and negative future scenarios, including both lo-fi and hi-fi soundscapes, in which people will be able to control and personalize soundscapes. Results are presented, summarized and discussed.
8.
  •  
9.
  • Frid, Emma, et al. (author)
  • Interactive Sonification of Spontaneous Movement of Children : Cross-Modal Mapping and the Perception of Body Movement Qualities through Sound
  • 2016
  • In: Frontiers in Neuroscience. - Frontiers Media S.A. - ISSN 1662-4548, E-ISSN 1662-453X. - Vol. 10
  • Journal article (peer-reviewed), abstract:
    • In this paper we present three studies focusing on the effect of different sound models in interactive sonification of bodily movement. We hypothesized that a sound model characterized by continuous smooth sounds would be associated with other movement characteristics than a model characterized by abrupt variation in amplitude, and that these associations could be reflected in spontaneous movement characteristics. Three subsequent studies were conducted to investigate the relationship between properties of bodily movement and sound: (1) a motion capture experiment involving interactive sonification of a group of children spontaneously moving in a room, (2) an experiment involving perceptual ratings of sonified movement data and (3) an experiment involving matching between sonified movements and their visualizations in the form of abstract drawings. In (1) we used a system consisting of 17 IR cameras tracking passive reflective markers. The head positions in the horizontal plane of 3-4 children were simultaneously tracked and sonified, producing 3-4 sound sources spatially displayed through an 8-channel loudspeaker system. We analyzed the children's spontaneous movement in terms of energy, smoothness and directness indices. Despite large inter-participant variability and group-specific effects caused by interaction among children when engaging in the spontaneous movement task, we found a small but significant effect of sound model. Results from (2) indicate that different sound models can be rated differently on a set of motion-related perceptual scales (e.g. expressivity and fluidity). The results also imply that audio-only stimuli can evoke stronger perceived properties of movement (e.g. energetic, impulsive) than stimuli involving both audio and video representations. Findings in (3) suggest that sounds portraying bodily movement can be represented using abstract drawings in a meaningful way. We argue that the results from these studies support the existence of a cross-modal mapping of body motion qualities from bodily movement to sounds: sound can be translated and understood from bodily motion, conveyed through sound visualizations in the shape of drawings, and translated back from sound visualizations to audio. The work underlines the potential of using interactive sonification to communicate high-level features of human movement data.
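The energy, smoothness and directness descriptors mentioned in the abstract can be given many concrete definitions. The sketch below shows one plausible formulation computed from a sampled head trajectory, together with a crude mapping from the energy index to a gain value; it is an assumption-laden illustration, not the implementation or the exact index definitions used in the study.

```python
# Plausible motion descriptors from a 2D head trajectory (illustrative, not the paper's code).
import numpy as np

def motion_descriptors(pos, fs=100.0):
    """pos: (N, 2) positions in metres sampled at fs Hz."""
    vel = np.gradient(pos, 1.0 / fs, axis=0)
    acc = np.gradient(vel, 1.0 / fs, axis=0)
    speed = np.linalg.norm(vel, axis=1)
    energy = float(np.mean(speed ** 2))                                     # kinetic-energy-like index
    smoothness = float(1.0 / (1.0 + np.mean(np.linalg.norm(acc, axis=1))))  # lower acceleration = smoother
    path_len = float(np.sum(np.linalg.norm(np.diff(pos, axis=0), axis=1)))
    directness = float(np.linalg.norm(pos[-1] - pos[0]) / max(path_len, 1e-9))
    return energy, smoothness, directness

# Hypothetical mapping: louder sound for more energetic movement.
pos = np.cumsum(0.01 * np.random.randn(500, 2), axis=0)   # fake trajectory for illustration
energy, smoothness, directness = motion_descriptors(pos)
gain_db = -30.0 + 30.0 * min(energy / 0.5, 1.0)           # clamp the energy index into a 30 dB gain range
```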
10.
  • Han, Xu, et al. (author)
  • Performance of piano trills: effects of hands, fingers, notes and emotions
  • 2019
  • In: Combined proceedings of the Nordic Sound and Music Computing Conference 2019 and the Interactive Sonification Workshop 2019. - Stockholm. - pp. 9-15
  • Conference paper (peer-reviewed), abstract:
    • A trill is a type of musical ornament. In automatic playback of piano music scores, trills are usually synthesised as a sequence of repeated notes with equal duration and dynamic level. This is not how trills are performed by pianists. In this study, trills were performed by three pianists on a Yamaha Disklavier and recorded as both audio and MIDI files. Note duration, inter-onset interval (IOI) and key velocity for each note were then extracted from the MIDI files and analyzed in relation to hands, notes and emotions. Four significant effects were found: 1) hand effect: trills in the right hand were on average performed at a faster rate, with shorter note duration, longer off duration and higher key velocity; 2) finger effect: within the two notes forming a trill, notes with the lower fingering number were performed with shorter off duration, while note duration and key velocity remained similar; 3) emotion effect: emotion mainly contributed to dynamic level; 4) crescendo effect: when a crescendo occurred, note duration and off duration compensated for each other, keeping the IOI at an almost constant value.
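The per-note features analyzed in the paper (onset, duration, inter-onset interval and key velocity) are easy to pull out of a MIDI file. The sketch below does this with the `mido` library; the file name is hypothetical and the paper's own extraction pipeline may differ.

```python
# Extract onset, duration, key velocity and IOIs from a MIDI file (illustrative sketch).
import mido

def trill_features(path):
    now = 0.0        # absolute time in seconds
    active = {}      # pitch -> (onset time, velocity) of the currently sounding note
    notes = []       # (onset, duration, velocity) per played note
    for msg in mido.MidiFile(path):
        now += msg.time                                  # delta time in seconds when iterating a MidiFile
        if msg.type == 'note_on' and msg.velocity > 0:
            active[msg.note] = (now, msg.velocity)
        elif msg.type in ('note_off', 'note_on') and msg.note in active:
            onset, vel = active.pop(msg.note)            # note_on with velocity 0 also ends a note
            notes.append((onset, now - onset, vel))
    notes.sort()
    iois = [b[0] - a[0] for a, b in zip(notes, notes[1:])]
    return notes, iois

notes, iois = trill_features('trill_performance.mid')    # hypothetical file name
print('mean IOI: %.3f s' % (sum(iois) / len(iois)))
print('mean key velocity: %.1f' % (sum(v for _, _, v in notes) / len(notes)))
```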