SwePub
Search the SwePub database


Result list for the search "WFRF:(Camurri Antonio)"

Search: WFRF:(Camurri Antonio)

  • Results 1-8 of 8
1.
  • Bienkiewicz, Marta M. N., et al. (author)
  • Bridging the gap between emotion and joint action
  • 2021
  • In: Neuroscience and Biobehavioral Reviews. Elsevier BV. ISSN 0149-7634, E-ISSN 1873-7528. Vol. 131, pp. 806-833
  • Journal article (peer-reviewed), abstract:
    • Our daily human life is filled with a myriad of joint action moments, be it children playing, adults working together (e.g., team sports), or strangers navigating through a crowd. Joint action brings individuals (and the embodiment of their emotions) together, in space and in time. Yet little is known about how individual emotions propagate through embodied presence in a group, and how joint action changes individual emotion. In fact, the multi-agent component is largely missing from neuroscience-based approaches to emotion, and, conversely, joint action research has not yet found a way to include emotion as one of the key parameters to model socio-motor interaction. In this review, we first identify the gap and then compile evidence from various branches of science showing strong entanglement between emotion and acting together. We propose an integrative approach to bridge the gap, highlight five research avenues to do so in behavioral neuroscience and digital sciences, and address some of the key challenges in the area faced by modern societies.
2.
3.
  • Camurri, Antonio, et al. (author)
  • User-centric context-aware mobile applications for embodied music listening
  • 2009
  • In: User Centric Media. Heidelberg: Springer Berlin. ISBN 9783642126307, pp. 21-30
  • Book chapter (peer-reviewed), abstract:
    • This paper surveys a collection of sample applications for networked user-centric context-aware embodied music listening. The applications have been designed and developed in the framework of the EU-ICT Project SAME (www.sameproject.eu) and were presented at the Agora Festival (IRCAM, Paris, France) in June 2009. All of them address in different ways the concept of embodied, active listening to music, i.e., enabling listeners to interactively operate in real time on the music content by means of their movements and gestures as captured by mobile devices. On the occasion of the Agora Festival, the applications were also evaluated by both expert and non-expert users.
4.
  • Castellano, Ginevra, et al. (author)
  • Expressive Control of Music and Visual Media by Full-Body Movement
  • 2007
  • In: Proceedings of the 7th International Conference on New Interfaces for Musical Expression, NIME '07. New York, NY, USA: ACM Press. pp. 390-391
  • Conference paper (peer-reviewed), abstract:
    • In this paper we describe a system which allows users to use their full body to control, in real time, the generation of expressive audio-visual feedback. The system extracts expressive motion features from the user’s full-body movements and gestures. The values of these motion features are mapped both onto acoustic parameters for the real-time expressive rendering of a piece of music, and onto real-time generated visual feedback projected on a screen in front of the user.
5.
  • Castellano, Ginevra, et al. (author)
  • User-Centered Control of Audio and Visual Expressive Feedback by Full-Body Movements
  • 2007
  • In: Affective Computing and Intelligent Interaction. Berlin/Heidelberg: Springer. ISBN 9783540748885, pp. 501-510
  • Book chapter (peer-reviewed), abstract:
    • In this paper we describe a system allowing users to express themselves through their full-body movement and gesture and to control, in real time, the generation of audio-visual feedback. The system analyses the user’s full-body movement and gesture in real time, extracts expressive motion features, and maps their values onto real-time control of acoustic parameters for rendering a music performance. At the same time, visual feedback generated in real time is projected on a screen in front of the users as a coloured silhouette, depending on the emotion their movement communicates. Human movement analysis and visual feedback generation were done with the EyesWeb software platform and the music performance rendering with pDM. Evaluation tests were conducted with human participants to test the usability of the interface and the effectiveness of the design.
6.
  • Olugbade, Temitayo, et al. (author)
  • Human Movement Datasets: An Interdisciplinary Scoping Review
  • 2023
  • In: ACM Computing Surveys. Association for Computing Machinery (ACM). ISSN 0360-0300, E-ISSN 1557-7341. Vol. 55, no. 6
  • Research review (peer-reviewed), abstract:
    • Movement dataset reviews exist but are limited in coverage, both in terms of size and research discipline. While topic-specific reviews clearly have their merit, it is critical to have a comprehensive overview based on a systematic survey across disciplines. This enables higher visibility of datasets available to the research communities and can foster interdisciplinary collaborations. We present a catalogue of 704 open datasets described by 10 variables that can be valuable to researchers searching for secondary data: name and reference, creation purpose, data type, annotations, source, population groups, ordinal size of people captured simultaneously, URL, motion capture sensor, and funders. The catalogue is available in the supplementary materials. We provide an analysis of the datasets and further review them under the themes of human diversity, ecological validity, and data recorded. The resulting 12-dimension framework can guide researchers in planning the creation of open movement datasets. This work has been the interdisciplinary effort of researchers across affective computing, clinical psychology, disability innovation, ethnomusicology, human-computer interaction, machine learning, music cognition, music computing, and movement neuroscience.
7.
  • Serra, Xavier, et al. (author)
  • Sound and music computing: Challenges and strategies
  • 2007
  • In: Journal of New Music Research. Informa UK Limited. ISSN 0929-8215, E-ISSN 1744-5027. Vol. 36, no. 3, pp. 185-190
  • Journal article (peer-reviewed), abstract:
    • Based on the current context of the Sound and Music Computing (SMC) field, the state of the art in research, and the open issues identified and described in other articles of this journal issue, in this article we take a step forward and try to identify the broad SMC challenges, and we propose strategies with which to tackle them. On the research side we identify a clear need for designing better sound objects and environments and for promoting research to understand, model, and improve human interaction with sound and music. In the education domain we see the need to better train our multidisciplinary researchers and to make sure that they can contribute to the multicultural society we live in. There is also a clear need for improving the transfer of the knowledge and technologies generated by our community. Finally, we claim that the SMC field should be very much concerned with its social context and that a number of current social concerns should be addressed. We accompany each of these challenges with strategies that should help researchers, educators, and policy makers take specific actions to advance along the proposed SMC roadmap.
8.
  • Varni, Giovanna, et al. (author)
  • Interactive sonification of synchronisation of motoric behaviour in social active listening to music with mobile devices
  • 2012
  • In: Journal on Multimodal User Interfaces. Springer Berlin/Heidelberg. ISSN 1783-7677, E-ISSN 1783-8738. Vol. 5, no. 3-4, pp. 157-173
  • Journal article (peer-reviewed), abstract:
    • This paper evaluates three different interactive sonifications of dyadic coordinated human rhythmic activity. An index of phase synchronisation of gestures was chosen as the coordination metric. The sonifications are implemented as three prototype applications exploiting mobile devices: Sync’n’Moog, Sync’n’Move, and Sync’n’Mood. Sync’n’Moog sonifies the phase synchronisation index by acting directly on the audio signal and applying a nonlinear time-varying filtering technique. Sync’n’Move acts on the multi-track music content by making individual instruments emerge or hide. Sync’n’Mood manipulates the affective features of the music performance. The three sonifications were also tested against a condition without sonification.