SwePub
Search the SwePub database


Results list for search "WFRF:(Belpaeme Tony) srt2:(2015-2019)"


  • Result 1-6 of 6
1.
  • Bartlett, Madeleine, et al. (author)
  • What Can You See? : Identifying Cues on Internal States From the Movements of Natural Social Interactions
  • 2019
  • In: Frontiers in Robotics and AI. - : Frontiers Research Foundation. - 2296-9144. ; 6:49
  • Journal article (peer-reviewed)
    • In recent years, the field of Human-Robot Interaction (HRI) has seen an increasing demand for technologies that can recognize and adapt to human behaviors and internal states (e.g., emotions and intentions). Psychological research suggests that human movements are important for inferring internal states. There is, however, a need to better understand what kind of information can be extracted from movement data, particularly in unconstrained, natural interactions. The present study examines which internal states and social constructs humans identify from movement in naturalistic social interactions. Participants either viewed clips of the full scene or processed versions of it displaying 2D positional data. Then, they were asked to fill out questionnaires assessing their social perception of the viewed material. We analyzed whether the full scene clips were more informative than the 2D positional data clips. First, we calculated the inter-rater agreement between participants in both conditions. Then, we employed machine learning classifiers to predict the internal states of the individuals in the videos based on the ratings obtained. Although we found a higher inter-rater agreement for full scenes compared to positional data, the level of agreement in the latter case was still above chance, thus demonstrating that the internal states and social constructs under study were identifiable in both conditions. A factor analysis run on participants' responses showed that participants identified the constructs interaction imbalance, interaction valence and engagement regardless of video condition. The machine learning classifiers achieved a similar performance in both conditions, again supporting the idea that movement alone carries relevant information. Overall, our results suggest it is reasonable to expect a machine learning algorithm, and consequently a robot, to successfully decode and classify a range of internal states and social constructs using low-dimensional data (such as the movements and poses of observed individuals) as input.
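The abstract above reports computing inter-rater agreement between participants' ratings. As an illustration only (not the authors' code), one standard statistic for multi-rater categorical ratings is Fleiss' kappa; the sketch below assumes the ratings have already been tallied into an items × categories count matrix:

```python
# Illustrative sketch only (not the study's implementation).
# Fleiss' kappa measures agreement among a fixed number of raters
# assigning categorical ratings to a set of items.
# `ratings` is an items x categories matrix of counts: cell [i][j]
# holds the number of raters who assigned item i to category j.

def fleiss_kappa(ratings):
    n_items = len(ratings)
    n_raters = sum(ratings[0])  # raters per item (assumed constant)
    n_cats = len(ratings[0])
    # Proportion of all assignments falling into each category
    totals = [sum(row[j] for row in ratings) for j in range(n_cats)]
    p_j = [t / (n_items * n_raters) for t in totals]
    # Observed agreement for each item
    P_i = [
        (sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
        for row in ratings
    ]
    P_bar = sum(P_i) / n_items     # mean observed agreement
    P_e = sum(p * p for p in p_j)  # agreement expected by chance
    return (P_bar - P_e) / (1 - P_e)
```

Kappa near 1 indicates strong agreement and values near 0 indicate agreement at chance level, which matches the paper's "above chance" framing for the positional-data condition.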
2.
  • Cai, Haibin, et al. (author)
  • Sensing-enhanced Therapy System for Assessing Children with Autism Spectrum Disorders : A Feasibility Study
  • 2019
  • In: IEEE Sensors Journal. - : Institute of Electrical and Electronics Engineers (IEEE). - 1530-437X .- 1558-1748. ; 19:4, s. 1508-1518
  • Journal article (peer-reviewed)
    • It is evident that recently reported robot-assisted therapy systems for assessment of children with autism spectrum disorder (ASD) lack autonomous interaction abilities and require significant human resources. This paper proposes a sensing system that automatically extracts and fuses sensory features such as body motion features, facial expressions, and gaze features, further assessing the children's behaviours by mapping them to therapist-specified behavioural classes. Experimental results show that the developed system is capable of interpreting characteristic data of children with ASD, and thus has the potential to increase the autonomy of robots under the supervision of a therapist and enhance the quality of the digital description of children with ASD. The research outcomes pave the way to a feasible machine-assisted system for their behaviour assessment.
3.
4.
  • Esteban, Pablo G., et al. (author)
  • How to Build a Supervised Autonomous System for Robot-Enhanced Therapy for Children with Autism Spectrum Disorder
  • 2017
  • In: Paladyn - Journal of Behavioral Robotics. - : De Gruyter Open. - 2080-9778 .- 2081-4836. ; 8:1, s. 18-38
  • Journal article (peer-reviewed)
    • Robot-Assisted Therapy (RAT) has successfully been used to improve social skills in children with autism spectrum disorders (ASD) through remote control of the robot in so-called Wizard of Oz (WoZ) paradigms. However, there is a need to increase the autonomy of the robot both to lighten the burden on human therapists (who have to remain in control and, importantly, supervise the robot) and to provide a consistent therapeutic experience. This paper seeks to provide insight into increasing the autonomy level of social robots in therapy to move beyond WoZ. With the final aim of improved human-human social interaction for the children, this multidisciplinary research seeks to facilitate the use of social robots as tools in clinical situations by addressing the challenge of increasing robot autonomy. We introduce the clinical framework in which the developments are tested, alongside initial data obtained from patients in a first phase of the project using a WoZ set-up mimicking the targeted supervised-autonomy behaviour. We further describe the implemented system architecture capable of providing the robot with supervised autonomy.
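The supervised-autonomy pattern this abstract describes, where the robot selects actions autonomously but a therapist can veto or replace them before execution, can be reduced to a small control-loop sketch. All function names here are hypothetical illustrations, not the DREAM system's actual interfaces:

```python
# Hypothetical sketch of one supervised-autonomy control step.
# `propose` is the robot's autonomous action-selection policy,
# `supervise` lets the therapist return a replacement action
# (or None to approve), and `execute` performs the chosen action.

def supervised_step(propose, supervise, execute):
    action = propose()            # robot's autonomous suggestion
    override = supervise(action)  # therapist approves or overrides
    final = override if override is not None else action
    execute(final)
    return final
```

The point of the pattern is that the therapist stays in the loop at every step while the robot, rather than a Wizard-of-Oz operator, generates the default behaviour.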
5.
  • Richardson, Kathleen, et al. (author)
  • Robot Enhanced Therapy for Children with Autism (DREAM) : A Social Model of Autism
  • 2018
  • In: IEEE technology & society magazine. - : IEEE. - 0278-0097 .- 1937-416X. ; 37:1, s. 30-39
  • Journal article (peer-reviewed)
    • The development of social robots for children with autism has been a growth field for the past 15 years. This article reviews studies in robots and autism as a neurodevelopmental disorder that impacts social communication development, and the ways social robots could help children with autism develop social skills. Drawing on ethics research from the EU-funded Development of Robot-Enhanced Therapy for Children with Autism (DREAM) project (Framework 7), this paper explores how ethical considerations evolved and developed in this European project.
6.
  • Wolfert, Pieter, et al. (author)
  • Should Beat Gestures Be Learned Or Designed? : A Benchmarking User Study
  • 2019
  • In: ICDL-EPIROB 2019. - : IEEE conference proceedings.
  • Conference paper (peer-reviewed)
    • In this paper, we present a user study on generated beat gestures for humanoid agents. It has been shown that Human-Robot Interaction can be improved by including communicative non-verbal behavior, such as arm gestures. Beat gestures are one of the four types of arm gestures, and are known to be used for emphasizing parts of speech. In our user study, we compare beat gestures learned from training data with hand-crafted beat gestures. The first kind of gestures are generated by a machine learning model trained on speech audio and human upper body poses. We compared this approach with three hand-coded beat gesture methods: designed beat gestures, timed beat gestures, and noisy gestures. Forty-one subjects participated in our user study, and a ranking was derived from paired comparisons using the Bradley-Terry-Luce model. We found that for beat gestures, the gestures from the machine learning model are preferred, followed by algorithmically generated gestures. This emphasizes the promise of machine learning for generating communicative actions.
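The ranking step mentioned above, deriving an order over gesture conditions from paired comparisons with a Bradley-Terry-Luce-style model, can be illustrated with a short sketch. This is not the study's code; it fits the basic Bradley-Terry model with the classic MM (minorization-maximization) iteration, where `wins[i][j]` counts how often condition i was preferred over condition j:

```python
# Illustrative sketch only: Bradley-Terry strengths via MM iteration.
# Higher strength = more often preferred in paired comparisons.

def bradley_terry(wins, iters=200):
    n = len(wins)
    p = [1.0] * n  # initial strength for each condition
    for _ in range(iters):
        new_p = []
        for i in range(n):
            w_i = sum(wins[i])  # total wins of condition i
            denom = sum(
                (wins[i][j] + wins[j][i]) / (p[i] + p[j])
                for j in range(n)
                if j != i
            )
            new_p.append(w_i / denom if denom else p[i])
        total = sum(new_p)  # normalize so strengths sum to 1
        p = [x / total for x in new_p]
    return p

# Ranking: sort condition indices by descending strength, e.g.
# ranking = sorted(range(len(p)), key=lambda i: -p[i])
```

The fitted strengths give each condition's probability of winning a comparison against the others, so sorting them yields the kind of preference ranking the study reports.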