SwePub
Search the SwePub database


Result list for the search "L773:2167 2148"

Search: L773:2167 2148

  • Result 1-10 of 10
1.
  • Baldursson, Birgir, et al. (author)
  • DroRun: Drone visual interactions to mediate a running group
  • 2021
  • In: ACM/IEEE International Conference on Human-Robot Interaction. - New York, NY, USA : ACM. - 2167-2148. ; , s. 148-152
  • Conference paper (peer-reviewed). Abstract:
    • Running with others is motivating, but finding running partners is not always feasible due to constraints such as being in a remote location. We present a novel concept of augmenting remote runners' experience by mediating a running group using drone-projected visualisations. As an initial step, a team of five interaction designers applied a user-centred design approach to scope out possible visual prototypes that can mediate a running group. The design team focused specifically on visual prototypes through a series of workshops. We report on the impressions of the visual prototypes from six potential users and present future directions for this project.
2.
  • Cabrio, Alessandro, et al. (author)
  • HighLight: Towards an Ambient Robotic Table as a Social Enabler
  • 2023
  • In: ACM/IEEE International Conference on Human-Robot Interaction. - New York, NY, USA : ACM. - 2167-2148. ; , s. 146-150
  • Conference paper (peer-reviewed). Abstract:
    • With smartphones becoming more commonplace in our daily lives, they often take up more time and space than we would like them to. Research shows that using smartphones during social interactions does more harm than good. With this in mind, we set out to create the first prototype of an ambient robotic table that supports social interactions and discourages digital distraction. Through a rapid prototyping process, we present HighLight, a prototype of a socially enabling robotic table that has a smartphone compartment in its center and ambient features reacting in real-time to conversations taking place around the table. We contribute to the research community by investigating the design of an ambient robotic table as a social enabler that encourages social interaction through ambiance, thus exploring future directions for non-disruptive technologies that support social interaction.
3.
  • Herben, Jonte, et al. (author)
  • An Expressive Robotic Table to Enhance Social Interactions
  • 2023
  • In: ACM/IEEE International Conference on Human-Robot Interaction. - New York, NY, USA : ACM. - 2167-2148. ; , s. 151-155
  • Conference paper (peer-reviewed). Abstract:
    • We take initial steps towards prototyping an expressive robotic table that can serve as a social mediator. The work is constructed through a rapid prototyping process consisting of five workshop-based phases with five interaction design participants. We report on the various prototyping techniques that led to the generated concept of an expressive robotic table. Our design process explores how expressive motion cues such as respiratory movements can be leveraged to mediate social interactions between people in cold outdoor environments. We conclude by discussing the implications of the different prototyping methods applied and the envisioned future directions of the work within the scope of expressive robotics.
4.
  • Jena, Ayesha, et al. (author)
  • Chaos to Control : Human Assisted Scene Inspection
  • 2023
  • In: HRI 2023 - Companion of the ACM/IEEE International Conference on Human-Robot Interaction. - New York, NY, USA : ACM. - 2167-2148. - 9781450399708 ; , s. 491-494
  • Conference paper (peer-reviewed). Abstract:
    • We are working towards a mixed reality-based human-robot collaboration interface that uses gaze and gesture to communicate intent in a search and rescue scenario and thereby optimize the operation. The lack of mature algorithms and control schemes for autonomous systems still makes it difficult for them to operate safely in high-risk environments. We approach the problem through symbiosis, combining humans' intuition about the environment with robots' ability to traverse unknown environments, to achieve optimal performance in a given time.
5.
  • Nielsen, Stig Anton, 1981, et al. (author)
  • Embodied computation in soft gripper
  • 2014
  • In: ACM/IEEE International Conference on Human-Robot Interaction. - New York, NY, USA : ACM. - 2167-2148. ; , s. 256-257
  • Conference paper (peer-reviewed). Abstract:
    • We designed, created and tested an underactuated soft gripper able to hold everyday objects of various shapes and sizes without using complex hardware or control algorithms, but rather by combining sheets of flexible plastic materials and a single servo motor. Starting with a prototype where simple actuation performs complex varied gripping operations solely through the material system and its inherent physical computation, the paper discusses how embodied computation might exist in a material aggregate by tuning and balancing its morphology and material properties.
6.
  • Phaijit, Ornnalin, et al. (author)
  • A Demonstration of the Taxonomy of Functional Augmented Reality for Human-Robot Interaction
  • 2022
  • In: ACM/IEEE International Conference on Human-Robot Interaction. - 2167-2148. ; 2022-March, s. 981-985
  • Conference paper (peer-reviewed). Abstract:
    • With the rising use of Augmented Reality (AR) technologies in Human-Robot Interaction (HRI), it is crucial that HRI research examines the role of AR in HRI to better define AR-HRI systems and identify potential areas for future research. A taxonomy for AR in HRI has recently been proposed for the field; however, it was limited to defining the framework and did not exemplify its use. In this paper, we demonstrate how the aforementioned taxonomy of AR in HRI can be used to analyse an existing AR-HRI system and to raise questions about alternative ways AR-HRI systems could be designed and further extended.
7.
  • Phaijit, Ornnalin, et al. (author)
  • A Taxonomy of Functional Augmented Reality for Human-Robot Interaction
  • 2022
  • In: ACM/IEEE International Conference on Human-Robot Interaction. - 2167-2148. ; 2022-March, s. 294-303
  • Conference paper (peer-reviewed). Abstract:
    • Augmented reality (AR) technologies are today more frequently being introduced to Human-Robot Interaction (HRI) to mediate the interaction between human and robot. Indeed, better technical support and improved framework integration allow the design and study of novel scenarios augmenting interaction with AR. While some literature reviews have been published, so far no classifications have been devised for the role of AR in HRI. AR constitutes a vast field of research in HCI, and as it is picking up in HRI, it is timely to articulate the current knowledge and information about the functionalities of AR in HRI. Here we propose a multidimensional taxonomy for AR in HRI that distinguishes the type of perception augmentation, the functional role of AR, and the augmentation artifact type. We place sample publications within the taxonomy to demonstrate its utility. Lastly, we derive from the taxonomy some research gaps in current AR-for-HRI research and provide suggestions for exploration beyond the current state-of-the-art.
8.
  • Reitmann, Stefan, et al. (author)
  • VR-based Assistance System for Semi-Autonomous Robotic Boats
  • 2024
  • In: HRI 2024 Companion - Companion of the 2024 ACM/IEEE International Conference on Human-Robot Interaction. - 2167-2148. - 9798400703232 ; , s. 877-881
  • Conference paper (peer-reviewed). Abstract:
    • In this paper we present the concept of a virtual reality-based teleoperation system for semi-autonomous robotic boats. The system can be used both for monitoring autonomous driving and for direct manual control. It supports the integration of live sensor data as well as past measurement results and their correct registration within the virtual representation. Initial field tests have shown that the technical concept is suitable for the envisaged application areas and have revealed specific challenges that are addressed in this paper.
9.
  • Samuelsson-Gamboa, Mafalda, 1988, et al. (author)
  • Ritual drones: Designing and studying critical flying companions
  • 2021
  • In: ACM/IEEE International Conference on Human-Robot Interaction. - New York, NY, USA : ACM. - 2167-2148. ; , s. 562-564
  • Conference paper (peer-reviewed). Abstract:
    • Through a critical design approach, I suggest new perspectives on social drones, particularly companion drones. Supported by philosophies such as slow technology, I propose the design of anti-solutionist ritual drones and the study of their impact on the lives of users, particularly in domestic contexts. I intend to fill some of the identified methodological gaps, such as the lack of longitudinal studies of drone user experience, through ethnography and auto-ethnography. I propose a "Research through Design" process of custom domestic probes for children and their families.
10.
  • Wozniak, Maciej K., et al. (author)
  • Virtual, Augmented, and Mixed Reality for Human-Robot Interaction (VAM-HRI)
  • 2024
  • In: HRI 2024 Companion - Companion of the 2024 ACM/IEEE International Conference on Human-Robot Interaction. - : Association for Computing Machinery (ACM). - 2167-2148. - 9798400703232 ; , s. 1361-1363
  • Conference paper (peer-reviewed). Abstract:
    • The 7th International Workshop on Virtual, Augmented, and Mixed Reality for Human-Robot Interaction (VAM-HRI) seeks to bring together researchers from human-robot interaction (HRI), robotics, and mixed reality (MR) to address the challenges related to mixed reality interactions between humans and robots. Key topics include the development of robots capable of interacting with humans in mixed reality, the use of virtual reality for creating interactive robots, designing augmented reality interfaces for communication between humans and robots, exploring mixed reality interfaces for enhancing robot learning, comparative analysis of the capabilities and perceptions of robots and virtual agents, and sharing best design practices. VAM-HRI 2024 will build on the success of VAM-HRI workshops held from 2018 to 2023, advancing research in this specialized community. This year's website is located at https://vamhri.github.io.