SwePub

id:"swepub:oai:DiVA.org:oru-78358"
 


Bi-directional navigation intent communication using spatial augmented reality and eye-tracking glasses for improved safety in human-robot interaction

Chadalavada, Ravi Teja, 1989- (author)
Örebro University, School of Science and Technology, Centre for Applied Autonomous Sensor Systems (AASS)
Andreasson, Henrik, 1977- (author)
Örebro University, School of Science and Technology, Centre for Applied Autonomous Sensor Systems (AASS)
Schindler, Maike (author)
Faculty of Human Sciences, University of Cologne, Germany
Palm, Rainer, 1942- (author)
Örebro University, School of Science and Technology, Centre for Applied Autonomous Sensor Systems (AASS)
Lilienthal, Achim J., 1970- (author)
Örebro University, School of Science and Technology, Centre for Applied Autonomous Sensor Systems (AASS)
Elsevier, 2020
English.
In: Robotics and Computer-Integrated Manufacturing. - Elsevier. - ISSN 0736-5845, E-ISSN 1879-2537 ; Vol. 61
  • Journal article (peer-reviewed)
Abstract
Safety, legibility and efficiency are essential for autonomous mobile robots that interact with humans. A key factor in this respect is bi-directional communication of navigation intent, which we focus on in this article with a particular view on industrial logistic applications. In the direction robot-to-human, we study how a robot can communicate its navigation intent using Spatial Augmented Reality (SAR) such that humans can intuitively understand the robot's intention and feel safe in the vicinity of robots. We conducted experiments with an autonomous forklift that projects various patterns on the shared floor space to convey its navigation intentions. We analyzed trajectories and eye gaze patterns of humans while interacting with an autonomous forklift and carried out stimulated recall interviews (SRI) in order to identify desirable features for projection of robot intentions. In the direction human-to-robot, we argue that robots in human co-habited environments need human-aware task and motion planning to support safety and efficiency, ideally responding to people's motion intentions as soon as they can be inferred from human cues. Eye gaze can convey information about intentions beyond what can be inferred from the trajectory and head pose of a person. Hence, we propose eye-tracking glasses as safety equipment in industrial environments shared by humans and robots. In this work, we investigate the possibility of human-to-robot implicit intention transference solely from eye gaze data and evaluate how the observed eye gaze patterns of the participants relate to their navigation decisions. We again analyzed trajectories and eye gaze patterns of humans while interacting with an autonomous forklift for clues that could reveal direction intent. Our analysis shows that people primarily gazed at the side of the robot that they ultimately decided to pass by. We discuss implications of these results and relate them to a control approach that uses human gaze for early obstacle avoidance.
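The human-to-robot finding above — that gaze concentrates on the side a person later passes by — could be turned into a simple majority-vote intent classifier. The sketch below is purely illustrative: the function name, the robot-frame coordinate convention, and the sample threshold are assumptions, not the authors' method or implementation.

```python
# Hypothetical sketch: inferring a person's intended passing side from
# horizontal eye-gaze samples expressed in the robot's frame (negative x =
# left of the robot's center, positive x = right). All names and thresholds
# are illustrative assumptions, not taken from the paper.

def infer_passing_side(gaze_x, robot_center_x=0.0, min_samples=5):
    """Return 'left', 'right', or None (undecided) based on which side of
    the robot's centerline the majority of gaze samples fall on."""
    if len(gaze_x) < min_samples:
        return None  # too little data to infer an intent
    left = sum(1 for x in gaze_x if x < robot_center_x)
    right = len(gaze_x) - left
    if left == right:
        return None  # ambiguous gaze distribution
    return "left" if left > right else "right"
```

A real system would of course add temporal filtering and combine gaze with trajectory and head-pose cues, as the article's discussion of human-aware planning suggests.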

Subject terms

NATURAL SCIENCES -- Computer and Information Sciences -- Computer Vision and Robotics (hsv//eng)

Keywords

Human-robot interaction (HRI)
Mobile robots
Intention communication
Eye-tracking
Intention recognition
Spatial augmented reality
Stimulated recall interview
Obstacle avoidance
Safety
Logistics

Publication and content type

ref (peer-reviewed)
art (journal article)
