SwePub
  • Liu, Yixing, KTH, Farkostteknik och Solidmekanik (author)

A method of detecting human movement intentions in real environments

  • Article/chapter, English, 2023

Publisher, publication year, extent ...

  • Institute of Electrical and Electronics Engineers (IEEE), 2023
  • print (rdacarrier)

Numbers

  • LIBRIS-ID: oai:DiVA.org:kth-341996
  • URI: https://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-341996
  • DOI: https://doi.org/10.1109/ICORR58425.2023.10304774

Supplementary language notes

  • Language: English
  • Summary in: English

Part of subdatabase

Classification

  • Subject category: ref (swepub-contenttype, peer-reviewed)
  • Subject category: kon (swepub-publicationtype, conference paper)

Notes

  • Part of proceedings ISBN: 979-8-3503-4275-8
  • QC 20240109
  • Abstract: Accurate and timely movement intention detection can facilitate exoskeleton control during transitions between different locomotion modes. Detecting movement intentions in real environments remains a challenge due to unavoidable environmental uncertainties. False movement intention detection may also expose exoskeleton users to risks of falling and other hazards. To this end, in this study, we developed a method for detecting human movement intentions in real environments. The proposed method is capable of online self-correction through a decision fusion layer. Gaze data from an eye tracker and inertial measurement unit (IMU) signals were fused at the feature extraction level and used to predict movement intentions using two different methods. Images from the scene camera embedded on the eye tracker were used to identify terrains using a convolutional neural network. The decision fusion was made based on the predicted movement intentions and identified terrains. Four able-bodied participants wearing the eye tracker and seven IMU sensors took part in the experiments to complete the tasks of level ground walking, ramp ascending, ramp descending, stair ascending, and stair descending. The recorded experimental data were used to test the feasibility of the proposed method. An overall accuracy of 93.4% was achieved when both feature fusion and decision fusion were used. Fusing gaze data with IMU signals improved the prediction accuracy.
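
The abstract describes a pipeline of feature-level fusion (gaze + IMU), CNN-based terrain recognition from the scene camera, and a decision-fusion layer that self-corrects intention predictions. The Python sketch below illustrates only the decision-fusion idea, under assumptions of our own: the mode labels, the terrain-compatibility table, and all function names are illustrative placeholders rather than the paper's implementation, and the intention classifier and terrain CNN are stubbed out.

    import numpy as np

    MODES = ["level_walk", "ramp_ascend", "ramp_descend", "stair_ascend", "stair_descend"]

    # Assumed mapping from each movement intention to the terrains it is consistent with.
    COMPATIBLE_TERRAIN = {
        "level_walk":    {"level_ground"},
        "ramp_ascend":   {"ramp"},
        "ramp_descend":  {"ramp"},
        "stair_ascend":  {"stairs"},
        "stair_descend": {"stairs"},
    }

    def fuse_features(gaze_feats, imu_feats):
        """Feature-level fusion: concatenate gaze and IMU feature vectors."""
        return np.concatenate([gaze_feats, imu_feats])

    def predict_intention(fused_feats):
        """Stand-in for the trained intention classifier; returns a label and
        per-mode scores. In the study this model is trained on the fused features."""
        rng = np.random.default_rng(0)
        scores = rng.random(len(MODES))
        scores /= scores.sum()
        return MODES[int(scores.argmax())], scores

    def identify_terrain(scene_image):
        """Stand-in for the CNN that classifies terrain from the scene camera."""
        return "stairs"

    def decision_fusion(intention, scores, terrain):
        """Decision-fusion layer: if the predicted intention disagrees with the
        identified terrain, fall back to the most probable compatible intention."""
        if terrain in COMPATIBLE_TERRAIN[intention]:
            return intention
        compatible = [m for m in MODES if terrain in COMPATIBLE_TERRAIN[m]]
        if not compatible:
            return intention  # no correction possible for an unknown terrain
        return max(compatible, key=lambda m: scores[MODES.index(m)])

    # Toy usage: gaze features (8-dim) and IMU features (7 sensors x 3 features).
    fused = fuse_features(np.zeros(8), np.zeros(21))
    intention, scores = predict_intention(fused)
    terrain = identify_terrain(scene_image=None)
    print(intention, "->", decision_fusion(intention, scores, terrain))

The point of the fusion rule in this sketch is that a terrain label from the scene camera can veto an intention prediction that is physically inconsistent with it, which is how the abstract characterizes the method's online self-correction of false detections.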

Subject headings and genre

Added entries (persons, corporate bodies, meetings, titles ...)

  • Wan, Zhao-Yuan, KTH, Farkostteknik och Solidmekanik (Swepub:kth) u1sqx25i (author)
  • Wang, Ruoli, KTH, Farkostteknik och Solidmekanik (Swepub:kth) u1nan6x1 (author)
  • Gutierrez-Farewik, Elena, 1973-, KTH, Farkostteknik och Solidmekanik (Swepub:kth) u1tekbf6 (author)
  • KTH, Farkostteknik och Solidmekanik (creator_code: org_t)

Related titles

  • In: 2023 International Conference on Rehabilitation Robotics (ICORR). Institute of Electrical and Electronics Engineers (IEEE)

Internet link

To the university's database


Find more in SwePub

By the author/editor
  • Liu, Yixing
  • Wan, Zhao-Yuan
  • Wang, Ruoli
  • Gutierrez-Farewik, Elena
About the subject
  • NATURAL SCIENCES
  • NATURAL SCIENCES and Computer and Inf ... and Computer Vision ...
Articles in the publication
By the university
  • Royal Institute of Technology

