SwePub
Search the SwePub database


Result list for search "WFRF:(Liu Sichao) srt2:(2024)"


  • Results 1-3 of 3
1.
  • Liu, Sichao, et al. (author)
  • Cognitive neuroscience and robotics : Advancements and future research directions
  • 2024
  • In: Robotics and Computer-Integrated Manufacturing. - : Elsevier BV. - 0736-5845 .- 1879-2537. ; 85
  • Research review (peer-reviewed), abstract:
    • In recent years, brain-based technologies that capitalise on human abilities to facilitate human–system/robot interactions have been actively explored, especially in brain robotics. Brain–computer interfaces, as applications of this concept, have set a path to convert neural activities recorded by sensors from the human scalp via electroencephalography into valid commands for robot control and task execution. Thanks to the advancement of sensor technologies, non-invasive and invasive sensor headsets have been designed and developed to achieve stable recording of brainwave signals. However, robust and accurate extraction and interpretation of brain signals in brain robotics are critical to reliable task-oriented and opportunistic applications such as brainwave-controlled robotic interactions. In response to this need, pervasive technologies and advanced analytical approaches to translating and merging critical brain functions, behaviours, tasks, and environmental information have been a focus in brain-controlled robotic applications. These methods are composed of signal processing, feature extraction, representation of neural activities, command conversion and robot control. Artificial intelligence algorithms, especially deep learning, are used for the classification, recognition, and identification of patterns and intent underlying brainwaves as a form of electroencephalography. Within this context, this paper provides a comprehensive review of past and current work at the intersection of robotics, neuroscience, and artificial intelligence and highlights future research directions.
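The review above describes brain–computer interface pipelines as signal processing, feature extraction, and command conversion. As a minimal illustrative sketch (not the authors' method), the classic band-power route from an EEG epoch to a toy robot command could look like this; all function names, band limits, and the threshold rule are hypothetical:

```python
import numpy as np

def bandpass_power(signal, fs, low, high):
    """Average spectral power of `signal` within [low, high] Hz via FFT (hypothetical helper)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= low) & (freqs <= high)
    return spectrum[mask].mean()

def extract_features(epoch, fs=250):
    """Band-power features for the classic EEG bands: delta, theta, alpha, beta."""
    bands = [(1, 4), (4, 8), (8, 13), (13, 30)]
    return np.array([bandpass_power(epoch, fs, lo, hi) for lo, hi in bands])

def classify(features, threshold):
    """Toy command conversion: alpha-band power above threshold -> 'stop', else 'go'."""
    return "stop" if features[2] > threshold else "go"
```

A real system would replace the threshold rule with a trained classifier (the review points to deep learning), but the pipeline shape, raw epoch to features to command, is the same.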
2.
  • Yi, Shuming, et al. (author)
  • Safety-aware human-centric collaborative assembly
  • 2024
  • In: Advanced Engineering Informatics. - : Elsevier BV. - 1474-0346 .- 1873-5320. ; 60
  • Journal article (peer-reviewed), abstract:
    • Manufacturing systems envisioned for factories of the future will promote human-centricity for close collaboration in a shared working environment towards better overall productivity within the context of Industry 5.0. Robust and accurate recognition and prediction of human intentions are crucial to reliable and safe collaborative operations between humans and robots. For this purpose, this paper proposes a safety-aware human-centric collaborative assembly approach driven by function blocks, human action recognition for intention detection, and collision avoidance for safe robot control. Within this context, a deep learning-based recognition system is developed for high-accuracy human intention recognition and prediction, and an assembly feature-based approach driven by function blocks is presented for assembly execution and control. Thus, assembly features and human behaviours during assembly are formulated to support safe assembly actions. Skeleton-based human behaviours are defined as control inputs to an adaptive safety-aware scheme. The scheme includes collaborative and parallel mode-based pre-warning and obstacle avoidance approaches for a human-centric collaborative assembly system. The former monitors and regulates robot control modes when working in parallel with humans, and the latter uses a position-based approach to control robot actions by adaptively adjusting obstacle avoidance trajectories in a dynamic collaborative environment. The findings of this paper reveal the effectiveness of the developed system, as experimentally validated through an engine-assembly case study.
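The abstract above mentions a pre-warning scheme that regulates robot behaviour based on the human's position. A minimal sketch of one common form of such a rule, distance-based speed scaling, is shown below; the function name, distance thresholds, and the linear slowdown are all assumptions for illustration, not the paper's actual scheme:

```python
import math

def robot_speed_scale(human_pos, robot_pos, warn_dist=1.2, stop_dist=0.5):
    """Hypothetical pre-warning rule: scale robot speed by human-robot distance.

    Full speed beyond warn_dist, a linear slowdown inside the warning zone,
    and a full stop at or below stop_dist (all distances in metres).
    """
    d = math.dist(human_pos, robot_pos)
    if d >= warn_dist:
        return 1.0
    if d <= stop_dist:
        return 0.0
    return (d - stop_dist) / (warn_dist - stop_dist)
```

In practice the human position would come from the skeleton-tracking stage, and the returned scale factor would multiply the robot's commanded velocity each control cycle.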
3.
  • Zhang, Yaqian, et al. (author)
  • Skeleton-RGB integrated highly similar human action prediction in human–robot collaborative assembly
  • 2024
  • In: Robotics and Computer-Integrated Manufacturing. - : Elsevier BV. - 0736-5845 .- 1879-2537. ; 86
  • Journal article (peer-reviewed), abstract:
    • Human–robot collaborative assembly (HRCA) combines the flexibility and adaptability of humans with the efficiency and reliability of robots during collaborative assembly operations, which facilitates complex product assembly in the mass personalisation paradigm. The cognitive ability of robots to recognise and predict human actions and respond accordingly is essential but currently still limited, especially when facing highly similar human actions. To improve the cognitive ability of robots in HRCA, firstly, a two-stage skeleton-RGB integrated model focusing on human-parts interaction is proposed to recognise highly similar human actions. Specifically, it consists of a feature guidance module and a feature fusion module, which can balance the accuracy and efficiency of human action recognition. Secondly, an online prediction approach is developed to predict human actions in advance, which includes a pre-trained skeleton-RGB integrated model and a preprocessing module. Thirdly, considering the positioning accuracy of the parts to be assembled and the continuous update of human actions, a dynamic response scheme of the robot is designed. Finally, the feasibility and effectiveness of the proposed model and approach are verified by a case study of a worm-gear decelerator assembly. The experimental results demonstrate that the proposed model achieves precise human action recognition with a high accuracy of 93.75% and a lower computational cost. Specifically, only 15 frames from a skeleton stream and 5 frames (less than 16 frames in general) from an RGB video stream are adopted. Moreover, it only takes 1.026 s to achieve online human action prediction based on the proposed prediction method. The dynamic response scheme of the robot is also proven to be feasible. It is expected that the efficiency of human–robot interaction in HRCA can be improved from a closed-loop view of perception, prediction, and response.
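The abstract above integrates a skeleton stream and an RGB stream into one action prediction. One generic way such two-stream models combine their outputs is late fusion of per-stream class probabilities; the sketch below illustrates that idea only, with a hypothetical weighting parameter `alpha`, and is not the paper's feature-level fusion module:

```python
import numpy as np

def softmax(logits):
    """Convert raw scores to a probability distribution (numerically stable)."""
    e = np.exp(logits - logits.max())
    return e / e.sum()

def fuse_predictions(skeleton_logits, rgb_logits, alpha=0.6):
    """Hypothetical late fusion: weighted average of per-stream class probabilities.

    `alpha` weights the skeleton stream; returns (predicted class index, fused probs).
    """
    probs = alpha * softmax(skeleton_logits) + (1 - alpha) * softmax(rgb_logits)
    return int(np.argmax(probs)), probs
```

With the paper's frame budget (15 skeleton frames, 5 RGB frames), each stream would produce its logits from its own partial observation before a step like this merges them into one early prediction.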

