SwePub

Search results for "WFRF:(Feix Thomas)"


  • Results 1-4 of 4
1.
  • Feix, Thomas, et al. (authors)
  • A Metric for Comparing the Anthropomorphic Motion Capability of Artificial Hands
  • 2013
  • In: IEEE Transactions on Robotics. - 1552-3098 .- 1941-0468. ; 29:1, pp. 82-93
  • Journal article (peer-reviewed), abstract:
    • We propose a metric for comparing the anthropomorphic motion capability of robotic and prosthetic hands. The metric is based on the evaluation of how many different postures or configurations a hand can perform by studying the reachable set of fingertip poses. To define a benchmark for comparison, we first generate data with human subjects based on an extensive grasp taxonomy. We then develop a methodology for comparison using generative, nonlinear dimensionality reduction techniques. We assess the performance of different hands with respect to the human hand and with respect to each other. The method can be used to compare other types of kinematic structures.
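The metric described in the abstract above is based on the reachable set of fingertip poses. As a loose illustration of that idea (not the paper's actual metric, which uses full 6-D poses and human grasp data), the sketch below samples joint angles of a hypothetical planar two-link finger and counts the distinct workspace cells its fingertip can reach; a hand with wider joint ranges covers more cells.

```python
import numpy as np

def reachable_cell_count(l1=1.0, l2=0.8, joint_range=(0.0, np.pi / 2),
                         samples=150, cell=0.05):
    """Sample joint angles of a planar 2-link finger, discretize the
    resulting fingertip positions, and count distinct occupied cells."""
    q1 = np.linspace(joint_range[0], joint_range[1], samples)
    q2 = np.linspace(joint_range[0], joint_range[1], samples)
    Q1, Q2 = np.meshgrid(q1, q2)
    # Forward kinematics of the 2-link chain.
    x = l1 * np.cos(Q1) + l2 * np.cos(Q1 + Q2)
    y = l1 * np.sin(Q1) + l2 * np.sin(Q1 + Q2)
    # Discretize into grid cells; the count is a crude capability score.
    cells = {(int(np.floor(xi / cell)), int(np.floor(yi / cell)))
             for xi, yi in zip(x.ravel(), y.ravel())}
    return len(cells)
```

All link lengths, joint ranges, and the grid resolution here are arbitrary assumptions chosen only to make the idea concrete.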
2.
  • Feix, Thomas, et al. (authors)
  • The GRASP Taxonomy of Human Grasp Types
  • 2016
  • In: IEEE Transactions on Human-Machine Systems. - IEEE. - 2168-2291 .- 2168-2305. ; 46:1, pp. 66-77
  • Journal article (peer-reviewed), abstract:
    • In this paper, we analyze and compare existing human grasp taxonomies and synthesize them into a single new taxonomy (dubbed "The GRASP Taxonomy" after the GRASP project funded by the European Commission). We consider only static and stable grasps performed by one hand. The goal is to extract the largest set of different grasps that were referenced in the literature and arrange them in a systematic way. The taxonomy provides a common terminology to define human hand configurations and is important in many domains such as human-computer interaction and tangible user interfaces, where an understanding of the human is the basis for a proper interface. Overall, 33 different grasp types are found and arranged into the GRASP taxonomy. Within the taxonomy, grasps are arranged according to 1) opposition type, 2) the virtual finger assignments, 3) type in terms of power, precision, or intermediate grasp, and 4) the position of the thumb. The resulting taxonomy incorporates all grasps found in the reviewed taxonomies that complied with the grasp definition. We also show that due to the nature of the classification, the 33 grasp types might be reduced to a set of 17 more general grasps if only the hand configuration is considered without the object shape/size.
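The abstract above names four classification dimensions: opposition type, virtual finger assignment, power/precision/intermediate type, and thumb position. A minimal sketch of how a record per grasp type might be structured along those dimensions (the three example entries and their virtual-finger counts are illustrative, not the taxonomy's full 33-type table):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class GraspType:
    name: str
    opposition: str       # e.g. "palm", "pad", or "side"
    virtual_fingers: int  # fingers acting as one functional unit
    power_class: str      # "power", "precision", or "intermediate"
    thumb: str            # "abducted" or "adducted"

# Illustrative entries only; attribute values are assumptions for the sketch.
GRASPS = [
    GraspType("large diameter", "palm", 5, "power", "abducted"),
    GraspType("precision disk", "pad", 5, "precision", "abducted"),
    GraspType("lateral", "side", 2, "intermediate", "adducted"),
]

def by_power_class(grasps, cls):
    """Return the names of all grasps in the given power class."""
    return [g.name for g in grasps if g.power_class == cls]
```

Grouping by any one dimension this way is what makes the reduction from 33 grasp types to 17 hand configurations possible: grasps that differ only in object shape/size collapse into one record.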
3.
  • Romero, Javier, et al. (authors)
  • Extracting Postural Synergies for Robotic Grasping
  • 2013
  • In: IEEE Transactions on Robotics. - 1552-3098 .- 1941-0468. ; 29:6, pp. 1342-1352
  • Journal article (peer-reviewed), abstract:
    • We address the problem of representing and encoding human hand motion data using nonlinear dimensionality reduction methods. We build our work on the notion of postural synergies, which are typically based on a linear embedding of the data. In addition to addressing the encoding of postural synergies using nonlinear methods, we relate our work to control strategies of combined reaching and grasping movements. We show the drawbacks of the (commonly made) causality assumption and propose methods that model the data as being generated from an inferred latent manifold to cope with the problem. Another important contribution is a thorough analysis of the parameters used in the employed dimensionality reduction techniques. Finally, we provide an experimental evaluation that shows how the proposed methods outperform the standard techniques, both in terms of recognition and generation of motion patterns.
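The linear embedding that the abstract above takes as its starting point is classically computed with PCA: postural synergies are the principal components of a joint-angle data matrix. A minimal sketch, using synthetic data in place of recorded hand postures:

```python
import numpy as np

def extract_synergies(postures, n_synergies=2):
    """postures: (n_samples, n_joints) joint-angle matrix.
    Returns (mean, components) so that postures ~ mean + scores @ components."""
    mean = postures.mean(axis=0)
    centered = postures - mean
    # SVD of the centered data gives principal axes ordered by variance.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:n_synergies]

# Synthetic stand-in for motion-capture data: 20 "joints" driven
# mostly by 2 latent synergies plus a little noise.
rng = np.random.default_rng(0)
latent = rng.normal(size=(100, 2))
mixing = rng.normal(size=(2, 20))
data = latent @ mixing + 0.01 * rng.normal(size=(100, 20))
mean, comps = extract_synergies(data, n_synergies=2)
```

The paper's contribution is precisely to move beyond this linear model to nonlinear, generative embeddings; the sketch only shows the baseline it improves on.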
4.
  • Romero, Javier, et al. (authors)
  • Spatio-Temporal Modeling of Grasping Actions
  • 2010
  • In: IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems (IROS 2010). - ISBN 9781424466757 ; pp. 2103-2108
  • Conference paper (peer-reviewed), abstract:
    • Understanding the spatial dimensionality and temporal context of human hand actions can provide representations for programming grasping actions in robots and inspire the design of new robotic and prosthetic hands. The natural representation of human hand motion has high dimensionality. For specific activities such as handling and grasping of objects, the commonly observed hand motions lie on a lower-dimensional nonlinear manifold in hand posture space. Although full-body human motion is well studied within computer vision and biomechanics, there is very little work on the analysis of hand motion with nonlinear dimensionality reduction techniques. In this paper, we use Gaussian Process Latent Variable Models (GPLVMs) to model the lower-dimensional manifold of human hand motions during object grasping. We show how the technique can be used to embed high-dimensional grasping actions in a lower-dimensional space suitable for modeling, recognition and mapping.
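The paper above uses GPLVMs, which require an optimization library to fit. As a much simpler stand-in that still illustrates nonlinear embedding of high-dimensional posture data (it is not the paper's method), here is RBF-kernel PCA in plain NumPy:

```python
import numpy as np

def kernel_pca(X, n_components=2, gamma=1.0):
    """RBF-kernel PCA: a simple nonlinear embedding of the rows of X."""
    sq = np.sum(X ** 2, axis=1)
    # Pairwise RBF kernel matrix.
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))
    n = K.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    Kc = J @ K @ J                        # center in feature space
    vals, vecs = np.linalg.eigh(Kc)       # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:n_components]
    # Scale eigenvectors so coordinates reflect component magnitudes.
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))
```

Unlike a GPLVM, kernel PCA is not generative and cannot synthesize new postures from the latent space, which is exactly why the authors prefer latent variable models for modeling and mapping.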