SwePub
Record ID: swepub:oai:DiVA.org:kth-82188

Visual Servoing for Manipulation: Robustness and Integration Issues

Kragic, Danica (author)
KTH, Numerisk analys och datalogi (NADA)
2001
English.
In: IEEE Transactions on Robotics and Automation. ISSN 1042-296X, pp. 230-
  • Journal article (peer-reviewed)
Abstract
Service robots are gradually being extended to operation in everyday environments. To be truly useful, a mobile robot should include facilities for interaction with the environment, in particular methods for manipulation of objects. One of the most flexible sensory modalities to enable this is computational vision. In this thesis, the issue of visual servoing and grasping to facilitate such interaction is investigated. A notorious problem for the use of vision in natural environments is robustness with respect to variations in the environment. It is also well known that no single technique is suitable for all the different tasks a robot is supposed to perform. Robustness is here investigated using several different approaches. The issues of variability are formulated with respect to visual features, the number of cameras used, and task constraints. It is argued that integration of methods facilitates the construction of more robust visual servoing systems for realistic tasks. Traditionally, fusion of visual information has been based on explicit models for uncertainty and integration. The dominant technique has been the use of Bayesian statistics, where strong models are employed. Where a large number of visual features are available, it is suggested that it might be possible to perform tasks such as visual tracking using weak models for integration. In particular, integration using voting-based methods is analyzed. If the object to be manipulated is known or has been recognized, it is possible to use explicit geometric models to facilitate the estimation of its pose. Consequently, a methodology for tracking objects using wire-frame models has been developed and evaluated in the context of grasping. Visual servoing can be carried out in the image domain and/or using 3D information. In this context, the trade-off between explicit models and the use of multiple cameras strongly influences the performance of a visual servoing system.
The relation between visual features, the number of cameras, and their placement has been studied to provide guidelines for the design of such a system. The integration of a multi-ocular vision system, suitable visual techniques, and task constraints facilitates flexible manipulation of everyday objects. To demonstrate this, the developed techniques have been evaluated in the context of manipulation for opening and closing doors in an everyday setting. In addition, it is demonstrated how the techniques, together with model-based information, may be used for grasping and grasp monitoring in the context of a well-known set of objects. In summary, a toolkit for interaction with everyday objects has been investigated and evaluated for real-world tasks. The developed methods provide a rich basis for real-world manipulation of objects in everyday settings.
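The weak, voting-based integration of visual cues mentioned in the abstract can be sketched as follows. This is a minimal illustration under assumed conventions, not the thesis's implementation: the cue names, weights, and position estimates are hypothetical, and each cue is assumed to cast a single vote for a candidate image position.

```python
# Sketch of voting-based cue integration for visual tracking.
# Unlike Bayesian fusion, no explicit per-cue uncertainty model is
# needed: each cue simply casts a (possibly weighted) vote, and the
# candidate with the highest accumulated weight wins.

def fuse_cues_by_voting(cue_estimates, weights=None):
    """Combine per-cue image-position estimates (x, y) by weighted voting.

    cue_estimates: dict mapping cue name -> candidate position tuple
    weights: optional dict mapping cue name -> vote weight (default 1.0)
    Returns the position with the largest accumulated vote weight.
    """
    if weights is None:
        weights = {}
    votes = {}
    for cue, pos in cue_estimates.items():
        votes[pos] = votes.get(pos, 0.0) + weights.get(cue, 1.0)
    return max(votes, key=votes.get)

# Example: three hypothetical cues vote on the target's image position;
# two cues agree, so their candidate wins the plurality vote.
estimates = {"colour": (120, 85), "motion": (120, 85), "edges": (118, 90)}
print(fuse_cues_by_voting(estimates))  # → (120, 85)
```

A single unreliable cue is outvoted by the others, which is the robustness argument for weak integration models made in the abstract.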

Subject headings

NATURVETENSKAP  -- Data- och informationsvetenskap (hsv//swe)
NATURAL SCIENCES  -- Computer and Information Sciences (hsv//eng)

Publication and Content Type

ref (peer-reviewed)
art (journal article)


