Feature depth observation for image-based visual servoing: Theory and experiments / DE LUCA, Alessandro; Oriolo, Giuseppe; Robuffo Giordano, P. - In: THE INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH. - ISSN 0278-3649. - STAMPA. - 27:10(2008), pp. 1093-1116. [10.1177/0278364908096706]

Feature depth observation for image-based visual servoing: Theory and experiments

DE LUCA, Alessandro; ORIOLO, Giuseppe
2008

Abstract

In the classical image-based visual servoing framework, error signals are directly computed from image feature parameters, allowing, in principle, control schemes to be obtained that need neither a complete three-dimensional (3D) model of the scene nor a perfect camera calibration. However, when the computation of control signals involves the interaction matrix, the current value of some 3D parameters is required for each considered feature, and typically a rough approximation of this value is used. With reference to the case of a point feature, for which the relevant 3D parameter is the depth Z, we propose a visual servoing approach where Z is observed and made available for servoing. This is achieved by interpreting depth as an unmeasurable state with known dynamics, and by building a non-linear observer that asymptotically recovers the actual value of Z for the selected feature. A byproduct of our analysis is the rigorous characterization of camera motions that actually allow such observation. Moreover, in the case of a partially uncalibrated camera, it is possible to exploit complementary camera motions in order to preliminarily estimate the focal length without knowing Z. Simulations and experimental results are presented for a mobile robot with an on-board camera in order to illustrate the benefits of integrating the depth observation within classical visual servoing schemes. © SAGE Publications 2008, Los Angeles.
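To make the idea behind the abstract concrete, the following is a minimal numerical sketch, not taken from the paper: for a normalized image point (x, y) = (X/Z, Y/Z) seen by a camera moving with known linear velocity v and angular velocity w, the image motion is linear in the inverse depth 1/Z, so 1/Z can be treated as an unmeasured quantity with known dynamics and updated with a simple gradient-type correction whenever the camera translation keeps the corresponding regressor away from zero (which is the flavor of the observability condition studied in the paper). Function names, the gain value, and the use of measured image velocities are illustrative assumptions; the paper's nonlinear observer and its convergence analysis are constructed differently.

import numpy as np

def feature_dynamics(x, y, Z, v, w):
    # Standard perspective point-feature model: time derivatives of the
    # normalized image coordinates (x, y) = (X/Z, Y/Z) and of the depth Z.
    vx, vy, vz = v
    wx, wy, wz = w
    xdot = (-vx + x * vz) / Z + x * y * wx - (1 + x**2) * wy + y * wz
    ydot = (-vy + y * vz) / Z + (1 + y**2) * wx - x * y * wy - x * wz
    Zdot = -vz + Z * (x * wy - y * wx)
    return xdot, ydot, Zdot

def observer_step(x, y, xdot_meas, ydot_meas, zeta_hat, v, w, gain, dt):
    # One Euler step of a gradient-type update for zeta = 1/Z.
    # The image motion is linear in zeta with regressor phi; the estimate
    # can converge only while the camera motion keeps phi nonzero.
    vx, vy, vz = v
    wx, wy, wz = w
    phi = np.array([-vx + x * vz, -vy + y * vz])
    rot = np.array([x * y * wx - (1 + x**2) * wy + y * wz,       # depth-independent
                    (1 + y**2) * wx - x * y * wy - x * wz])      # (rotational) part
    err = np.array([xdot_meas, ydot_meas]) - (zeta_hat * phi + rot)
    zeta_dot = gain * phi @ err                                  # gradient correction
    zeta_dot += vz * zeta_hat**2 - zeta_hat * (x * wy - y * wx)  # known dynamics of 1/Z
    return zeta_hat + dt * zeta_dot

# Toy run: true depth 2 m, initial guess 1 m, generic camera twist.
dt, Z, x, y = 1e-3, 2.0, 0.1, -0.05
v, w = (0.2, 0.1, 0.05), (0.0, 0.0, 0.1)
zeta_hat = 1.0
for _ in range(5000):
    xd, yd, Zd = feature_dynamics(x, y, Z, v, w)
    zeta_hat = observer_step(x, y, xd, yd, zeta_hat, v, w, gain=50.0, dt=dt)
    x, y, Z = x + dt * xd, y + dt * yd, Z + dt * Zd
print("estimated depth:", 1.0 / zeta_hat, "  true depth:", Z)

In this sketch the estimate is driven by the mismatch between measured and predicted image velocities; if the translational regressor phi vanishes (e.g. pure rotation, or translation along the projection ray of the feature), the correction term disappears and the depth cannot be recovered, consistent with the motion conditions characterized in the paper.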
depth observation; focal length observation; image-based visual servoing; mobile robots; nonlinear observers
01 Journal publication::01a Journal article
Feature depth observation for image-based visual servoing: Theory and experiments / DE LUCA, Alessandro; Oriolo, Giuseppe; P., Robuffo Giordano. - In: THE INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH. - ISSN 0278-3649. - STAMPA. - 27:10(2008), pp. 1093-1116. [10.1177/0278364908096706]
Files attached to this item

File: VE_2008_11573-231206.pdf
Access: restricted (archive administrators only)
Type: Publisher's version (published version with the publisher's layout)
License: All rights reserved
Size: 2.09 MB
Format: Adobe PDF
Availability: contact the author

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/231206
Warning: the displayed data have not been validated by the university.

Citations
  • PMC: ND
  • Scopus: 169
  • Web of Science (ISI): 137