A Vision-Based Attentive User Interface with (Semi-)Automatic Parameter Calibration

PORTA, MARCO
2009-01-01

Abstract

In this paper we present a vision-based perceptive interface able to recognize some basic "user states" (presence, absence, head orientation and phoning). This recognition module can be used to implement attentive interfaces that act differently according to user behaviour. To work efficiently in real time, the system exploits skin-colour detection, reliably identifying the face and hands within captured frames. Colour-based techniques, however, require precise calibration, which is usually a tedious and delicate task. We therefore propose a (semi-)automatic calibration procedure which relieves the user of this burden, either completely or, when necessary, through a guided wizard. Our experiments demonstrate that this solution works well and greatly improves users' willingness to employ a vision-based perceptive interface.
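The record does not detail the skin-colour rule the system uses. As an illustration only, here is a minimal sketch of a common heuristic RGB skin-colour classifier; the thresholds below are hypothetical defaults, not the paper's calibrated values (which the proposed procedure would tune automatically):

```python
import numpy as np

def skin_mask(rgb, r_min=95, g_min=40, b_min=20, spread=15):
    """Return a boolean mask of skin-like pixels in an RGB image.

    Implements a generic heuristic rule (hypothetical thresholds,
    not the paper's calibrated parameters): skin pixels tend to be
    reddish, with R dominating G and B and a wide channel spread.
    """
    rgb = np.asarray(rgb, dtype=np.int16)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return (
        (r > r_min) & (g > g_min) & (b > b_min)
        # channel spread rules out grey/achromatic pixels
        & (rgb.max(axis=-1) - rgb.min(axis=-1) > spread)
        & (np.abs(r - g) > spread) & (r > g) & (r > b)
    )

# A skin-like pixel vs. a grey background pixel
frame = np.array([[[200, 120, 90], [128, 128, 128]]])
print(skin_mask(frame))  # → [[ True False]]
```

Calibration in this scheme amounts to adjusting the threshold parameters per user and per lighting condition, which is exactly the step the paper proposes to automate.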
Year: 2009
ISBN: 9781605589862
Files in this product:
There are no files associated with this product.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11571/148880
Citations
  • Scopus: 2