Sluÿters, Arthur
[UCL]
Radar sensors have many advantages over traditional vision-based sensors for gesture recognition. They work in poor lighting and weather conditions, raise fewer privacy concerns than vision-based sensors, and can be integrated into everyday objects. However, due to the size and complexity of radar data, most radar-based systems rely on deep-learning techniques for gesture recognition. These systems take time to train, which makes it challenging to support quick customization by the user, such as changing the gesture set of an application. In this thesis, we investigate whether we can create efficient and customizable radar-based gesture interfaces by reducing the size of raw radar data and relying on simple template matching algorithms for gesture recognition. We have already implemented a pipeline that handles these steps and tested its performance on a dataset of 20 gestures performed by three participants in front of an inexpensive, off-the-shelf FMCW radar. The next steps include developing a software environment for testing recognition techniques on radar gestures, optimizing our pipeline for real-time gesture recognition, and investigating two new use cases: environments where the radar is occluded by materials such as wood, glass, and PVC, and breathing pattern recognition.
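To make the template-matching idea concrete, here is a minimal sketch of nearest-neighbor classification under dynamic time warping (DTW), one common "simple template matching" approach. The feature sequences, gesture labels, and choice of DTW are illustrative assumptions for this sketch, not the actual pipeline described in the thesis.

```python
# Hedged sketch: nearest-neighbor template matching with DTW.
# Each gesture is assumed to be reduced to a short 1-D feature
# sequence (e.g., a range or Doppler profile over time).

def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D sequences."""
    inf = float("inf")
    n, m = len(a), len(b)
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

def recognize(candidate, templates):
    """Return the label of the stored template closest to the candidate."""
    return min(templates, key=lambda lbl: dtw_distance(candidate, templates[lbl]))

# Toy usage: one template per gesture; adding a gesture is just
# adding a dictionary entry, which is what enables quick customization.
templates = {
    "swipe": [0.1, 0.5, 0.9, 0.5, 0.1],
    "push":  [0.9, 0.7, 0.4, 0.2, 0.1],
}
print(recognize([0.12, 0.48, 0.85, 0.55, 0.15], templates))  # → swipe
```

Because no training phase is involved, updating the gesture set only requires recording one or a few new templates, in contrast to retraining a deep network.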
Bibliographic reference
Sluÿters, Arthur. Designing Efficient and Customizable Radar-based Gesture Interfaces. 28th International Conference on Intelligent User Interfaces (Sydney, NSW, Australia, 27/03/2023 to 31/03/2023). In: Companion Proceedings of the 28th International Conference on Intelligent User Interfaces, Association for Computing Machinery: New York, NY, USA, 2023, p. 266
Permanent URL
http://hdl.handle.net/2078.1/274033