Exploring the Mobile Phone as a Platform for a Portable, Non-Invasive, and Robust Sign Language Recognition System

Date
2017
Department
Haverford College. Department of Computer Science
Type
Thesis
Language
eng
Access Restrictions
Open Access
Abstract
This thesis explores the possibility of creating a portable, non-invasive, and robust sign language recognition (SLR) system on a mobile device, capable of improving the quality of life of sign language users. A gesture recognition app was implemented for iOS in Objective-C and C++, using the OpenCV computer vision library. The app was compared to the Thalmic Labs Myo armband, which has previously been used successfully as a sign language recognition platform. Under laboratory conditions, the mobile phone app reached a 96.83% overall classification rate, while the Myo reached 94.44%. Despite this accuracy, the app has severe limitations that make the mobile platform unviable in its current state. However, its promising best-case results suggest that if those limitations can be resolved, the mobile platform could be as effective as the Myo, and therefore an effective SLR tool.
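For readers unfamiliar with OpenCV-based gesture recognition, the sketch below illustrates a generic hand-segmentation and convexity-defect pipeline of the kind such an app might build on. It is an illustrative assumption only: the camera setup, HSV thresholds, and finger-counting heuristic are not taken from the thesis, which does not describe its feature extraction or classifier here.

#include <opencv2/opencv.hpp>
#include <algorithm>
#include <iostream>
#include <vector>

int main() {
    cv::VideoCapture cap(0);  // default camera; an iOS app would read from the device camera instead
    if (!cap.isOpened()) return 1;

    cv::Mat frame, hsv, mask;
    while (cap.read(frame)) {
        // Rough skin-colour segmentation in HSV space (thresholds are assumptions).
        cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);
        cv::inRange(hsv, cv::Scalar(0, 30, 60), cv::Scalar(20, 150, 255), mask);
        cv::medianBlur(mask, mask, 5);

        // Treat the largest contour in the mask as the hand.
        std::vector<std::vector<cv::Point>> contours;
        cv::findContours(mask, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);
        if (contours.empty()) continue;
        const auto& hand = *std::max_element(contours.begin(), contours.end(),
            [](const std::vector<cv::Point>& a, const std::vector<cv::Point>& b) {
                return cv::contourArea(a) < cv::contourArea(b);
            });

        // Count deep convexity defects as a crude proxy for extended fingers.
        std::vector<int> hullIdx;
        cv::convexHull(hand, hullIdx, false, false);
        std::vector<cv::Vec4i> defects;
        if (hullIdx.size() > 3) cv::convexityDefects(hand, hullIdx, defects);

        int fingerGaps = 0;
        for (const cv::Vec4i& d : defects)
            if (d[3] / 256.0 > 20.0) ++fingerGaps;  // defect depth threshold (assumption)

        std::cout << "approx. finger gaps detected: " << fingerGaps << std::endl;
        if (cv::waitKey(30) == 27) break;  // stop on Esc
    }
    return 0;
}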