Enhancing Gesture Perception in Learning for Prosthetic Vision with Emerging Event Cameras
Main Authors: | , , , |
---|---|
Format: | Conference Proceeding |
Language: | English |
Subjects: | |
Summary: | Gestures are powerful tools for enhancing student learning and facilitating human-computer interaction in digital learning environments. However, students with blindness or severe vision impairment are unable to benefit from gestures, resulting in inequitable learning conditions where they rely solely on passive listening. To address this issue, we propose the use of event cameras, novel image sensors capable of capturing the "essence" of motion, including gestures. By harnessing the power of event cameras, we can focus on the motion dynamics of gestures that are often inadequately represented in traditional videos, potentially enabling visually impaired students to better understand and interact with gesture-based content. In our approach, we leverage prosthetic vision to enable gesture perception for visually impaired students in learning environments. Our approach consists of several steps, including gesture event capture, event denoising, and prosthetic vision simulation, to facilitate accurate gesture recognition for effective learning. Through evaluation, we demonstrate the effectiveness of using event information to assist gesture perception for visually impaired students. This research opens up new possibilities for inclusive education by utilizing emerging event cameras to provide equitable learning experiences for all students, regardless of their visual impairment. |
ISSN: | 2161-377X |
DOI: | 10.1109/ICALT61570.2024.00052 |
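The pipeline named in the summary (gesture event capture, event denoising, prosthetic vision simulation) can be sketched in a few lines. The paper's actual methods are not given in this record, so everything below is an illustrative assumption: a standard background-activity filter stands in for the denoising step, and coarse event pooling stands in for the phosphene (prosthetic-vision) rendering; the sensor resolution, grid size, and time window are invented for the example.

```python
import numpy as np

def denoise_events(events, sensor=(64, 64), neighborhood=1, dt=10_000):
    """Background-activity filter (a common event-camera denoiser, not
    necessarily the paper's): keep an event only if some earlier event
    fired within `neighborhood` pixels and `dt` microseconds.
    `events` is a time-sorted list of (x, y, t, polarity) tuples."""
    H, W = sensor
    last_ts = np.full((H, W), -np.inf)  # last event time per pixel
    kept = []
    for x, y, t, p in events:
        y0, y1 = max(0, y - neighborhood), min(H, y + neighborhood + 1)
        x0, x1 = max(0, x - neighborhood), min(W, x + neighborhood + 1)
        if np.any(t - last_ts[y0:y1, x0:x1] <= dt):  # correlated neighbor?
            kept.append((x, y, t, p))
        last_ts[y, x] = t
    return kept

def phosphene_frame(events, grid=(8, 8), sensor=(64, 64)):
    """Crude prosthetic-vision simulation: pool events into a coarse
    phosphene grid; each cell's brightness tracks local event density."""
    frame = np.zeros(grid)
    cy, cx = sensor[0] // grid[0], sensor[1] // grid[1]
    for x, y, t, p in events:
        frame[y // cy, x // cx] += 1
    return frame / frame.max() if frame.max() > 0 else frame

# Hypothetical input: a moving-hand streak of temporally correlated
# events plus one isolated noise event; the filter drops the noise.
gesture = [(10 + i, 10 + i, 1_000 * i, 1) for i in range(20)]
noise = [(50, 5, 9_500, 1)]
events = sorted(gesture + noise, key=lambda e: e[2])
clean = denoise_events(events)
frame = phosphene_frame(clean)
```

The filter keeps only events with a recent spatial neighbor, so the isolated noise event (and the very first event of the streak, which has no predecessor) is discarded, while the gesture trace survives and lights up a diagonal band of phosphene cells.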