Beyond Voice: Hacking Alexa with Multisensory I/O
Publisher: Saudi Digital Library
Abstract
Researchers have recently been exploring how to build inclusive technology for pupils with visual impairment in schools. As voice interaction offers a common modality for both sighted and visually impaired pupils, this project seeks to develop and assess the potential of multisensory feedback technology to encourage greater engagement and inclusive learning. In particular, it aims to augment an Alexa quiz application with a physical multimodal feedback system that supports collaborative learning in mixed-ability classroom settings by making learning more fun and engaging. In doing so, the project will test whether delivering feedback through varied, playful sensory channels enriches the learning experience and increases engagement.
The idea is to create a physical controller with visual, tactile, auditory, and olfactory displays, i.e. using lights, touch, sound, and smell to engage users in learning activities around an Amazon Echo (Alexa). The activity will be a quiz game, used as a revision aid, in which Alexa asks questions and pupils answer using their controllers, receiving feedback on whether the answer is correct or incorrect, their overall score, their speed, and other aspects to be identified in the co-design activity.
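As a rough illustration of the voice side of such a system, a quiz skill of this kind could be written with the Alexa Skills Kit SDK for Python. The handler below is a minimal sketch only: the AnswerIntent name, its "choice" slot, and the check_answer helper are hypothetical placeholders for illustration, not part of the actual application.

    # Minimal sketch of a quiz answer handler using the Alexa Skills Kit SDK for Python.
    # AnswerIntent, the "choice" slot, and check_answer() are hypothetical names.
    from ask_sdk_core.skill_builder import SkillBuilder
    from ask_sdk_core.dispatch_components import AbstractRequestHandler
    from ask_sdk_core.utils import is_intent_name

    def check_answer(choice):
        # Hypothetical helper: compare the pupil's choice to the current answer.
        return choice == "b"

    class AnswerIntentHandler(AbstractRequestHandler):
        """Handles a pupil's answer and speaks correct/incorrect feedback."""

        def can_handle(self, handler_input):
            return is_intent_name("AnswerIntent")(handler_input)

        def handle(self, handler_input):
            slots = handler_input.request_envelope.request.intent.slots
            choice = slots["choice"].value  # the option the pupil chose
            if check_answer(choice):
                speech = "Correct! Well done."
            else:
                speech = "Not quite. Better luck on the next one."
            return handler_input.response_builder.speak(speech).ask(
                "Ready for the next question?").response

    sb = SkillBuilder()
    sb.add_request_handler(AnswerIntentHandler())
    handler = sb.lambda_handler()

In a full version of the skill, the response would also trigger the physical controller (for example via a message to the device) rather than relying on speech alone.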
The core objective of this project is to design and build a device that gives multimodal feedback to mixed-ability pupils during a revision session; a second main deliverable is an analysis of six user studies on engagement and interaction.
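To make the device side concrete, the sketch below shows one possible realisation of the four feedback modalities on a Raspberry Pi using the gpiozero library. The pin assignments and the relay-switched scent diffuser are assumptions for illustration, not a description of the final hardware.

    # Sketch of multisensory feedback on a Raspberry Pi with gpiozero.
    # Pin numbers and the relay-driven scent diffuser are illustrative assumptions.
    from time import sleep
    from gpiozero import LED, Buzzer, OutputDevice

    light = LED(17)              # visual feedback: an LED on GPIO 17
    buzzer = Buzzer(22)          # auditory feedback: a piezo buzzer on GPIO 22
    motor = OutputDevice(23)     # tactile feedback: a vibration motor on GPIO 23
    diffuser = OutputDevice(24)  # olfactory feedback: a diffuser via a relay on GPIO 24

    def correct_feedback():
        """Reward a correct answer across all four modalities."""
        light.blink(on_time=0.2, off_time=0.2, n=3, background=False)
        buzzer.beep(on_time=0.1, off_time=0.1, n=2, background=False)
        motor.on()
        sleep(0.3)
        motor.off()
        diffuser.on()   # brief puff of a pleasant scent
        sleep(1.0)
        diffuser.off()

    def incorrect_feedback():
        """Gentler cue for an incorrect answer: a single slow pulse."""
        light.on()
        buzzer.on()
        sleep(0.5)
        light.off()
        buzzer.off()

Which combinations of these cues pupils actually find rewarding is exactly what the co-design activity and user studies are intended to establish.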
This project applies multisensory feedback interaction to a novel domain: Voice User Interfaces. If successful, it will demonstrate that sensory feedback can engage visually impaired pupils in learning activities, with the added value of increasing collaborative learning and inclusion between them and their sighted peers.
This work will also provide a solid case study for testing the hypothesis that fun feedback interaction can better sustain pupil engagement. The user studies should also provide insight into which display modality pupils prefer most. The main contribution of this project is the novel introduction of multisensory feedback interaction to the domain of Voice User Interfaces through the development of a physical device. A second contribution is a comparative study exploring how different interfaces (audio vs. tangible) affect engagement.