A Multimodal Interaction Framework for Blended Learning

Journal Title: EAI Endorsed Transactions on Creative Technologies - Year 2017, Vol 4, Issue 10

Abstract

Humans interact with each other by utilizing the five basic senses as input modalities, while sounds, gestures, facial expressions, etc. serve as output modalities. Multimodal interaction is also used between humans and their surrounding environment, further enhanced by senses such as equilibrioception, the sense of balance. Computer interfaces, which can be regarded as another environment humans interact with, lack this amalgamation of input and output and therefore fall short of providing close-to-natural interaction. Multimodal human-computer interaction has sought to provide alternative means of communicating with an application that are more natural than the traditional “windows, icons, menus, pointer” (WIMP) style. Despite the great number of devices in existence, most applications make use of a very limited set of modalities, most notably speech and touch. This paper describes a multimodal framework that enables the deployment of a wide variety of modalities, tailored for use in a blended learning environment, and introduces a unified and effective framework for multimodal interaction called COALS.
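To make the idea of amalgamating input modalities more concrete, the sketch below shows a generic event-fusion loop in Python, in which independent modality recognizers (e.g. speech and touch) emit events that are merged into a single interaction stream. It is a minimal illustration of the general multimodal-interaction concept only; the names (ModalityEvent, MultimodalDispatcher) are hypothetical and do not describe the COALS architecture presented in the paper.

```python
from dataclasses import dataclass, field
from typing import Callable, List
import time

# Illustrative sketch of multimodal input fusion (not the COALS framework).

@dataclass
class ModalityEvent:
    modality: str    # e.g. "speech", "touch", "gesture"
    payload: dict    # recognizer-specific data
    timestamp: float = field(default_factory=time.time)

class MultimodalDispatcher:
    """Collects events from several modality recognizers and forwards
    them to application-level handlers through one unified stream."""

    def __init__(self) -> None:
        self._handlers: List[Callable[[ModalityEvent], None]] = []

    def subscribe(self, handler: Callable[[ModalityEvent], None]) -> None:
        self._handlers.append(handler)

    def emit(self, event: ModalityEvent) -> None:
        # Every modality funnels through the same dispatch path,
        # so the application reacts to the interaction, not the device.
        for handler in self._handlers:
            handler(event)

# Usage: two recognizers (speech, touch) feed the same dispatcher.
dispatcher = MultimodalDispatcher()
dispatcher.subscribe(lambda e: print(f"{e.modality}: {e.payload}"))

dispatcher.emit(ModalityEvent("speech", {"utterance": "open exercise three"}))
dispatcher.emit(ModalityEvent("touch", {"x": 120, "y": 340, "action": "tap"}))
```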

Authors and Affiliations

N. Vidakis



  • EP ID EP45867
  • DOI http://dx.doi.org/10.4108/eai.4-9-2017.153057

How To Cite

N. Vidakis (2017). A Multimodal Interaction Framework for Blended Learning. EAI Endorsed Transactions on Creative Technologies, 4(10), -. https://europub.co.uk/articles/-A-45867