Tag Archives: multimodal


Multimodal Interaction Visualization

A paper presented at the Turn-Taking and Coordination in Human-Machine Interaction symposium (2015 AAAI Spring Symposia, Stanford University, Palo Alto, California, March 23-25, 2015). Worked for a long time on how to visualize multimodal interaction. After several attempts, arrived at a solution that turned out to be … Continue reading


Visualization of multimodal interaction

Continuation of an earlier project. The goal was to discover how different modality channels contribute to the overall success of a dialogue (i.e. system-user interaction). The implementation takes an EMMA-like XML-based description of the dialogue and analyses, within each dialogue, for each … Continue reading
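As a rough illustration of the kind of input such an implementation might consume: a minimal sketch, assuming a hypothetical EMMA-like fragment (the W3C EMMA vocabulary is real, but the specific turn content, ids, and the helper `modality_confidences` below are illustrative, not the project's actual code), which pulls out the mode and confidence of each interpretation in a turn.

```python
import xml.etree.ElementTree as ET

EMMA_NS = "http://www.w3.org/2003/04/emma"

# Hypothetical EMMA-like fragment: one user turn with two
# modality interpretations (voice and touch).
doc = """<emma:emma version="1.0" xmlns:emma="http://www.w3.org/2003/04/emma">
  <emma:interpretation id="int1" emma:medium="acoustic" emma:mode="voice"
                       emma:confidence="0.82">open the map</emma:interpretation>
  <emma:interpretation id="int2" emma:medium="tactile" emma:mode="touch"
                       emma:confidence="0.95">tap(12,34)</emma:interpretation>
</emma:emma>"""

def modality_confidences(xml_text):
    """Return {interpretation id: (mode, confidence)} for a turn."""
    root = ET.fromstring(xml_text)
    out = {}
    for interp in root.iter(f"{{{EMMA_NS}}}interpretation"):
        mode = interp.get(f"{{{EMMA_NS}}}mode")
        conf = float(interp.get(f"{{{EMMA_NS}}}confidence", "0"))
        out[interp.get("id")] = (mode, conf)
    return out

print(modality_confidences(doc))
```

Aggregating these per-modality confidences across turns is one plausible way to measure how each channel contributes to dialogue success.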


Contextual multimodal integration

Multimodal integration is an essential part of any multimodal system. By multimodal input we refer to explicit input modalities like speech, gesture and touch, as well as implicit modalities like sensor inputs, contextual information and even biosignals. While multimodal interaction … Continue reading


User research on experience sharing

A fun project conducted in Palo Alto and Helsinki with 16 subjects about experience sharing practices. The learning: although many mothers have no social media presence, that does not mean they would not like to share their daily experiences with … Continue reading


Future communication means

A strategic design project about technologies that will shape the ways we communicate in the near future: wearables, multimodal interaction, immersive display technologies, etc. As the final outcome is confidential, here is just a word cloud of the major … Continue reading


Experience sharing with multimodal & multisensory UI

This work is a continued effort building on learnings from a 2010 project at the California College of the Arts and a 2011 project on multimodal and multisensory interaction (such as speech, gesture, physical activity and proximity sensing) to … Continue reading


Multimodal media player

A small-scale project investigating multimodal interaction for a media player: concept work, defining research steps and the use of available resources. The main emphasis was on selecting the most suitable input/output modalities and what that requires in terms of … Continue reading


Visualization of dialogues

An old project with new tools. Back in 2000 a paper was published on “Visualisation of spoken dialogues”. Since then, visualization tools have developed significantly, and the current effort focuses on multimodal interaction. More info available around May 2012. Figures: (c) 2012 … Continue reading