Description:
Multimodal Integration:
- Cross-Modality Matching of Loudness and Perceived Intensity of Whole-Body Vibrations
- Leaping across Modalities: Speed Regulation Messages in Audio and Tactile Domains
- The Effect of Spatial Disparity on the Integration of Auditory and Tactile Information
- Parametric Study of Virtual Curvature Recognition: Discrimination Thresholds for Haptic and Visual Sensory Information
- Cross-Modal Frequency Matching: Sound and Whole-Body Vibration

Tactile and Sonic Explorations:
- Audioworld: A Spatial Audio Tool for Acoustic and Cognitive Learning
- Exploring Interactive Systems Using Peripheral Sounds
- Basic Exploration of Narration and Performativity for Sounding Interactive Commodities
- Tactile Web Browsing for Blind Users
- Reducing Reversal Errors in Localizing the Source of Sound in Virtual Environment without Head Tracking

Walking and Navigation Interfaces:
- Conflicting Audio-haptic Feedback in Physically Based Simulation of Walking Sounds
- The Influence of Angle Size in Navigation Applications Using Pointing Gestures
- Audio-tactile Display of Ground Properties Using Interactive Shoes
- Efficient Acquisition of Force Data in Interactive Shoe Designs
- A Comparison of Two Wearable Tactile Interfaces with a Complementary Display in Two Orientations

Prototype Design and Evaluation:
- Virtual Sequencing with a Tactile Feedback Device
- The LapSlapper - Feel the Beat
- Product Design Review Application Based on a Vision-Sound-Haptic Interface
- The Phantom versus the Falcon: Force Feedback Magnitude Effects on User's Performance during Target Acquisition

Gestures and Emotions:
- Building a Framework for Communication of Emotional State through Interaction with Haptic Devices
- A Trajectory-Based Approach for Device Independent Gesture Recognition in Multimodal User Interfaces