The project concerns the study of multimodal sensing and control of time-varying textures (micro-features of forms, structures, and objects) in multimedia environments. The general aim is to develop a basic understanding of how textures from different perceptive domains, in particular the visual, sonic, and motion domains, are experientially linked with each other in multimedia environments. To achieve that goal, new devices for gesture-based control, sensing, and rendering of textures will be developed and integrated into a user environment. At different phases of its development, the user environment will be used to study how people experience the relationship between textures from different domains. The project is interdisciplinary because it integrates a subject-driven approach (focused on usability and subjective experience) and an object-driven approach (focused on hardware development and rendering technology).
Of particular interest is the understanding of textures from a time-variant perspective, accessed through gestural, and thus body-based, control.