Thursday, 13 December 2007

Tangible Acoustic Interfaces: the TAI CHI project

With ordinary interface devices (e.g. keyboard, mouse, touch screen and ultrasonic pen), human-computer interaction is confined to a particular device at a fixed location and within a small movement area. A challenge in human-computer interaction research is to create tangible interfaces that enable interaction through augmented physical surfaces, graspable objects and ambient media (e.g. wall, tabletop and air), and that make the interaction natural, without the need for a handheld device.
In the EU-funded project “Tangible Acoustic Interfaces for Computer Human Interaction” (Tai-Chi), acoustics-based remote sensing technology is utilised, since vibrations are the natural outcome of an interaction and propagate well in most solid materials. This means that the information pertaining to an interaction can be conveyed to a remote location (sensor) using the structure of the object itself as a transmission channel, thereby eliminating the need for an overlay or any other intrusive device over the area one wishes to make sensitive. The advantages of this sensing paradigm over other methods of interaction, such as computer vision or speech recognition, suggest strong potential for the computer industry. New applications can include wall-size touch panels, three-dimensional interfaces and robust interactive screens for harsh environments. The acoustics-based sensing used in Tai-Chi is one of the most promising routes for human-computer interaction, bringing another of the five human senses into the realm of computers: the sense of touch.
The ultimate goal of the Tai-Chi project is to develop acoustics-based remote sensing technology that can be adapted to virtually any physical object to create tangible interfaces, allowing the user to communicate freely with a computer, an interactive system or the cyber-world by means of an arbitrary object from the environment. The contact-point localisation methods developed in Tai-Chi exploit the location signature embedded in the acoustic wave patterns caused by contact, as well as triangulation and acoustic holography.
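To make the triangulation idea concrete, the sketch below estimates a contact point on a flat surface from differences in arrival times at several sensors. It is a minimal illustration only, assuming a constant effective wave speed and a hypothetical square sensor layout; the sensor coordinates, wave speed and function names are placeholders, not values taken from the project.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical sensor positions at the corners of a 1 m x 1 m plate (metres).
SENSORS = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
WAVE_SPEED = 1500.0  # assumed effective propagation speed in the plate, m/s


def tdoa_residuals(point, measured_tdoa):
    """Mismatch between predicted and measured arrival-time differences.

    Differences are taken relative to the first sensor.
    """
    dists = np.linalg.norm(SENSORS - point, axis=1)
    predicted_tdoa = (dists[1:] - dists[0]) / WAVE_SPEED
    return predicted_tdoa - measured_tdoa


def locate_contact(measured_tdoa, initial_guess=(0.5, 0.5)):
    """Estimate the (x, y) contact point from time differences of arrival."""
    fit = least_squares(tdoa_residuals, initial_guess,
                        args=(np.asarray(measured_tdoa),))
    return fit.x


# Illustrative delays (seconds) of sensors 1-3 relative to sensor 0.
print(locate_contact([2.0e-4, 2.0e-4, 3.0e-4]))
```

A least-squares fit is used here because, with more than three sensors and noisy delay estimates, the hyperbolae defined by the time differences rarely intersect in a single point.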
The Tai-Chi project has involved fundamental studies of acoustic physical properties, advanced theoretical development of localisation algorithms, hardware design and demonstrations at public events. New results have been achieved for localising impacts (finger tapping, nail clicking and knocking) and for tracking continuous movement (scratching) on large surfaces. Various localisation approaches have been thoroughly investigated, namely Time Delay of Arrival, Location Pattern Matching and Acoustic Holography, with high precision attained using enhanced filtering techniques. In-air localisation has also been studied and tested, using time delay and Doppler shift to track a moving acoustic source in the air. Experiments were successfully conducted with different objects such as wooden boards, window glass, plastic blocks and metallic sheets. In the final stages of the project, work is being carried out to produce a Tai-Chi Developer’s Tool Kit (TDK) comprising sensors, signal-conditioning circuitry, a DSP unit and a PC interface, together with a library of algorithms. Several papers on this work have been presented at conferences and published in journals.
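As an illustration of the Location Pattern Matching idea, the sketch below compares a newly recorded impact signal against a small library of signatures captured at known calibration points and returns the best-matching point. It is a simplified, assumed implementation using plain normalised cross-correlation; the project's enhanced filtering techniques are not reproduced here, and the function and variable names are placeholders.

```python
import numpy as np


def normalised_peak_correlation(a, b):
    """Peak of the normalised cross-correlation between two 1-D signals."""
    a = a - a.mean()
    b = b - b.mean()
    a = a / (np.linalg.norm(a) + 1e-12)
    b = b / (np.linalg.norm(b) + 1e-12)
    return float(np.max(np.correlate(a, b, mode="full")))


def match_location(signal, signature_library):
    """Return the calibration point whose stored signature best matches `signal`.

    `signature_library` maps (x, y) tuples to reference waveforms (NumPy arrays)
    recorded by the same sensor during a calibration (tapping) phase.
    """
    return max(signature_library,
               key=lambda point: normalised_peak_correlation(
                   signal, signature_library[point]))
```

In such a scheme the achievable spatial resolution is set by how densely the surface is calibrated, since an unknown tap is always snapped to the nearest stored signature.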
Among the application demonstrators developed by the Tai-Chi partners are: a Virtual Piano (a piano keyboard projected onto a whiteboard that produces the relevant note when touched), a Memory Game (cards projected onto a plastic sheet), a giant tablet (which allows curves to be drawn by continuous finger movement on a large wooden board), a music chair and an interactive map interfaced to Google Earth.
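For a sense of how a demonstrator such as the Virtual Piano can sit on top of the localisation layer, the sketch below maps an estimated contact coordinate to a MIDI note number. The board width, number of keys and chromatic key layout are hypothetical placeholders, not the actual demonstrator's design.

```python
BOARD_WIDTH_M = 1.2   # assumed width of the projected keyboard, metres
NUM_KEYS = 24         # assumed number of projected keys
BASE_MIDI_NOTE = 60   # MIDI number of the leftmost key (middle C)


def contact_to_midi_note(x):
    """Map the horizontal coordinate of a localised tap to a MIDI note number."""
    key_index = min(max(int(x / BOARD_WIDTH_M * NUM_KEYS), 0), NUM_KEYS - 1)
    return BASE_MIDI_NOTE + key_index
```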
[text pasted from TAI CHI's website]
Posted by crawlingbug at 11:01
Labels: natural interaction