Search our database:
Author: =Cohen, Philip
2 records matched the specified condition in the BIBCYT database.
Record 1 of 2, BIBCYT database
Serial publication
Analytical References
Author: Oviatt, Sharon oviatt@cse.ogi.edu ; Cohen, Philip pcohen@cse.ogi.edu
Title: Multimodal Interfaces That Process What Comes Naturally.
Pages/Collation: pp. 45-53
Communications of the ACM, Vol. 43, no. 3, March 2000
Holdings information

Abstract

The article focuses on various multimodal interfaces. During multimodal communication, people speak, shift eye gaze, gesture, and move in a powerful flow of communication that bears little resemblance to the discrete keyboard and mouse clicks entered sequentially with a graphical user interface. A profound shift is now occurring toward embracing users' natural behavior as the center of the human-computer interface. Multimodal interfaces are being developed that permit highly skilled and coordinated communicative behavior to control system interactions more transparently than ever before. The human voice, hands, and entire body, once augmented by sensors such as microphones and cameras, are becoming the ultimate transparent and mobile multimodal input devices. The area of multimodal systems has expanded rapidly during the past five years. State-of-the-art multimodal speech and gesture systems now process complex gestural input beyond pointing, and new systems have been extended to process different mode combinations.
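
As a rough illustration of the speech-and-gesture processing the abstract describes, the following minimal Python sketch pairs a spoken command with a co-occurring pointing gesture inside a short time window. The event classes, the fuse() function, and the two-second window are hypothetical assumptions for illustration, not details from the article.

    # Hypothetical sketch: time-window fusion of speech and gesture events.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class SpeechEvent:
        text: str          # recognized utterance, e.g. "move that there"
        timestamp: float   # seconds

    @dataclass
    class GestureEvent:
        kind: str          # e.g. "point", "circle", "arrow"
        target: str        # object or location the gesture selects
        timestamp: float   # seconds

    def fuse(speech: SpeechEvent, gesture: GestureEvent,
             window: float = 2.0) -> Optional[dict]:
        """Pair a deictic word in speech with a gesture that arrived
        within `window` seconds; return one interpreted command."""
        if abs(speech.timestamp - gesture.timestamp) > window:
            return None  # modalities too far apart in time to co-refer
        if "that" in speech.text or "there" in speech.text:
            return {"action": speech.text, "referent": gesture.target}
        return None

    # Example: spoken command plus a pointing gesture on a map display
    print(fuse(SpeechEvent("move that unit there", 10.2),
               GestureEvent("point", "grid_C4", 10.9)))
    # -> {'action': 'move that unit there', 'referent': 'grid_C4'}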

Record 2 of 2, BIBCYT database
Serial publication
Analytical References
Author: Cohen, Philip pcohen@cse.ogi.edu ; McGee, David R. dmcgee@naturalinteraction.com
Title: Tangible Multimodal Interfaces for Safety-Critical Applications.
Pages/Collation: pp. 41-46; 28 cm.; ill.
Communications of the ACM, Vol. 47, no. 1, January 2004
Holdings information

Abstract
The article reports that tangible user interfaces (TUIs) incorporate physical objects as sensors and effectors that, when manipulated, modify computational behavior. To enable the construction of TUIs, systems distinguish and identify physical objects, determine their location, orientation, or other aspects of their physical state, support annotations on them, and associate them with different computational states. To do this, TUIs use technologies such as radio emitters, bar codes, or computer vision. The first tangible paper system, the DigitalDesk, incorporated paper via computer vision: it could copy and paste printed text or numbers from paper into digital documents, enabling the user to manipulate the information electronically. The article also describes how flexible multimodal interfaces let users take advantage of more than one of their natural communication modes during human-computer interaction, selecting the mode or combination of modes best suited to their situation and task.
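
As a rough illustration of the object identification and state association the abstract describes, the following minimal Python sketch keeps a registry that maps sensed tag IDs (such as those read from RFID tags or bar codes) to a tracked pose and user annotations. All class, method, and field names here are hypothetical assumptions for illustration, not details from the article.

    # Hypothetical sketch: associating tagged physical objects with computational state.
    from dataclasses import dataclass, field

    @dataclass
    class TangibleObject:
        tag_id: str                       # ID read from an RFID tag or bar code
        label: str                        # human-readable name
        pose: tuple = (0.0, 0.0, 0.0)     # x, y, orientation on the work surface
        annotations: list = field(default_factory=list)

    class TangibleRegistry:
        def __init__(self):
            self._objects = {}

        def register(self, obj: TangibleObject):
            self._objects[obj.tag_id] = obj

        def update_pose(self, tag_id: str, pose: tuple):
            # Called when a sensor (vision, RFID reader) re-detects the object.
            self._objects[tag_id].pose = pose

        def annotate(self, tag_id: str, note: str):
            self._objects[tag_id].annotations.append(note)

        def get(self, tag_id: str) -> TangibleObject:
            return self._objects[tag_id]

    # Example: a paper map tile tracked on a DigitalDesk-style surface
    registry = TangibleRegistry()
    registry.register(TangibleObject("tag-017", "paper map tile"))
    registry.update_pose("tag-017", (12.5, 8.0, 90.0))
    registry.annotate("tag-017", "flood zone marked by user")
    print(registry.get("tag-017"))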


UCLA - Biblioteca de Ciencias y Tecnologia Felix Morales Bueno
