ABSTRACT

For most computer systems, even virtual reality systems, sensing techniques are a means of getting input directly from the user. However, wearable sensors and computers offer a unique opportunity to redirect sensing technology toward recovering more general user context. Wearable computers have the potential to “see” as the user sees, “hear” as the user hears, and experience the life of the user in a “first-person” sense. This increase in contextual and user information may lead to more intelligent and fluid interfaces that treat the physical world as part of the interface.