A multi-modal interactive tablet with tactile feedback, rear and lateral operation for maximum front screen visibility
Author: | Shunsuke Ono, Itsuo Kumazawa, Souma Suzuki, Shu Yano |
---|---|
Year: | 2016 |
Subject: | Information interfaces and presentation (e.g., HCI); Computer science; Software engineering; Virtual reality; Immersion (virtual reality); Human factors; Computer vision; Artificial intelligence; Tactile sensor; Simulation |
Source: | VR |
DOI: | 10.1109/vr.2016.7504728 |
Description: | When we use a tablet-style handheld device such as a smartphone as part of a virtual reality system, its most outstanding feature, the touch screen that dominates most of the front face, should be incorporated into the system effectively and beneficially. For example, the visual information displayed on the screen can be merged with the surrounding or background scenes, and intuitive touch operation can be performed in a suitable scenario. However, if interaction is limited to touch operation, finger operation on the front screen must be performed even in unsuitable scenarios, and the fingers or hands occluding the visual information disturb the immersive experience. To deal with this situation, we propose a multi-modal interactive tablet that uses cameras, an accelerometer, a trackball, and pressure sensors mounted on its rear and sides for operations that preserve front-screen visibility. Pressing and ball-rotating operations on the rear and sides, together with tactile feedback generated by voice-coil-based actuators, assist and guide the multi-modal interaction. The effectiveness of the multi-modality with rear and side operation and tactile feedback is evaluated in an experiment. |
Database: | OpenAIRE |
External link: |