Implementation of wheelchair controller using mouth and tongue gesture
Author: Rafia Hassani, Mohamed Boumehraz, Maroua Hamzi
Language: English
Year of publication: 2021
Subject: Control and Optimization; InformationSystems_INFORMATIONINTERFACESANDPRESENTATION(e.g. HCI); Computer Networks and Communications; Face detection; Tongue gesture; body regions; Mouth gesture; Hardware and Architecture; Signal Processing; Electrical and Electronic Engineering; Powered wheelchair; human activities; Human machine interface; Information Systems
Description: In this paper, a simple human-machine interface that allows people with severe disabilities to control a motorized wheelchair using mouth and tongue gestures is presented. The development of the proposed system consists of three principal phases. The first phase is mouth detection, which is performed using a Haar cascade to detect the face area and template matching to detect mouth and tongue gestures within the lower face region. The second phase is command extraction; it is carried out by determining the mouth and tongue gesture commands according to the detected gesture, the time taken to execute the gesture, and the previous command, which is stored at each frame. Finally, the gesture commands are sent to the wheelchair as instructions over the Bluetooth serial port. The hardware used for this project consisted of: a laptop with a universal serial bus (USB) webcam serving as the vision-based control unit, a Bluetooth module to receive the instructions coming from the vision-based control unit, a standard joystick for use in case of emergency, a joystick emulator that delivers to the control board signals similar to those usually generated by the standard joystick, and ultrasonic sensors to provide safe navigation. The experimental results showed the success of the proposed control system based on mouth and tongue gestures. (A minimal illustrative sketch of this pipeline follows the record below.)
Database: OpenAIRE
External link:
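
The sketch below only illustrates the three-phase pipeline summarized in the description; it is not the authors' code. The template image files, the gesture-to-command mapping, the matching threshold, the hold time, and the serial port name are all assumptions made for illustration. Only the overall structure follows the abstract: Haar cascade face detection, template matching on the lower face region, command extraction that uses gesture duration and the previously stored command, and transmission over a Bluetooth serial link. Python with OpenCV and pyserial is assumed.

```python
# Illustrative sketch of the mouth/tongue gesture pipeline described above.
# Template files, command names, thresholds, and the serial port are assumptions.
import time
import cv2
import serial  # pyserial

# Phase 3 transport: Bluetooth serial link to the wheelchair (port name assumed)
link = serial.Serial("/dev/rfcomm0", 9600, timeout=1)

# Phase 1 detectors: Haar cascade for the face, template matching for gestures
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
templates = {  # hypothetical grayscale template images for each gesture
    "FORWARD": cv2.imread("tongue_up.png", cv2.IMREAD_GRAYSCALE),
    "LEFT": cv2.imread("tongue_left.png", cv2.IMREAD_GRAYSCALE),
    "RIGHT": cv2.imread("tongue_right.png", cv2.IMREAD_GRAYSCALE),
    "STOP": cv2.imread("mouth_open.png", cv2.IMREAD_GRAYSCALE),
}
MATCH_THRESHOLD = 0.7   # assumed minimum template-matching score
HOLD_SECONDS = 1.0      # assumed time a gesture must be held to count

def detect_gesture(gray):
    """Phase 1: find the face, crop the lower face region, match gesture templates."""
    faces = face_cascade.detectMultiScale(gray, 1.3, 5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]
    lower_face = gray[y + h // 2 : y + h, x : x + w]
    best_name, best_score = None, 0.0
    for name, tmpl in templates.items():
        if tmpl is None or tmpl.shape[0] > lower_face.shape[0] \
                or tmpl.shape[1] > lower_face.shape[1]:
            continue
        score = cv2.minMaxLoc(
            cv2.matchTemplate(lower_face, tmpl, cv2.TM_CCOEFF_NORMED))[1]
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= MATCH_THRESHOLD else None

cap = cv2.VideoCapture(0)          # USB webcam of the vision-based control unit
previous_command = "STOP"          # previous command stored across frames
current_gesture, gesture_start = None, None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gesture = detect_gesture(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))

    # Phase 2: turn the detected gesture into a command, using how long the
    # gesture has been held and the previously issued command.
    if gesture != current_gesture:
        current_gesture, gesture_start = gesture, time.time()
    elif gesture is not None and time.time() - gesture_start >= HOLD_SECONDS:
        command = gesture
        if command != previous_command:
            # Phase 3: send the command over the Bluetooth serial port.
            link.write((command + "\n").encode("ascii"))
            previous_command = command

    cv2.imshow("camera", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break

cap.release()
link.close()
cv2.destroyAllWindows()
```

In this sketch, requiring a gesture to be held for a short time and comparing it against the previously sent command is one way to suppress spurious commands from transient detections, which mirrors the role the abstract assigns to gesture duration and the stored previous command.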