Author:
Hosokawa, Yoichi; Miwa, Tetsushi; Hashimoto, Yoshihiro
Language:
English
Year of Publication:
2020
Subject:
Source:
Computers Helping People with Special Needs
Description:
We propose a TARS mobile application that uses a smartphone with a camera and a deep-learning fingertip detector, for easier implementation than a PC or a touch panel. The app is designed to recognize, with the rear camera, the user's hand touching printed images and to provide voice guidance on the image content, using the index finger's touch as the trigger. When gestures were performed with either the index finger or the thumb, the app detected and output the fingertip point without delay, and the gesture was effective as a trigger for reading. Thumb gestures are assumed to reduce the detection variance to 68% in the lateral direction, because they rarely move the other four fingers compared with index-finger gestures. By performing multiple detections in the application and outputting the median, the detection variance can be reduced to 73% in the lateral direction and 70% in the longitudinal direction, which shows the effectiveness of multiple detections. Both techniques are effective in reducing the variance of fingertip detection. We also confirmed that if the tilt of the device is between −3.4 mm and 4 mm, the current app can identify a 12 mm difference with an average accuracy of 85.5% in both the lateral and longitudinal directions. Finally, we developed a basic model of the TARS mobile app that allows easier installation and greater portability by using a smartphone camera rather than a PC or a touch panel.
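The median-of-multiple-detections step described in the abstract can be sketched as below. This is an illustrative sketch only, not the authors' implementation: the function name, the (x, y) pixel-coordinate format, and the sample values are assumptions.

```python
import statistics

def stabilized_fingertip(detections):
    """Reduce per-frame jitter by taking the per-axis median of several
    consecutive fingertip detections, given as (x, y) pixel coordinates.
    (Hypothetical helper; the paper's detector and API are not specified.)"""
    xs = [x for x, _ in detections]
    ys = [y for _, y in detections]
    return (statistics.median(xs), statistics.median(ys))

# Example: five noisy detections of the same fingertip position.
samples = [(101, 200), (99, 203), (100, 198), (104, 201), (98, 199)]
print(stabilized_fingertip(samples))  # -> (100, 200)
```

Using the median rather than the mean makes the output robust to occasional outlier detections, which is consistent with the variance reductions reported in the abstract.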
Database:
OpenAIRE
External Link: