Automatic detection of surgical instruments' state in laparoscopic video images using neural networks

Authors: Martín Vicario, Celia; Oropesa García, Ignacio; Sánchez Margallo, Juan Antonio; Sánchez Margallo, Francisco Miguel; Gómez Aguilera, Enrique J.; Sánchez González, Patricia
Language: English
Year of publication: 2017
Source: Libro de Actas del XXXV Congreso de la Sociedad Española de Ingeniería Biomédica (CASEIB 2017) | XXXV Congreso de la Sociedad Española de Ingeniería Biomédica (CASEIB 2017) | 29/11/2017-01/12/2017 | Bilbao, Spain
Archivo Digital UPM
Universidad Politécnica de Madrid
Description: Software-based solutions such as virtual reality simulators and serious games can be useful assets for training minimally invasive surgery technical skills. However, their high cost and lack of realism/fidelity can sometimes be a drawback for their incorporation in training facilities. In this sense, the hardware interface plays an important role as the physical connection between the learner and the virtual world. The EVA Tracking System provides computer vision-based information about the position and orientation of the instruments in an inexpensive and unobtrusive manner, but it lacks information about the aperture state of the clamps, which limits the system's functionalities. This article presents a new solution for detecting the instrument's aperture state using artificial vision and machine learning techniques. To achieve this goal, videos are recorded in a laparoscopic training box to obtain a dataset. In each frame, the instrument clamp is segmented within a region of interest located by means of color markers. The classifier is modeled using an Artificial Neural Network. The trained prediction model achieves 94% accuracy on the validation dataset and a 6% error rate on independent evaluation video sequences. Results show that the model provides a competent solution to detecting the clamp's aperture state. Future work will address the integration of the model into the EVA Tracking System and into a virtual environment, the KTS serious game.
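
The abstract gives only the outline of the pipeline: color-marker segmentation of the clamp region, followed by an Artificial Neural Network classifier. As a minimal sketch of how such a two-stage detector could be assembled with OpenCV and scikit-learn, consider the Python code below; the HSV marker range, the 32x32 ROI size, the raw grayscale-pixel features, and the single 64-unit hidden layer are all illustrative assumptions, not values taken from the paper.

# Minimal sketch of the two-stage pipeline described in the abstract.
# All color thresholds, sizes, and network settings below are assumptions.
import cv2
import numpy as np
from sklearn.neural_network import MLPClassifier

def segment_clamp_roi(frame_bgr,
                      lower_hsv=(35, 80, 80),    # assumed green marker, HSV lower bound
                      upper_hsv=(85, 255, 255)): # assumed green marker, HSV upper bound
    """Find the color marker and crop a fixed-size ROI around the clamp."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lower_hsv), np.array(upper_hsv))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None  # marker not visible in this frame
    x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
    return cv2.resize(frame_bgr[y:y + h, x:x + w], (32, 32))

def roi_to_features(roi_bgr):
    """Flatten the ROI into a normalized grayscale pixel vector (one simple feature choice)."""
    gray = cv2.cvtColor(roi_bgr, cv2.COLOR_BGR2GRAY)
    return (gray.astype(np.float32) / 255.0).ravel()

def train_aperture_classifier(X, y):
    """Fit a small feed-forward network on labeled frames (0 = closed, 1 = open)."""
    clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
    clf.fit(X, y)
    return clf

At run time, each frame would pass through segment_clamp_roi and roi_to_features before clf.predict; frames where the marker is occluded (the None case) could simply keep the previously detected state.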
Database: OpenAIRE