Deep learning for biomechanical modeling of facial tissue deformation in orthognathic surgical planning

Authors: Nathan Lampen, Daeseung Kim, Xi Fang, Xuanang Xu, Tianshu Kuang, Hannah H. Deng, Joshua C. Barber, Jamie Gateno, James Xia, Pingkun Yan
Year of publication: 2022
Source: International Journal of Computer Assisted Radiology and Surgery, 17(5)
ISSN: 1861-6429
Description: Orthognathic surgery requires an accurate surgical plan of how bony segments are moved and how the face passively responds to the bony movement. Currently, the finite element method (FEM) is the standard for predicting facial deformation. Deep learning models have recently been used to approximate FEM because of their faster simulation speed. However, current solutions are not compatible with detailed facial meshes and often do not explicitly provide the network with known boundary-type information. The purpose of this proof-of-concept study is therefore to develop a biomechanics-informed deep neural network that accepts point cloud data and explicit boundary types as inputs for fast prediction of soft-tissue deformation.

A deep learning network was developed based on the PointNet++ architecture. The network accepts the starting facial mesh, the input displacement, and explicit boundary-type information, and predicts the final deformation of the facial mesh.

The model was trained and tested on datasets created from FEM simulations of facial meshes. It achieved a mean error between 0.159 and 0.642 mm on five subjects. Including explicit boundary types had mixed results: it improved performance in simulations with large deformations but decreased performance in simulations with small deformations. These results suggest that explicit boundary types may not be necessary to improve network performance.

Our deep learning method can approximate FEM for facial change prediction in orthognathic surgical planning, accepting geometrically detailed meshes and explicit boundary types while significantly reducing simulation time.
Database: OpenAIRE
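
The abstract describes a network that takes per-point positions, the prescribed bony displacement, and an explicit boundary-type label for each point, and predicts the resulting soft-tissue displacement. The sketch below illustrates that input/output interface in PyTorch. It is not the authors' implementation: a simplified PointNet-style shared MLP with max pooling stands in for the actual PointNet++ backbone, and the boundary-type categories, feature layout, and layer sizes are assumptions made for illustration only.

```python
# Minimal sketch of the interface described in the abstract.
# Assumptions (not from the paper): a PointNet-style shared MLP replaces the
# PointNet++ backbone; boundary-type categories and layer sizes are illustrative.
import torch
import torch.nn as nn

NUM_BOUNDARY_TYPES = 3  # hypothetical categories, e.g. fixed / moving / free


class SoftTissueNet(nn.Module):
    """PointNet-style stand-in for the PointNet++ model described in the paper."""

    def __init__(self, num_boundary_types: int = NUM_BOUNDARY_TYPES):
        super().__init__()
        self.num_boundary_types = num_boundary_types
        # Per-point input: position (3) + prescribed displacement (3) + boundary one-hot
        in_dim = 3 + 3 + num_boundary_types
        self.mlp = nn.Sequential(
            nn.Linear(in_dim, 64), nn.ReLU(),
            nn.Linear(64, 128), nn.ReLU(),
        )
        # Head combines per-point features with a pooled global feature
        self.head = nn.Sequential(
            nn.Linear(128 + 128, 128), nn.ReLU(),
            nn.Linear(128, 3),  # predicted per-point soft-tissue displacement
        )

    def forward(self, pos, disp, boundary):
        # pos, disp: (B, N, 3); boundary: (B, N) integer boundary-type labels
        b_onehot = nn.functional.one_hot(boundary, self.num_boundary_types).float()
        feats = self.mlp(torch.cat([pos, disp, b_onehot], dim=-1))   # (B, N, 128)
        global_feat = feats.max(dim=1, keepdim=True).values          # (B, 1, 128)
        global_feat = global_feat.expand(-1, feats.shape[1], -1)     # (B, N, 128)
        return self.head(torch.cat([feats, global_feat], dim=-1))   # (B, N, 3)


# Example: two facial point clouds of 1024 points each
pos = torch.randn(2, 1024, 3)
disp = torch.zeros(2, 1024, 3)  # prescribed bony-segment displacement per point
boundary = torch.randint(0, NUM_BOUNDARY_TYPES, (2, 1024))
predicted_disp = SoftTissueNet()(pos, disp, boundary)
deformed = pos + predicted_disp  # final predicted facial mesh positions
```

Supervision in this sketch would amount to regressing the predicted per-point displacements against FEM-simulated ground truth (e.g. with a mean-squared-error loss), which mirrors the abstract's description of training on datasets created from FEM simulations.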