A modality separation approach for facial sketch synthesis

Author: Kangning Du, Chaoyi Wang, Lin Cao, Yanan Guo, Wenwen Sun
Language: English
Year of publication: 2024
Subject:
Source: IET Image Processing, Vol 18, Iss 13, Pp 4127-4140 (2024)
Document type: article
ISSN: 1751-9667, 1751-9659
DOI: 10.1049/ipr2.13238
Description: Abstract Face-to-sketch synthesis transforms optical face images into sketch-style images. However, traditional style losses cannot adequately discern the modal differences between optical-domain and sketch-domain images, and traditional approaches disregard high-frequency texture, so the generated images lack clarity. To address these issues, a modality separation approach for facial sketch synthesis is proposed. First, a modality separation structure is proposed that uses a quicksort algorithm to merge features of optical and sketch images into a target modality (positive samples), ensuring that the feature distribution of generated images matches that of real sketches. By controlling the Euclidean distance between generated images (anchors) and both the target and filtered modalities (positive and negative samples), irrelevant information is effectively filtered out. Next, an edge-promoting module feeds processed blurry sketch images into the discriminator to enhance robustness. Lastly, a detail optimization module uses Laplacian filtering to extract high-frequency texture from optical face images for local enhancement. Experimental validation on the CUHK, AR, and XM2VTS datasets shows that this method outperforms mainstream sketch face synthesis methods in terms of Fréchet inception distance (FID) and learned perceptual image patch similarity (LPIPS), producing more realistic and natural images with richer texture details.
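The abstract describes a triplet-style constraint on Euclidean distances between anchor, positive, and negative samples, and a Laplacian-based detail optimization module. A minimal sketch of how such components might look, assuming PyTorch, feature tensors of matching shape, and single-channel float images; the function names, margin value, and kernel are illustrative assumptions, not the paper's exact implementation:

import torch
import torch.nn.functional as F

def modality_separation_loss(anchor_feat, positive_feat, negative_feat, margin=1.0):
    """Triplet-style constraint: pull generated-image features (anchors) toward the
    target modality (positive samples) and push them away from the filtered modality
    (negative samples), using Euclidean (p=2) distance. Margin is a placeholder."""
    d_pos = F.pairwise_distance(anchor_feat, positive_feat, p=2)
    d_neg = F.pairwise_distance(anchor_feat, negative_feat, p=2)
    return F.relu(d_pos - d_neg + margin).mean()

def laplacian_high_freq(image):
    """Extract high-frequency texture from an optical face image with a 3x3
    Laplacian kernel, in the spirit of the detail optimization module.
    Expects a float tensor of shape (N, 1, H, W)."""
    kernel = torch.tensor([[0., 1., 0.],
                           [1., -4., 1.],
                           [0., 1., 0.]]).view(1, 1, 3, 3)
    return F.conv2d(image, kernel, padding=1)

In such a setup, the high-frequency map could be added back to, or concatenated with, the generator's output for local enhancement; the paper's actual fusion strategy is not specified in the abstract.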
Database: Directory of Open Access Journals