Authors:
Xiaolin Zeng, Lei Cheng, Shanna Li, Xueping Liu
Language:
English
Publication year:
2024
Source:
Heritage Science, Vol 12, Iss 1, Pp 1-20 (2024)
Document type:
article
ISSN:
2050-7445
DOI:
10.1186/s40494-024-01470-4
Description:
Abstract: Archaeological illustration is a graphic recording technique that delineates the shape, structure, and ornamentation of cultural artifacts using lines, serving as vital material for archaeological work and scholarly research. To address the low line accuracy of current mainstream image generation algorithms and the interference caused by severe mural damage, this paper proposes a mural archaeological illustration generation algorithm based on multi-branch feature cross fusion (U2FGAN). The algorithm optimizes the skip connections in U2Net through a channel attention mechanism, constructing a multi-branch generator consisting of a line extractor and an edge detector, which separately identify line features and edge information in artifact images before fusing them to generate accurate, high-resolution illustrations. A multi-scale conditional discriminator is also incorporated to guide the generator toward high-quality illustrations with clear details and intact structures. Experiments on the Dunhuang mural illustration datasets demonstrate that, compared with mainstream counterparts, U2FGAN reduced the Mean Absolute Error (MAE) by 10.8% to 26.2%, while also substantially improving Precision (by 9.8% to 32.3%), Fβ-Score (by 5.1% to 32%), and PSNR (by 0.4 to 2.2 dB). These results show that the proposed method outperforms other mainstream algorithms in archaeological illustration generation.
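The channel-attention reweighting of skip-connection features mentioned in the abstract can be sketched in NumPy. This is a minimal squeeze-and-excitation-style illustration, not the paper's implementation: the weight shapes, reduction ratio `r`, and function name are assumptions made for the example.

```python
import numpy as np

def channel_attention(feats, w1, w2):
    """Squeeze-and-excitation-style channel attention (illustrative sketch).

    feats: (C, H, W) skip-connection feature map
    w1:    (C // r, C) bottleneck weights (reduction ratio r is assumed)
    w2:    (C, C // r) expansion weights
    Returns feats with each channel scaled by a learned gate in (0, 1).
    """
    # Squeeze: global average pooling over the spatial dimensions -> (C,)
    z = feats.mean(axis=(1, 2))
    # Excitation: bottleneck MLP, ReLU then sigmoid, producing per-channel gates
    s = np.maximum(w1 @ z, 0.0)
    gate = 1.0 / (1.0 + np.exp(-(w2 @ s)))
    # Reweight each channel before the skip features are fused in the decoder
    return feats * gate[:, None, None]

rng = np.random.default_rng(0)
C, H, W, r = 8, 4, 4, 2
feats = rng.standard_normal((C, H, W))
w1 = rng.standard_normal((C // r, C))
w2 = rng.standard_normal((C, C // r))
out = channel_attention(feats, w1, w2)
print(out.shape)  # (8, 4, 4)
```

In U2FGAN the gates would be learned during training; here the weights are random purely to show the data flow: the spatial map is unchanged in shape, and only per-channel magnitudes are rescaled.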
Database:
Directory of Open Access Journals