D4: Text-guided diffusion model-based domain adaptive data augmentation for vineyard shoot detection

Author: Hirahara, Kentaro; Nakane, Chikahito; Ebisawa, Hajime; Kuroda, Tsuyoshi; Iwaki, Yohei; Utsumi, Tomoyoshi; Nomura, Yuichiro; Koike, Makoto; Mineno, Hiroshi
Publication year: 2024
Subject:
Document type: Working Paper
Description: In agricultural fields, plant phenotyping using object detection models is gaining attention. However, collecting the training data needed to build generic, high-precision models is extremely challenging because annotation is difficult and domains are diverse. Furthermore, training data cannot easily be transferred across crops, so although machine learning models effective for specific environments, conditions, or crops have been developed, they cannot be widely applied in actual fields. In this study, we propose a generative data augmentation method (D4) for vineyard shoot detection. D4 uses a pre-trained text-guided diffusion model, a large number of original images culled from video data collected by unmanned ground vehicles or other means, and a small annotated dataset. The proposed method generates new annotated images whose background information is adapted to the target domain while retaining the annotation information necessary for object detection. In this way, D4 addresses the shortage of training data in agriculture caused by the difficulty of annotation and the diversity of domains. We confirmed that this generative data augmentation improved the mean average precision by up to 28.65% on the BBox detection task and the average precision by up to 13.73% on the keypoint detection task for vineyard shoot detection. Our generative data augmentation method D4 is expected to simultaneously address the cost and domain-diversity issues of training data generation in agriculture and to improve the generalization performance of detection models.
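
To make the general idea concrete, the following is a minimal, hypothetical sketch of annotation-preserving generative augmentation, not the authors' exact D4 pipeline: an off-the-shelf text-guided diffusion inpainting model (here, the Hugging Face diffusers StableDiffusionInpaintPipeline with an assumed checkpoint) repaints only the background of a labeled image according to a prompt describing the target domain, while the annotated shoot regions are masked out so the existing bounding boxes remain valid for the generated image. The file names, box coordinates, prompt, and the helper boxes_to_background_mask are illustrative assumptions.

```python
# Hypothetical sketch (not the authors' exact D4 method): regenerate the
# background of an annotated image with a text-guided diffusion inpainting
# model while keeping labeled regions untouched, so the original bounding
# boxes can be reused as annotations for the new image.

from PIL import Image, ImageDraw
import torch
from diffusers import StableDiffusionInpaintPipeline


def boxes_to_background_mask(size, boxes):
    """White = background the model may repaint; black = annotated regions to keep."""
    mask = Image.new("L", size, 255)          # repaint everything by default
    draw = ImageDraw.Draw(mask)
    for x_min, y_min, x_max, y_max in boxes:  # protect labeled shoot regions
        draw.rectangle([x_min, y_min, x_max, y_max], fill=0)
    return mask


# Pre-trained text-guided diffusion (inpainting) model -- assumed checkpoint.
pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "runwayml/stable-diffusion-inpainting",
    torch_dtype=torch.float16,
).to("cuda")

image = Image.open("vineyard_sample.jpg").convert("RGB").resize((512, 512))
boxes = [(120, 80, 200, 260), (300, 150, 380, 330)]  # existing shoot annotations
mask = boxes_to_background_mask(image.size, boxes)

# The prompt describes the target domain (lighting, season, trellis style, ...).
augmented = pipe(
    prompt="a vineyard row at dusk, overcast sky, wooden trellis",
    image=image,
    mask_image=mask,
).images[0]

augmented.save("vineyard_sample_aug.png")
# The original `boxes` are reused as the annotations for the augmented image.
```

Because only background pixels are resampled, the label coordinates are unchanged, which is the key property any annotation-preserving augmentation scheme of this kind has to guarantee.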
Database: arXiv