Super-Fine Attributes with Crowd Prototyping
Author: | Mark S. Nixon, John N. Carter, Daniel Martinho-Corbishley |
Year of publication: | 2019 |
Subject: | Matching (statistics), Computer science, Applied Mathematics, Crowdsourcing, Machine learning, Facial recognition system, Visualization, Identification (information), Computational Theory and Mathematics, Ranking, Artificial intelligence, Computer Vision and Pattern Recognition, Image retrieval, Software, Semantic gap |
Source: | IEEE Transactions on Pattern Analysis and Machine Intelligence. 41:1486-1500 |
ISSN: | 1939-3539, 0162-8828 |
DOI: | 10.1109/tpami.2018.2836900 |
Description: | Recognising human attributes from surveillance footage is widely studied for attribute-based re-identification. However, most works assume coarse, expertly-defined categories that are ineffective for describing challenging images. Such brittle representations have limited discriminative power and hamper the efficacy of learnt estimators. We aim to discover more relevant and precise subject descriptions, improving image retrieval and closing the semantic gap. Inspired by fine-grained and relative attributes, we introduce super-fine attributes, which describe multiple, integral concepts of a single trait as multi-dimensional perceptual coordinates. Crowd prototyping facilitates efficient crowdsourcing of super-fine labels by pre-discovering salient perceptual concepts for prototype matching. We re-annotate gender, age and ethnicity traits from PETA, a highly diverse (19K instances, 8.7K identities) amalgamation of 10 re-id datasets including VIPeR, CUHK and TownCentre. Employing joint attribute regression with the ResNet-152 CNN, we demonstrate substantially improved ranked retrieval performance with super-fine attributes compared to conventional binary labels, reporting up to 11.2 and 14.8 percent mAP improvements for gender and age, further surpassed by ethnicity. We also find that our 3 super-fine traits outperform 35 binary attributes by 6.5 percent mAP for subject retrieval in a challenging zero-shot identification scenario. |
Database: | OpenAIRE |
External link: |
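
The description above mentions joint attribute regression with a ResNet-152 CNN over multi-dimensional perceptual coordinates for gender, age and ethnicity. The following is a minimal sketch of that idea, assuming a PyTorch/torchvision ResNet-152 backbone with one small regression head per trait and a summed MSE loss; the head dimensionalities, loss and training details are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch (assumption): joint attribute regression with a ResNet-152 backbone,
# predicting multi-dimensional perceptual coordinates per trait. Head sizes, loss and
# optimisation choices here are illustrative, not the authors' implementation.
import torch
import torch.nn as nn
from torchvision import models


class SuperFineRegressor(nn.Module):
    def __init__(self, trait_dims=(2, 3, 4)):
        # trait_dims: assumed coordinate dimensionalities for gender, age, ethnicity.
        super().__init__()
        # weights=None keeps the example self-contained; in practice ImageNet-pretrained
        # weights would normally be loaded.
        backbone = models.resnet152(weights=None)
        in_features = backbone.fc.in_features   # 2048 for ResNet-152
        backbone.fc = nn.Identity()             # drop the ImageNet classification head
        self.backbone = backbone
        # One linear regression head per trait, trained jointly on shared features.
        self.heads = nn.ModuleList(nn.Linear(in_features, d) for d in trait_dims)

    def forward(self, x):
        features = self.backbone(x)             # (batch, 2048)
        return [head(features) for head in self.heads]


def joint_regression_loss(predictions, targets):
    # Sum of per-trait MSE losses against crowd-labelled perceptual coordinates.
    mse = nn.MSELoss()
    return sum(mse(pred, tgt) for pred, tgt in zip(predictions, targets))


if __name__ == "__main__":
    model = SuperFineRegressor()
    images = torch.randn(4, 3, 224, 224)              # dummy pedestrian crops
    targets = [torch.randn(4, d) for d in (2, 3, 4)]  # dummy coordinate labels
    loss = joint_regression_loss(model(images), targets)
    loss.backward()
    print(f"joint loss: {loss.item():.4f}")
```

Under this setup, ranked retrieval would amount to nearest-neighbour ranking of gallery images by the distance between their predicted coordinates and a query description; this too is an assumption for illustration rather than the paper's exact evaluation protocol.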