Collecting and Annotating the Large Continuous Action Dataset
Author: | Ran Xu, Jeffrey Mark Siskind, Daniel Paul Barrett, Haonan Yu |
---|---|
Language: | English |
Publication year: | 2015 |
Subject: |
FOS: Computer and information sciences
Computer Science - Computer Vision and Pattern Recognition (cs.CV) action recognition annotation pattern recognition computer vision artificial intelligence |
Description: | We make available to the community a new dataset to support action recognition research. This dataset differs from prior datasets in several key ways. It is significantly larger. It contains streaming video with long segments containing multiple action occurrences that often overlap in space and/or time. All actions were filmed against the same collection of backgrounds, so that background gives little clue as to action class. We had five humans independently replicate the annotation of the temporal extent of action occurrences labeled with their classes, and measured a surprisingly low level of intercoder agreement. Baseline experiments show that recent state-of-the-art methods perform poorly on this dataset, suggesting that it will be a challenging dataset that can foster advances in action recognition research. This manuscript describes the novel content and characteristics of the LCA dataset, presents the design decisions made when filming the dataset, documents the novel methods employed to annotate the dataset, and presents the results of our baseline experiments. |
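The description mentions measuring intercoder agreement on the annotated temporal extents of action occurrences. One common way to score such agreement is mean pairwise intersection-over-union (IoU) of the annotators' time intervals; the sketch below is purely illustrative (the intervals and the IoU criterion are assumptions, not the LCA paper's exact protocol):

```python
# Hypothetical sketch: pairwise inter-annotator agreement on temporal
# extents, scored as interval intersection-over-union (IoU). The sample
# intervals below are invented for illustration.
from itertools import combinations


def interval_iou(a, b):
    """IoU of two (start, end) intervals given in the same time units."""
    inter = max(0.0, min(a[1], b[1]) - max(a[0], b[0]))
    union = (a[1] - a[0]) + (b[1] - b[0]) - inter
    return inter / union if union > 0 else 0.0


def mean_pairwise_agreement(annotations):
    """Average IoU over all annotator pairs for one action occurrence."""
    pairs = list(combinations(annotations, 2))
    return sum(interval_iou(a, b) for a, b in pairs) / len(pairs)


# Five annotators marking the temporal extent of the same occurrence
# (start and end times in seconds).
spans = [(10.0, 20.0), (11.0, 19.0), (9.5, 21.0), (12.0, 18.0), (10.5, 20.5)]
print(round(mean_pairwise_agreement(spans), 3))  # → 0.741
```

Even with annotators who roughly agree, as in this toy example, mean pairwise IoU sits well below 1.0, which illustrates how temporal-extent annotation can yield low agreement scores.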
Database: | OpenAIRE |
External link: |