Reusing Annotation Labor for Concept Selection

Robin Aly, Djoerd Hiemstra, A.P. de Vries

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

5 Citations (Scopus) · 50 Downloads (Pure)


Describing shots through the occurrence of semantic concepts is the first step towards modeling the content of a video semantically. An important challenge is to automatically select the right concepts for a given information need. For example, a system should be able to decide whether the concept "Outdoor" should be included in a search for "Street Basketball". In this paper we propose a method to automatically select concepts for an information need. To achieve this, we estimate the occurrence probability of a concept in relevant shots, which allows us to quantify how helpful the concept is. Our method reuses existing training data annotated with concept occurrences to build a text collection. Searching this collection with a text retrieval system, combined with the known concept occurrences, yields a good estimate of this probability. We evaluate our method against a concept selection benchmark and with search runs on both the TRECVID 2005 and 2007 collections. These experiments show that the estimate consistently improves retrieval effectiveness.
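The core idea of the abstract can be sketched as follows: treat the concept-annotated training shots as a text collection, rank them with a text retrieval score for the query, and estimate the probability that a concept occurs in relevant shots from its frequency among the top-ranked shots. This is a minimal illustrative sketch, not the authors' actual system; the TF-IDF scoring, the top-k cutoff, and all data structures and names are assumptions introduced here.

```python
import math
from collections import Counter

def rank_shots(query_terms, shots):
    """Rank training shots by a simple TF-IDF score over their associated text.

    Illustrative assumption: each shot is a dict with a "text" field (a list of
    terms, e.g. from speech transcripts) and a "concepts" field (a set of
    annotated concept labels).
    """
    n = len(shots)
    df = Counter()
    for shot in shots:
        df.update(set(shot["text"]))
    scores = []
    for i, shot in enumerate(shots):
        tf = Counter(shot["text"])
        score = sum(tf[t] * math.log(n / df[t]) for t in query_terms if df[t])
        scores.append((score, i))
    # Keep only shots with a positive retrieval score, best first.
    return [i for score, i in sorted(scores, reverse=True) if score > 0]

def concept_occurrence_prob(concept, query_terms, shots, top_k=10):
    """Estimate P(concept occurs | shot is relevant) as the concept's
    annotation frequency among the top-k retrieved training shots."""
    top = rank_shots(query_terms, shots)[:top_k]
    if not top:
        return 0.0
    return sum(concept in shots[i]["concepts"] for i in top) / len(top)

# Toy annotated training collection (invented for illustration).
shots = [
    {"text": ["street", "basketball", "game"], "concepts": {"Outdoor", "Sports"}},
    {"text": ["indoor", "basketball", "court"], "concepts": {"Sports"}},
    {"text": ["street", "market"], "concepts": {"Outdoor"}},
    {"text": ["news", "studio", "anchor"], "concepts": {"Indoor"}},
]

p_outdoor = concept_occurrence_prob("Outdoor", ["street", "basketball"], shots, top_k=2)
```

A concept whose estimated occurrence probability in relevant shots is high (here, "Outdoor" for a "street basketball" query) would then be selected to help the video search.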
Original language: Undefined
Title of host publication: Proceedings of the 8th ACM International Conference on Image and Video Retrieval (CIVR '09)
Place of Publication: New York
Publisher: Association for Computing Machinery (ACM)
Number of pages: 8
ISBN (Print): 978-1-60558-480-5
Publication status: Published - Jun 2009

Keywords

  • EWI-15456
  • IR-68558
  • METIS-264405
  • CR-H.3.3

Cite this

Aly, R., Hiemstra, D., & de Vries, A. P. (2009). Reusing Annotation Labor for Concept Selection. In Proceedings of the 8th ACM International Conference on Image and Video Retrieval (CIVR '09) (pp. 44:1-44:8). [10.1145/1646396.1646448] New York: Association for Computing Machinery (ACM).