Classifier Hypothesis Generation Using Visual Analysis Methods

Christin Seifert, Vedran Sabol, Michael Granitzer

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

7 Citations (Scopus)

Abstract

Classifiers can be used to automatically dispatch the abundance of newly created documents to recipients interested in particular topics. Identification of adequate training examples is essential for classification performance, but it may prove to be a challenging task in large document repositories. We propose a classifier hypothesis generation method relying on automated analysis and information visualisation. In our approach visualisations are used to explore the document sets and to inspect the results of machine learning methods, allowing the user to assess the classifier performance and adapt the classifier by gradually refining the training set.
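
The abstract describes an interactive loop in which the user inspects classifier results and gradually refines the training set. The sketch below illustrates that general refinement idea only; it is not the authors' implementation. It assumes scikit-learn, and the function refine_classifier, the label_fn callback, and all parameters are hypothetical placeholders standing in for the visual inspection step described in the paper.

```python
# Minimal sketch of iterative training-set refinement for a text classifier.
# The visual inspection described in the paper is replaced here by a label_fn
# callback that supplies (or corrects) labels for uncertain documents.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression


def refine_classifier(seed_docs, seed_labels, unlabelled_docs, label_fn,
                      rounds=3, k=5):
    """Grow the training set with the k least-confident documents per round."""
    train_docs, train_labels = list(seed_docs), list(seed_labels)
    pool = list(unlabelled_docs)
    vectorizer = TfidfVectorizer(stop_words="english")

    for _ in range(rounds):
        if not pool:
            break
        X_train = vectorizer.fit_transform(train_docs)
        clf = LogisticRegression(max_iter=1000).fit(X_train, train_labels)

        # Rank pool documents by uncertainty: low maximum class probability.
        proba = clf.predict_proba(vectorizer.transform(pool))
        uncertain = np.argsort(proba.max(axis=1))[:k]

        # Move the most uncertain documents into the training set with
        # user-provided labels (here simulated by label_fn).
        for i in sorted(uncertain, reverse=True):
            doc = pool.pop(i)
            train_docs.append(doc)
            train_labels.append(label_fn(doc))

    # Final model trained on the refined training set.
    X_train = vectorizer.fit_transform(train_docs)
    clf = LogisticRegression(max_iter=1000).fit(X_train, train_labels)
    return clf, vectorizer
```

In the actual approach the selection and labelling step is driven by visual exploration of the document sets rather than an automatic uncertainty ranking; the loop structure above merely shows how a training set can be refined incrementally between retraining rounds.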
Original language: English
Title of host publication: Networked Digital Technologies
Subtitle of host publication: Second International Conference, NDT 2010, Prague, Czech Republic, July 7-9, 2010. Proceedings
Editors: Filip Zavoral, Jakub Yaghob, Pit Pichappan, Eyas El-Qawasmeh
Publisher: Springer
Pages: 98-111
Number of pages: 14
ISBN (Electronic): 978-3-642-14292-5
ISBN (Print): 978-3-642-14291-8
DOIs
Publication status: Published - 2010
Externally published: Yes
Event: 2nd International Conference on Networked Digital Technologies, NDT 2010 - Prague, Czech Republic
Duration: 7 Jul 2010 - 9 Jul 2010
Conference number: 2

Publication series

Name: Communications in Computer and Information Science
Publisher: Springer
Volume: 87
ISSN (Print): 1865-0929
ISSN (Electronic): 1865-0937

Conference

Conference: 2nd International Conference on Networked Digital Technologies, NDT 2010
Abbreviated title: NDT
Country/Territory: Czech Republic
City: Prague
Period: 7/07/10 - 9/07/10

Keywords

  • Text Categorisation
  • Visual Analysis
