Taming Data Explosion in Probabilistic Information Integration

Ander de Keijzer, Maurice van Keulen, Yiping Li

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-reviewed


Abstract

Data integration has been a challenging problem for decades. In autonomous data integration, i.e., without a user to resolve semantic uncertainty and conflicts between data sources, it becomes a serious bottleneck. A probabilistic approach seems promising, as it requires neither extensive semantic annotations nor user interaction at integration time. It simply teaches the application how to generically cope with uncertainty. Unfortunately, without any world knowledge, uncertainty abounds: almost everything becomes (theoretically) possible, and maintaining all possibilities produces huge volumes of data. In this paper, we claim that simple and generic knowledge rules are sufficient to drastically reduce uncertainty, hence taming the data explosion to a manageable size.
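The idea of pruning possible worlds with generic knowledge rules can be illustrated with a minimal sketch. This is not the paper's actual algorithm or data model; the sources, rule, and probabilities below are hypothetical, chosen only to show how a simple rule eliminates integration alternatives and how the remaining probabilities are renormalized.

```python
# Illustrative sketch (not the paper's implementation): two sources mention a
# person, and it is uncertain whether the records refer to the same real-world
# person. Each integration choice yields a possible world with a probability.

# World 1: the records are merged into one person (who would then have two cities).
# World 2: the records describe two different persons.
worlds = [
    {"prob": 0.5, "persons": [{"name": "John Smith", "cities": {"Munich", "Berlin"}}]},
    {"prob": 0.5, "persons": [{"name": "J. Smith", "cities": {"Munich"}},
                              {"name": "John Smith", "cities": {"Berlin"}}]},
]

def satisfies_rules(world):
    """A simple, generic knowledge rule: a person lives in exactly one city."""
    return all(len(p["cities"]) == 1 for p in world["persons"])

# Prune the worlds that violate the rule, then renormalize what survives.
surviving = [w for w in worlds if satisfies_rules(w)]
total = sum(w["prob"] for w in surviving)
for w in surviving:
    w["prob"] /= total

print(len(surviving), surviving[0]["prob"])  # world count shrinks from 2 to 1
```

Even this toy rule halves the number of worlds; with many uncertain matches, the number of possible worlds grows multiplicatively, so each generic rule cuts the maintained data volume by a corresponding factor.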
Original language: English
Title of host publication: Pre-Proceedings of the International Workshop on Inconsistency and Incompleteness in Databases (IIDB 2006)
Publisher: University of Mons-Hainaut, Belgium
Pages: 82-86
Number of pages: 5
Publication status: Published - 26 Mar 2006
Event: Pre-International Workshop on Inconsistency and Incompleteness in Databases (IIDB 2006) - Munich, Germany
Duration: 26 Mar 2006 - 26 Mar 2006


Keywords

  • DB-SDI: SCHEMA AND DATA INTEGRATION
