Primal and dual Bregman methods with application to optical nanoscopy

Christoph Brune*, Alex Sawatzky, Martin Burger

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer review


Abstract

Measurements in nanoscopic imaging suffer from blurring effects modeled by different point spread functions (PSFs). Some devices even have PSFs that depend locally on phase shifts. In addition, the raw data are corrupted by Poisson noise resulting from laser sampling and "photon counts" in fluorescence microscopy. In these applications, standard reconstruction methods (EM, filtered backprojection) deliver unsatisfactory, noisy results. Starting from a statistical model in terms of maximum a posteriori (MAP) likelihood estimation, we combine the iterative EM algorithm with total variation (TV) regularization to make efficient use of a priori information. Typically, TV-based methods deliver cartoon-like reconstructions that suffer from contrast reduction. We propose extensions of EM-TV, based on Bregman iterations and primal and dual inverse scale space methods, to obtain improved imaging results through simultaneous contrast enhancement. Besides further generalizing the primal and dual scale space methods to general convex variational regularization, we provide error estimates and convergence rates for exact and noisy data. We illustrate the performance of our techniques on synthetic and experimental biological data.
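For concreteness, the EM-TV model and Bregman refinement referred to in the abstract can be sketched as follows. This is the standard Poisson (Kullback-Leibler) MAP formulation; the symbols K (forward/blurring operator), f (measured photon counts), Ω (image domain), α (regularization weight) and p^k (TV subgradient) are introduced here for illustration and are not spelled out in the abstract itself.

    % MAP estimate under Poisson noise with a TV prior (EM-TV):
    \min_{u \ge 0} \; \int_\Omega \big( (Ku)(x) - f(x)\,\log (Ku)(x) \big)\, dx \;+\; \alpha\, |u|_{\mathrm{TV}}

    % Bregman iteration for contrast refinement: replace the TV term by its
    % Bregman distance to the previous iterate u^k, with p^k \in \partial |u^k|_{\mathrm{TV}}
    % (constant terms dropped inside the argmin):
    u^{k+1} \in \arg\min_{u \ge 0} \; \int_\Omega \big( Ku - f\,\log Ku \big)\, dx
        \;+\; \alpha \big( |u|_{\mathrm{TV}} - \langle p^k, u \rangle \big),
    \qquad
    p^{k+1} = p^k + \tfrac{1}{\alpha}\, K^{*}\!\big( f/(K u^{k+1}) - 1 \big)

Iterating this scheme adds back fine-scale contrast lost in a single TV-regularized reconstruction, which is the mechanism behind the inverse scale space interpretation mentioned above.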

Original language: English
Pages (from-to): 211-229
Number of pages: 19
Journal: International Journal of Computer Vision
Volume: 92
Issue number: 2
DOIs
Publication status: Published - 1 Nov 2011
Externally published: Yes

Keywords

  • Bregman distance
  • Duality
  • Error estimation
  • Image processing
  • Imaging
  • Inverse scale space
  • Poisson noise
