Imalytics preclinical: Interactive analysis of biomedical volume data

Felix Gremse, Marius Stärk, Josef Ehling, Jan Robert Menzel, Twan Lammers, Fabian Kiessling

Research output: Contribution to journal › Article › Academic › peer-review



A software tool is presented for interactive segmentation of volumetric medical data sets. To allow interactive processing of large data sets, segmentation operations and rendering are GPU-accelerated. Special adjustments are provided to overcome GPU-imposed constraints such as limited memory and host-device bandwidth. A general and efficient undo/redo mechanism is implemented using GPU-accelerated compression of the multiclass segmentation state. A broadly applicable set of interactive segmentation operations is provided, which can be combined to solve the quantification tasks of many types of imaging studies. A fully GPU-accelerated ray casting method for multiclass segmentation rendering is implemented, which is well-balanced with respect to delay, frame rate, worst-case memory consumption, scalability, and image quality. The performance of segmentation operations and rendering is measured using high-resolution example data sets, showing that GPU acceleration greatly improves performance. Compared to a reference marching cubes implementation, the rendering was found to be superior with respect to rendering delay and worst-case memory consumption, while providing sufficiently high frame rates for interactive visualization and comparable image quality. The fast interactive segmentation operations and the accurate rendering make our tool particularly suitable for efficient analysis of the multimodal image data sets which arise in large amounts in preclinical imaging studies.
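The abstract describes an undo/redo mechanism based on compressing the multiclass segmentation state. As a rough illustration of the idea, the following is a minimal CPU-side sketch (the paper's implementation is GPU-accelerated) that snapshots a label volume with run-length encoding before each edit; the compression scheme, class names, and API here are assumptions for illustration, not the authors' actual code.

```python
import numpy as np


def rle_encode(labels):
    """Run-length encode a label volume (values, run lengths, shape)."""
    flat = labels.ravel()
    # Positions where the label value changes mark run boundaries.
    change = np.flatnonzero(np.diff(flat)) + 1
    starts = np.concatenate(([0], change))
    lengths = np.diff(np.concatenate((starts, [flat.size])))
    return flat[starts].copy(), lengths, labels.shape


def rle_decode(values, lengths, shape):
    """Expand run-length encoded data back into a label volume."""
    return np.repeat(values, lengths).reshape(shape)


class UndoStack:
    """Undo/redo for segmentation edits via compressed snapshots."""

    def __init__(self):
        self._undo, self._redo = [], []

    def snapshot(self, labels):
        """Call before modifying the segmentation."""
        self._undo.append(rle_encode(labels))
        self._redo.clear()

    def undo(self, current):
        if not self._undo:
            return current
        self._redo.append(rle_encode(current))
        return rle_decode(*self._undo.pop())

    def redo(self, current):
        if not self._redo:
            return current
        self._undo.append(rle_encode(current))
        return rle_decode(*self._redo.pop())
```

Segmentation masks are typically piecewise-constant, so run-length encoding keeps each snapshot small relative to the raw volume, which is the property that makes full-state undo feasible for large data sets.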
Original language: English
Pages (from-to): 328-341
Issue number: 3
Publication status: Published - 2016




