This research presents a processing workflow to automatically identify damaged building areas in an urban context. The input data are high-resolution multi-view images acquired from an airborne platform. Elevations are derived from a dense surface model generated with photogrammetric methods. With the principal objective of rapid response in emergency situations, two processing roadmaps are proposed: one semi-supervised and one unsupervised. Both follow a two-step workflow of building detection followed by building health estimation. Optionally, cadastral layers may serve as a priori knowledge of building locations. The semi-supervised approach involves a data training step, while the unsupervised approach exploits similarities and dissimilarities between sets of features computed over the detected buildings. The change detection task is formulated as a classification problem defined over a conditional random field. The algorithms are evaluated on two datasets (Vexcel and Midas cameras), and the results are compared against ground-truth data using dedicated metrics.
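To illustrate the kind of formulation referred to above, the following is a minimal sketch of classification over a pairwise conditional random field, using a Potts smoothness prior minimized by iterated conditional modes (ICM). The grid layout, the two-label (intact/damaged) setup, the `beta` weight, and the ICM solver are illustrative assumptions, not the authors' actual model or features.

```python
import numpy as np

def icm_crf(unary, beta=1.0, iters=10):
    """Approximately minimize a pairwise CRF energy on a 4-connected grid.

    unary : (H, W, K) array of per-site, per-label costs (illustrative;
            in practice these would come from building-health features).
    beta  : weight of the Potts pairwise term penalizing label disagreement
            between neighboring sites.
    """
    H, W, K = unary.shape
    # Initialize with the unary-only (independent) decision.
    labels = unary.argmin(axis=2)
    for _ in range(iters):
        changed = False
        for i in range(H):
            for j in range(W):
                costs = unary[i, j].copy()
                # Add Potts penalties from the four grid neighbors.
                for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < H and 0 <= nj < W:
                        costs += beta * (np.arange(K) != labels[ni, nj])
                best = costs.argmin()
                if best != labels[i, j]:
                    labels[i, j] = best
                    changed = True
        if not changed:  # converged: no site changed its label
            break
    return labels
```

ICM is a simple coordinate-descent solver chosen here for brevity; CRF-based change detection systems commonly use stronger inference such as graph cuts or message passing.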