Abstract
Many algorithms perform very well in practice but have poor worst-case performance. The reason for this discrepancy is that worst-case analysis is often a far too pessimistic measure of an algorithm's performance. Smoothed analysis has been introduced to provide a more realistic performance measure that can explain the practical performance of algorithms. It is a hybrid of classical worst-case analysis and average-case analysis, in which performance is measured on inputs that are subject to random noise. We give a gentle, not too formal introduction to smoothed analysis by means of two examples: the k-means method for clustering and the Nemhauser/Ullmann algorithm for the knapsack problem.
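To make the idea concrete, the following minimal Python sketch (not taken from the article) shows how a smoothed performance measure can be estimated empirically for the k-means method: a fixed "adversarial" point set is perturbed with Gaussian noise of standard deviation sigma, and the number of Lloyd iterations is averaged over many perturbations. The instance, sigma values, and k used here are illustrative assumptions only.

```python
import numpy as np

def lloyd_iterations(points, k, rng, max_iter=10_000):
    """Run Lloyd's k-means heuristic and return the number of iterations until convergence."""
    centers = points[rng.choice(len(points), k, replace=False)]
    for it in range(1, max_iter + 1):
        # Assign each point to its nearest center.
        dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute centers; keep the old center if a cluster became empty.
        new_centers = np.array([
            points[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
            for j in range(k)
        ])
        if np.allclose(new_centers, centers):
            return it
        centers = new_centers
    return max_iter

def smoothed_iterations(adversarial_points, k, sigma, trials=50, seed=0):
    """Average the iteration count over Gaussian perturbations of one fixed instance."""
    rng = np.random.default_rng(seed)
    counts = []
    for _ in range(trials):
        perturbed = adversarial_points + rng.normal(0.0, sigma, adversarial_points.shape)
        counts.append(lloyd_iterations(perturbed, k, rng))
    return float(np.mean(counts))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    instance = rng.uniform(0, 1, size=(200, 2))  # stand-in for an adversarial instance
    for sigma in (0.0, 0.01, 0.1):
        print(f"sigma={sigma}: avg iterations = {smoothed_iterations(instance, k=5, sigma=sigma):.1f}")
```

In smoothed analysis the adversary first fixes the instance, then the noise is applied; the smoothed complexity is the worst case of this expected performance over all instances, which the sketch approximates for a single instance.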
| Original language | Undefined |
| --- | --- |
| Pages (from-to) | 280-286 |
| Number of pages | 7 |
| Journal | IT - Information Technology |
| Volume | 53 |
| Issue number | 6 |
| DOIs | |
| Publication status | Published - Dec 2011 |
Keywords
- Probabilistic Analysis
- EWI-20672
- IR-78736
- Smoothed Analysis
- METIS-281540
- Algorithms