Beyond Worst-Case Analysis of Algorithms: Smoothed Analysis Chapter

Institute
Corelab, ECE NTUA
Speaker
Mανώλης Βλατάκης
Date
13-01-2020, 17:00
Venue
Room 1.1.31, School of Electrical and Computer Engineering, NTUA (old building)
Abstract

My experiences also strongly confirmed my previous opinion that
the best theory is inspired by practice and the best practice is inspired by theory.

Donald E. Knuth: "Theory and Practice", 1991

Comparing different algorithms is hard. Even in the archetypical task of sorting, different measures of algorithm performance lead to different winners. For example, insertion sort is faster than merge sort on arrays that are almost sorted, but slower on many other inputs.

Merge sort, with its worst-case asymptotic running time of n log n for arrays of length n, is better in this sense than insertion sort or quicksort, which have a worst-case running time of n^2.
Traditionally, the complexity of an algorithm is measured by its worst-case performance. If even a single input instance triggers an exponential running time, the algorithm is called an exponential-time algorithm.
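As a concrete illustration of this gap (a minimal Python sketch of our own, not part of the talk), the function below counts the key comparisons made by insertion sort: an already-sorted array of length n costs only n - 1 comparisons, while a reversed array costs n(n - 1)/2, matching the n^2 worst case above.

    def insertion_sort_comparisons(a):
        # Sort a copy of `a` and return how many key comparisons were made.
        a = list(a)
        comparisons = 0
        for i in range(1, len(a)):
            key = a[i]
            j = i - 1
            # Shift larger elements right until the position for `key` is found.
            while j >= 0:
                comparisons += 1
                if a[j] > key:
                    a[j + 1] = a[j]
                    j -= 1
                else:
                    break
            a[j + 1] = key
        return comparisons

    n = 1000
    print(insertion_sort_comparisons(range(n)))         # already sorted: n - 1
    print(insertion_sort_comparisons(range(n, 0, -1)))  # reversed: n * (n - 1) / 2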

Is worst-case analysis always fair?

Many algorithms and heuristics work well on real data despite having poor complexity under the standard worst-case measure. In this tutorial we will give a high-level introduction to the solution concept of smoothed analysis of the complexity of an algorithm.*

We will survey, at a high level, the lion's share of the classical and modern applications of smoothed analysis, for problems ranging from mathematical programming and numerical analysis to machine learning and data mining.
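As a rough sketch of the underlying definition (our own illustration with a hypothetical cost function, not material from the talk): smoothed analysis replaces the maximum cost over all inputs with the maximum, over inputs, of the expected cost after each input is slightly perturbed at random. The Python fragment below estimates that inner expectation empirically.

    import random
    import statistics

    def smoothed_cost(cost, base_input, sigma, trials=200):
        # Estimate E[cost(x + sigma * g)] for one base input x, where g is
        # i.i.d. standard Gaussian noise added to every coordinate.
        samples = []
        for _ in range(trials):
            perturbed = [x + random.gauss(0.0, sigma) for x in base_input]
            samples.append(cost(perturbed))
        return statistics.mean(samples)

    # Smoothed complexity then takes the worst case of this expectation over all
    # base inputs of a given size n:  smoothed(n, sigma) = max over x of E[cost(x + sigma*g)].
    # Here `cost` is a stand-in for any running-time measure, e.g. the comparison
    # counter defined earlier.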

* Spielman and Teng were awarded the Gödel Prize in 2008 for their seminal work "Smoothed analysis of algorithms" (first presented in 2001).