A popular approach in data analysis is to represent a dataset in a high-dimensional feature space and reduce a given task to a geometric computational problem. However, most classic geometric algorithms scale poorly as the dimension grows and are typically not applicable in the high-dimensional regime. This necessitates the development of new algorithmic approaches that overcome this
curse of dimensionality. In this mini-course I will give an overview of recent developments in this area, including new algorithms for dimension reduction, sketching, and nearest neighbor search. We will discuss both theoretical results and implementation challenges.
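To give a flavor of the dimension-reduction topic: the canonical tool in this area is random projection in the spirit of the Johnson–Lindenstrauss lemma, which maps n points from d dimensions down to roughly O(log n / eps²) dimensions while preserving all pairwise distances up to a factor of 1 ± eps. The sketch below (a hypothetical illustration, not course material; assumes NumPy) shows the basic construction with a Gaussian projection matrix.

```python
import numpy as np

def jl_project(X, k, seed=0):
    """Project the rows of X (shape n x d) to k dimensions using a random
    Gaussian matrix scaled by 1/sqrt(k). By the Johnson-Lindenstrauss lemma,
    pairwise Euclidean distances are preserved up to a (1 +/- eps) factor with
    high probability once k = O(log n / eps^2)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    G = rng.standard_normal((d, k)) / np.sqrt(k)  # entries N(0, 1/k)
    return X @ G

# Example: two random points in 1000 dimensions, projected to 200.
X = np.random.default_rng(1).standard_normal((2, 1000))
Y = jl_project(X, 200)
orig_dist = np.linalg.norm(X[0] - X[1])
proj_dist = np.linalg.norm(Y[0] - Y[1])
# proj_dist / orig_dist is close to 1 with high probability
```

Note that the projection matrix is data-oblivious: it is drawn once, independently of the points, which is what makes this technique compatible with streaming and sketching settings.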
The course was given as part of the International Computer Science Student School "Recent Advances in Algorithms", May 22–26, 2017.
| Name | Class | Venue | Materials |
|------|-------|-------|-----------|
| Lecture 1: Introduction and Measure Concentration | Lecture | PDMI RAS | slides, video |
| Lecture 2: Dimension Reduction | Lecture | PDMI RAS | slides, video |
| Lecture 3: Theory of Nearest Neighbor Search | Lecture | PDMI RAS | slides, video |
| Lecture 4: Practice of Nearest Neighbor Search | Lecture | PDMI RAS | slides, video |