A popular approach in data analysis is to represent a dataset in a high-dimensional feature space and reduce a given task to a geometric computational problem. However, most classic geometric algorithms scale poorly as the dimension grows and are typically not applicable in the high-dimensional regime. This necessitates new algorithmic approaches that overcome this
curse of dimensionality. In this mini-course I will give an overview of recent developments in this area, including new algorithms for dimension reduction, sketching, and nearest neighbor search. We will discuss both theoretical results and implementation challenges.
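To illustrate the dimension reduction theme, here is a minimal sketch of a Johnson-Lindenstrauss-style random projection in pure Python (the function names and parameters are illustrative, not from the course materials): projecting points through a random Gaussian matrix scaled by 1/sqrt(k) approximately preserves pairwise distances.

```python
import math
import random

def random_projection(points, k, seed=0):
    """Project d-dimensional points down to k dimensions using a
    random Gaussian matrix scaled by 1/sqrt(k) (JL-style sketch)."""
    rng = random.Random(seed)
    d = len(points[0])
    # Each of the k rows has i.i.d. N(0, 1) entries.
    proj = [[rng.gauss(0.0, 1.0) for _ in range(d)] for _ in range(k)]
    scale = 1.0 / math.sqrt(k)
    return [
        [scale * sum(row[i] * p[i] for i in range(d)) for row in proj]
        for p in points
    ]

def dist(a, b):
    """Euclidean distance between two points."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Two random points in 1000 dimensions; after projecting to 200
# dimensions, their distance is roughly preserved (distortion shrinks
# as k grows, at the cost of a larger sketch).
random.seed(1)
p = [random.gauss(0, 1) for _ in range(1000)]
q = [random.gauss(0, 1) for _ in range(1000)]
lo = random_projection([p, q], 200)
print(dist(p, q), dist(lo[0], lo[1]))
```

The key point, covered in Lecture 2, is that the target dimension k needed to preserve all pairwise distances among n points up to a factor (1 ± ε) depends only on n and ε, not on the original dimension d.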
The course was given as part of the International Computer Science Student School "Recent Advances in Algorithms", May 22–26, 2017.
|Date and time||Title||Venue||Materials|
|Lecture 1: Introduction and Measure Concentration, lecture||ПОМИ РАН||slides, video|
|Lecture 2: Dimension Reduction, lecture||ПОМИ РАН||slides, video|
|Lecture 3: Theory of Nearest Neighbor Search, lecture||ПОМИ РАН||slides, video|
|Lecture 4: Practice of Nearest Neighbor Search, lecture||ПОМИ РАН||slides, video|