# Visiting Lecturer Program (272)

### Published at: 2017-04-30

Speaker: **Dr. Mohammad R. Salavatipour**

Associate Professor

Department of Computing Science

University of Alberta, Edmonton, Canada

Title: **Ultimate Approximation Algorithm for k-means clustering**

Local Organizer: Dr. Mehdi Rasti

Time: Monday, May 1, 2017, 12:30–13:30

Location: Department of Computer Engineering and Information Technology, Amirkabir University of Technology, Tehran, Iran

Abstract:

The most well-known and ubiquitous clustering problem, encountered in nearly every branch of science, is undoubtedly k-means: given a set of data points and a parameter k, select k centres and partition the data points into k clusters around these centres so that the sum of squared distances from the points to their cluster centres is minimized. Typically these data points lie in Euclidean space. k-means and the first algorithms for it were introduced in the 1950s. Over the last six decades, hundreds of papers have studied this problem and many different algorithms have been proposed for it. The algorithm most commonly used in practice is known as Lloyd-Forgy, also referred to as "the" k-means algorithm, and various extensions of it often work very well in practice. However, these heuristics may produce solutions whose cost is arbitrarily large compared to the optimum. Kanungo et al. [2004] analyzed a very simple local search heuristic and obtained a polynomial-time algorithm with approximation ratio 9. Finding an algorithm with a better worst-case approximation guarantee has remained one of the biggest open questions in this area; in particular, it was open whether one can get a true PTAS for Euclidean space of fixed dimension. We settle this problem by showing that a simple local search algorithm provides a polynomial-time approximation scheme (PTAS) for k-means in any fixed-dimension Euclidean metric. Our analysis extends easily to the more general setting where we minimize the sum of q-th powers of the distances between data points and their centres.
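To give a feel for the kind of algorithm the abstract refers to, the following is a minimal Python sketch of the single-swap local search heuristic for k-means (in the style analyzed by Kanungo et al.): start from an arbitrary set of k centres chosen among the data points, and repeatedly replace one centre with a non-centre point whenever the swap strictly lowers the sum-of-squared-distances cost. The function names and the toy data set are illustrative, not from the talk; the actual analysis in the talk concerns multi-swap variants and their approximation guarantees.

```python
import itertools
import random

def kmeans_cost(points, centers):
    # Sum of squared Euclidean distances from each point to its nearest centre.
    return sum(
        min(sum((p - c) ** 2 for p, c in zip(pt, ct)) for ct in centers)
        for pt in points
    )

def local_search_kmeans(points, k, seed=0):
    """Single-swap local search for discrete k-means (centres restricted
    to data points). Keeps swapping one centre for one non-centre point
    while the swap strictly decreases the cost."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    improved = True
    while improved:
        improved = False
        cost = kmeans_cost(points, centers)
        for i, cand in itertools.product(range(k), points):
            if cand in centers:
                continue
            trial = centers[:i] + [cand] + centers[i + 1:]
            trial_cost = kmeans_cost(points, trial)
            if trial_cost < cost:
                centers, cost = trial, trial_cost
                improved = True
    return centers

# Two well-separated clusters: local search should end with one centre in each,
# since keeping both centres in one cluster always admits an improving swap.
pts = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1),
       (10.0, 10.0), (10.1, 10.0), (10.0, 10.1)]
centers = local_search_kmeans(pts, k=2)
```

Each iteration tries O(nk) candidate swaps and each swap strictly lowers the cost, so the procedure terminates; the talk's result is that (a multi-swap generalization of) exactly this kind of local optimum is within a 1+ε factor of optimal in fixed-dimension Euclidean space.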