


Some thoughts on nonparametric classification: nearest neighbours, bagging and max likelihood estimation of shape-constrained densities

Samworth, R (Cambridge)
Tuesday 08 January 2008, 10:00-11:00

Seminar Room 1, Newton Institute


The $k$-nearest neighbour rule is arguably the simplest and most intuitively appealing nonparametric classifier. We will discuss recent results on the optimal choice of $k$ in situations where the underlying populations have densities with a certain smoothness in $\mathbb{R}^d$. Extensions to the bagged nearest neighbour classifier, which can be regarded as a weighted $k$-nearest neighbour classifier, are also possible, and yield a somewhat surprising comparison with the unweighted case.
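To fix ideas, here is a minimal sketch of the two rules mentioned above: an unweighted $k$-nearest neighbour vote, and a weighted variant in which nearer neighbours receive larger weights (the specific weights below are an arbitrary illustration, not the bagged nearest neighbour weights studied in the talk).

```python
import math
from collections import Counter

def knn_classify(train, labels, x, k):
    """Unweighted k-NN: majority vote among the k training points nearest x.
    train: list of feature tuples in R^d; labels: parallel list of classes."""
    order = sorted(range(len(train)), key=lambda i: math.dist(train[i], x))
    votes = Counter(labels[i] for i in order[:k])
    return votes.most_common(1)[0][0]

def weighted_nn_classify(train, labels, x, weights):
    """Weighted nearest neighbour rule: the i-th nearest point contributes
    weights[i] to its class; the class with the largest total wins.
    (Illustrative decreasing weights; not the bagged-NN weight scheme.)"""
    order = sorted(range(len(train)), key=lambda i: math.dist(train[i], x))
    scores = Counter()
    for rank, i in enumerate(order[:len(weights)]):
        scores[labels[i]] += weights[rank]
    return scores.most_common(1)[0][0]

# toy data: two clusters in R^2
train = [(0.0, 0.0), (0.1, 0.2), (1.0, 1.0), (0.9, 1.1)]
labels = ["a", "a", "b", "b"]
print(knn_classify(train, labels, (0.05, 0.1), k=3))           # → a
print(weighted_nn_classify(train, labels, (0.05, 0.1), [3, 2, 1]))  # → a
```

Setting `weights = [1] * k` recovers the unweighted rule, which is the sense in which a weighted nearest neighbour classifier generalises the $k$-NN classifier.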

Another possibility for nonparametric classification is based on estimating the underlying densities explicitly. An attractive alternative to kernel methods is based on the maximum likelihood estimator, which can be shown to exist if the densities satisfy certain shape constraints, such as log-concavity. We will also discuss an algorithm for computing the estimator in this case, which results in a classifier that is fully automatic yet still nonparametric.
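The density-based approach above leads to a plug-in rule: estimate each class density, then assign a point to the class maximising prior times estimated density. The sketch below shows that rule; the stand-in densities are ordinary normals for illustration only, whereas the talk's method would plug in shape-constrained (e.g. log-concave) maximum likelihood estimates.

```python
import math

def plugin_classifier(densities, priors):
    """Plug-in classification rule: assign x to the class j maximising
    priors[j] * densities[j](x). densities maps class -> estimated density."""
    def classify(x):
        return max(priors, key=lambda j: priors[j] * densities[j](x))
    return classify

def normal_pdf(mu, sigma):
    """Stand-in density estimate (assumption for illustration; the talk
    would use a log-concave MLE fitted to the training sample instead)."""
    return lambda x: (math.exp(-((x - mu) ** 2) / (2 * sigma ** 2))
                      / (sigma * math.sqrt(2 * math.pi)))

clf = plugin_classifier({0: normal_pdf(0.0, 1.0), 1: normal_pdf(3.0, 1.0)},
                        {0: 0.5, 1: 0.5})
print(clf(0.4))  # → 0, nearer the class-0 density's mode
print(clf(2.5))  # → 1
```

Because the log-concave MLE has no smoothing parameter to choose, the resulting classifier is fully automatic in the sense described above.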





