CS416 Optimisation Methods and their Applications
CS416 15 CATS (7.5 ECTS) Term 2
Availability
Option  MEng CS and DM
Academic Aims
The aim is for students to learn the mathematical and algorithmic foundations underpinning the optimisation problems that are ubiquitous in applications in machine learning, data analysis and scientific computing.
At their core, many real-world problems are complex and challenging optimisation problems, and they require principled mathematical and algorithmic solutions. Companies such as Google and Facebook use powerful algorithmic primitives such as gradient descent to solve a host of challenging problems. These primitives are in turn based on a beautiful mathematical theory developed in the context of convex and discrete optimisation.
There are several interrelated aims of this module: (1) to expose students to optimisation methods that have found significant applications, (2) to develop a toolkit of mathematical methods that can be used to tackle real-world problems, and (3) to rigorously analyse these methods.
Learning Outcomes
 Learn mathematical tools from convex optimisation, discrete and combinatorial optimisation, and linear algebra.
 Learn optimisation methods that are widely used in applications.
 Understand the mathematical theory behind these optimisation methods.
 Implement optimisation methods and apply them to real-world problems.
Content
Some of the following topics will be discussed. The selection of topics may vary from year to year.
 Convex optimisation methods and their applications. Gradient descent, the multiplicative weights update method, the Frank-Wolfe gradient descent algorithm.
 Discrete and combinatorial optimisation methods and their applications. Submodular functions, submodular minimisation via the ellipsoid method and gradient descent. Submodular maximisation via the Greedy algorithm.
 Algorithms for linear algebra and their applications. The power method, singular value decompositions, principal component analysis, spectral clustering, least squares and linear regression.
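As a flavour of the first topic above, the following is a minimal sketch of gradient descent on a simple convex function. The function f(x) = x², the step size, and the iteration count are illustrative choices only and are not part of the module materials:

```python
# Minimal gradient descent sketch: repeatedly move against the gradient,
# x <- x - step * grad(x), on the convex function f(x) = x^2 (f'(x) = 2x).
# Step size and iteration count here are illustrative, not tuned.

def gradient_descent(grad, x0, step=0.1, iters=100):
    """Run fixed-step gradient descent from x0 and return the final iterate."""
    x = x0
    for _ in range(iters):
        x = x - step * grad(x)
    return x

# For f(x) = x^2, the iterates contract towards the minimiser x* = 0.
x_min = gradient_descent(lambda x: 2 * x, x0=5.0)
```

For this quadratic, each step multiplies the iterate by (1 - 2·step), so with step = 0.1 the error shrinks geometrically; choosing a step size that guarantees such convergence in general is part of the theory the module develops.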
Books
 Stephen Boyd and Lieven Vandenberghe. Convex Optimization. Cambridge University Press.
 John Hopcroft and Ravi Kannan. Foundations of Data Science. (Upcoming book, a manuscript is freely available).
 David Woodruff. Sketching as a Tool for Numerical Linear Algebra. Foundations and Trends in Theoretical Computer Science, vol. 10, issue 1-2, pp. 1-157, 2014.
Assessment
Two-hour examination (70%), coursework (30%).
Teaching
Two one-hour lectures plus one one-hour seminar and one one-hour laboratory session per week.