Core ML discussion topics
Introduction to Approximate Inference
To be discussed on 05/03/18. Discussion leader: Peter Byfield. Although extremely useful and conceptually simple, a fully probabilistic approach to machine learning poses a number of computational challenges, one of which is the marginalisation over all the variables in the model except for the variables of interest. In this session Peter will discuss Variational Inference methods (and may touch on asymptotically exact Monte Carlo inference methods) that help us overcome these challenges.
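For concreteness, here is a standard sketch of why the marginalisation is hard and what the variational workaround looks like (the notation x for observations and z for latent variables is illustrative, not from the session description): the marginal likelihood requires an integral over all latent variables, which is typically intractable, so variational inference instead maximises a lower bound over a tractable family of distributions q.

```latex
% Marginalisation over the latent variables z (typically intractable):
p(x) = \int p(x, z)\, dz

% Variational inference sidesteps the integral by maximising the
% evidence lower bound (ELBO) over a tractable family q(z):
\log p(x) \;\ge\; \mathbb{E}_{q(z)}\!\left[\log p(x, z) - \log q(z)\right] = \mathrm{ELBO}(q)
```

Maximising the ELBO with respect to q simultaneously tightens the bound on log p(x) and drives q(z) towards the true posterior p(z | x), which is what turns inference into an optimisation problem.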
Classifier Technology and the Illusion of Progress
To be discussed on 19/03/18. Discussion leader: Jevgenij Gamper. In this discussion, I will lead the review of a paper by David J. Hand (Imperial College). Abstract: A great many tools have been developed for supervised classification, ranging from early methods such as linear discriminant analysis through to modern developments such as neural networks and support vector machines. A large number of comparative studies have been conducted in attempts to establish the relative superiority of these methods. This paper argues that these comparisons often fail to take into account important aspects of real problems, so that the apparent superiority of more sophisticated methods may be something of an illusion. In particular, simple methods typically yield performance almost as good as more sophisticated methods, to the extent that the difference in performance may be swamped by other sources of uncertainty that generally are not considered in the classical supervised classification paradigm.
Dropout as Bayesian Approximation
To be discussed on 19/02/18. Discussion leader: Jevgenij Gamper. In this discussion we will discover how neural networks relate to Gaussian Processes. In particular, by drawing the relationship between Gaussian Processes and stochastic regularisation methods for neural networks such as dropout, we discover that we can use the latter to obtain uncertainty estimates for a network's predictions. Using dropout for uncertainty estimation is extremely useful, as we do not have to resort to expensive probabilistic inference methods such as Bayesian Neural Networks, and can instead use existing network architectures and traditional optimisation methods.
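As a rough illustration of the idea, here is a minimal PyTorch sketch of Monte Carlo dropout at test time: keep the dropout layers stochastic, run several forward passes, and summarise the spread of the outputs as a predictive mean and uncertainty. The toy network, layer sizes, and number of samples are illustrative assumptions, not taken from the paper.

```python
import torch
import torch.nn as nn

# A toy regression network with dropout; sizes are illustrative.
model = nn.Sequential(
    nn.Linear(10, 64),
    nn.ReLU(),
    nn.Dropout(p=0.5),
    nn.Linear(64, 1),
)

def mc_dropout_predict(model, x, n_samples=100):
    """Monte Carlo dropout: average several stochastic forward passes
    and use their standard deviation as an uncertainty estimate."""
    model.train()  # keep dropout active at prediction time
    with torch.no_grad():
        preds = torch.stack([model(x) for _ in range(n_samples)])
    return preds.mean(dim=0), preds.std(dim=0)

x = torch.randn(5, 10)               # a batch of 5 hypothetical inputs
mean, std = mc_dropout_predict(model, x)
print(mean.squeeze(), std.squeeze())  # predictions with uncertainty
```

The only change from ordinary inference is that the network stays in training mode, so each pass samples a different dropped-out subnetwork; no new training procedure or architecture is required.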