ST329 Topics in Statistics
Lecturer(s): Professor Bärbel Finkenstädt Rand, Professor Xavier Didelot, Dr Richard Everitt
Prerequisite(s): Either ST218/219 Mathematical Statistics A&B or ST220 Introduction to Mathematical Statistics
Content: Three self-contained sets of ten lectures. For a description of the topics, see below. Please note that the topics covered in this module may change from year to year.
Commitment: 3 x 10 lectures in term 2, plus 1 revision class per topic in term 3.
Assessment: 100% by 2-hour examination.
Topic: A primer on applied nonparametric data analysis (with a focus on hypothesis testing)
Lecturer: Professor Bärbel Finkenstädt Rand
Aims: To consider problems in nonparametric statistics in which the possible distributions of the observations are not restricted to any specific parametric family. The focus will be on hypothesis testing problems that occur in many practical settings and are an important part of any statistician’s and/or data scientist’s toolbox for applied data analysis.
Objectives: Using data examples, with reference to the relevant commands in R, students will:
 Review concepts of statistical testing (including a review of the classical parametric approach)
 Be able to apply nonparametric testing in:
 One-sample problems (the sign test, Fisher's permutation principle, rank statistics, the Wilcoxon signed rank test, comparison between tests);
 Two-sample testing problems with related samples (continuous variables, binary variables) and independent samples (continuous variables, categorical variables);
 c-sample problems (one- and two-factor analysis of variance, the Kruskal-Wallis test, the Jonckheere test, the Friedman test).
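The one-, two- and c-sample procedures listed above can be sketched briefly. The course itself works in R; the following is a minimal illustration in Python, assuming the scipy.stats equivalents of the relevant R commands (wilcoxon, binomtest, mannwhitneyu, kruskal) and simulated toy data rather than the course's own examples:

```python
# Sketch of some nonparametric tests from the syllabus, on simulated data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.normal(0.5, 1.0, size=30)   # one sample; hypothesised median 0
y = rng.normal(0.0, 1.0, size=30)   # an independent second sample
z = rng.normal(0.0, 1.0, size=30)   # a third group for the c-sample problem

# One-sample problem: Wilcoxon signed rank test of median = 0
w_stat, w_p = stats.wilcoxon(x)

# Sign test: binomial test on the number of observations above 0
n_pos = int(np.sum(x > 0))
sign_p = stats.binomtest(n_pos, n=len(x), p=0.5).pvalue

# Two independent samples: Mann-Whitney rank-based test
u_stat, u_p = stats.mannwhitneyu(x, y)

# c-sample problem: Kruskal-Wallis test across the three groups
h_stat, kw_p = stats.kruskal(x, y, z)

print(w_p, sign_p, u_p, kw_p)
```

Each call returns a test statistic and a p-value; as in R, the choice between these procedures depends on whether the samples are related or independent and on the measurement scale of the variables.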
Recommended: Familiarity with R programming. Strong interest in statistical applications.
Literature/Reading material: The material will be made available online and in written form while the course is being developed in 19/20. There are many good textbooks on the market, either fully specialised in nonparametric statistics or containing chapters on nonparametric testing. Some recommendations will be made when the course starts in term 2, but interested students should feel free to browse in the library or online.
Topic: Hidden Markov Models
Lecturer: Professor Xavier Didelot
Aims: To introduce Hidden Markov Models as a powerful, popular and flexible statistical methodology for the analysis of sequential data.
Objectives: By the end of this topics module, students should be able to:
 Define Hidden Markov Models
 Describe and implement the algorithms for computing the likelihood, estimating parameters, decoding and forecasting
 Select and test a Hidden Markov Model for a given sequential dataset
 Integrate Hidden Markov Models within a Bayesian analysis
 Describe typical applications of Hidden Markov Models, for example to speech recognition or genetic data analysis
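The first of the listed algorithms, computing the likelihood, can be sketched with the standard (scaled) forward recursion. The toy two-state parameter values below are illustrative assumptions, not taken from the course notes:

```python
# Sketch of the forward algorithm for the likelihood of a discrete-observation HMM.
import numpy as np

def hmm_loglik(obs, init, trans, emit):
    """Log-likelihood via the scaled forward pass.

    init:  (K,)   initial state distribution
    trans: (K, K) transition matrix, rows sum to 1
    emit:  (K, M) emission probabilities, rows sum to 1
    obs:   sequence of observation indices in 0..M-1
    """
    alpha = init * emit[:, obs[0]]   # forward probabilities at time 0
    loglik = np.log(alpha.sum())
    alpha /= alpha.sum()             # rescale to avoid numerical underflow
    for o in obs[1:]:
        alpha = (alpha @ trans) * emit[:, o]
        loglik += np.log(alpha.sum())
        alpha /= alpha.sum()
    return loglik

# Toy two-state, two-symbol example
init = np.array([0.6, 0.4])
trans = np.array([[0.7, 0.3], [0.2, 0.8]])
emit = np.array([[0.9, 0.1], [0.3, 0.7]])
print(hmm_loglik([0, 1, 0, 0], init, trans, emit))
```

The recursion sums over hidden paths in O(TK²) time, whereas naive enumeration of all K^T paths would be exponential; the same forward quantities reappear in parameter estimation and decoding.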
Topic: Approximate Bayesian Computation
Lecturer: Dr Richard Everitt
Aim: To introduce the fundamental ideas of approximate Bayesian computation (ABC): a class of techniques for performing Bayesian inference for models that are “intractable”.
Background: When facing complex probabilistic models, it sometimes happens that the resulting distributions are intractable, in the sense that the corresponding likelihood cannot be computed for any given value of the parameter. Bayesian analysis of such models then becomes a challenge, since traditional techniques fail to apply. There are several ways of dealing with this intractability using Monte Carlo techniques; we will consider a collection of methods, called approximate Bayesian computation (ABC), that replace computation of the intractable likelihood with repeated simulation from the model. Examples from various application areas, including statistical ecology, will be used.
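The idea of replacing likelihood evaluation by simulation can be sketched with the simplest ABC variant, rejection ABC. The example below pretends the normal likelihood is unavailable and only simulates from the model; the prior, summary statistic and tolerance are illustrative choices, not the course's:

```python
# Minimal rejection-ABC sketch: infer a normal mean using only model simulation.
import numpy as np

rng = np.random.default_rng(0)
observed = rng.normal(2.0, 1.0, size=50)     # "data" with unknown mean
s_obs = observed.mean()                      # summary statistic of the data

accepted = []
for _ in range(20000):
    theta = rng.uniform(-5, 5)               # draw a parameter from the prior
    sim = rng.normal(theta, 1.0, size=50)    # simulate a dataset from the model
    if abs(sim.mean() - s_obs) < 0.1:        # accept if summaries are close
        accepted.append(theta)

posterior = np.array(accepted)               # approximate posterior sample
print(len(posterior), posterior.mean())
```

The accepted draws approximate the posterior of the mean; shrinking the tolerance improves the approximation at the cost of a lower acceptance rate, a trade-off the module examines in detail.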
Objectives: By the end of this module, students should be able to:

 Describe the situations in which ABC methods are applicable.
 Understand how the likelihood in these situations may be estimated using Monte Carlo.
 Describe and use ABC techniques.
 Critically evaluate the accuracy of results obtained using ABC techniques.
Recommended: Familiarity with Bayesian inference is desirable.