# Statistics Seminar

**"All of Statistics"**: unashamedly borrowing Larry Wasserman's famous book title to describe this seminar series, as we intend to bring together everyone working in statistics, whichever hat you wear (statistician, mathematician, probabilist, machine learner, etc.). -- Organisers.

**Time:** Runs **every other Monday**, 13:00-14:00, during term time. (In the intervening weeks we hold informal lunchtime talks.)

**Lunch:** Free lunch **every Monday** (12:00-13:00) in the Statistics common room on the 1st floor of the Mathematical Sciences Building.

**Venue:** Statistics common room, Mathematical Sciences Building (in-person only).

The **inaugural talk on 23rd October** will be given by our very own Professor Gareth Roberts.

**Title:** Bayesian Fusion

**Abstract:** Suppose we can readily access samples from each of the densities f_1, ..., f_n, but we wish to obtain samples from their product f ∝ f_1 · ... · f_n. The so-called Bayesian Fusion problem comes up in various areas of modern Bayesian machine learning, for example in the context of big data or privacy constraints, as well as in more traditional statistical areas such as meta-analysis. Many approximate solutions to this problem have been proposed. However, this talk will present an exact solution based on rejection sampling in an extended state space, where the accept/reject decision is carried out by simulating the skeleton of a suitably constructed auxiliary collection of Brownian bridges.
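As a toy illustration of the fusion problem (not the Brownian-bridge construction of the talk), the product of two unit-variance Gaussian factors can be sampled exactly by naive rejection: draw from one factor and accept in proportion to the other. All names here (`sample_product`, `mu1`, `mu2`) are illustrative, not from the talk.

```python
import math
import random

def sample_product(n_samples, mu1=0.0, mu2=1.0, seed=0):
    """Naive rejection sampler for the density proportional to
    f1 * f2, where f1 = N(mu1, 1) and f2 = N(mu2, 1):
    draw x ~ f1, accept with probability f2(x) / max_x f2(x)."""
    rng = random.Random(seed)
    out = []
    while len(out) < n_samples:
        x = rng.gauss(mu1, 1.0)
        # f2(x) / f2(mu2) = exp(-(x - mu2)^2 / 2)
        if rng.random() < math.exp(-0.5 * (x - mu2) ** 2):
            out.append(x)
    return out

samples = sample_product(20000)
mean = sum(samples) / len(samples)
# The product N(0,1) * N(1,1) is proportional to N(0.5, 0.5),
# so the sample mean should settle near 0.5.
```

The acceptance probability is low when the factors barely overlap, which is one reason exact fusion at scale needs cleverer constructions like the one in the talk.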

**On 6th November** we will have Professor Johannes Schmidt-Hieber visiting us from the University of Twente.

**Title:** Statistical learning in biological neural networks

**Abstract:** Compared to artificial neural networks (ANNs), the brain learns faster, generalizes better to new situations and consumes much less energy. ANNs are motivated by the functioning of the brain, but differ in several crucial aspects. For instance, ANNs are deterministic while biological neural networks (BNNs) are stochastic. Moreover, it is biologically implausible that the learning of the brain is based on gradient descent. In this talk we look at biological neural networks as a statistical method for supervised learning. We relate the local updating rule of the connection parameters in BNNs to a zero-order optimization method and derive some first statistical risk bounds.
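The zero-order optimization idea mentioned in the abstract can be sketched with a generic two-point random-direction update (an illustration of gradient-free optimization in general, not the BNN updating rule from the talk; all names are hypothetical):

```python
import random

def zero_order_step(f, theta, lr=0.1, delta=1e-3, rng=None):
    """One step of a two-point zero-order (gradient-free) update:
    perturb the parameters along a random direction u, estimate the
    directional derivative from two function evaluations, and move
    against it along u. No gradients of f are ever computed."""
    rng = rng or random.Random(0)
    u = [rng.gauss(0.0, 1.0) for _ in theta]
    f_plus = f([t + delta * ui for t, ui in zip(theta, u)])
    f_minus = f([t - delta * ui for t, ui in zip(theta, u)])
    g = (f_plus - f_minus) / (2.0 * delta)  # directional derivative estimate
    return [t - lr * g * ui for t, ui in zip(theta, u)]

# Minimise f(theta) = sum(t^2) starting from theta = [1, -2].
theta = [1.0, -2.0]
rng = random.Random(1)
for _ in range(500):
    theta = zero_order_step(lambda t: sum(x * x for x in t), theta, rng=rng)
```

Because the update only needs function values, not gradients, schemes of this flavour are candidates for modelling biologically plausible learning, which is the connection the talk draws.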

**On 20th November** the talk will be given by our very own Professor Jim Smith.

**Title: Graphical Models of Intelligent Cause**

**Abstract:** Graphical models are now widely used to express the underlying mechanisms that drive observed processes and to explain how those mechanisms work. In particular, Bayesian Networks and, more recently, Chain Event Graphs have been used to produce probabilistic predictive models of processes. Such graphs are chosen to be consistent with elicited natural explanations of how and why things happen the way they do in a given domain. Causal algebras are then specified which use this elicited information to determine predictions of what might happen were the system to be subjected to various controls.

But how could we extend this work to produce predictive models of what might happen when the decision maker believes that his controls might be resisted? In this talk I will argue that standard causal models then need to be generalised to embed a decision maker's beliefs about the intent, the capability and the information a resistant adversary might have about the intervention after it has been made. After reviewing recent advances in general forms of Bayesian dynamic causal models, I will describe how, using a special form of Adversarial Risk Analysis, we are developing new intelligent algorithms to produce such predictions. The talk will be illustrated throughout by examples of various adversarial threats currently being analysed within the UK.

**On 4th December** we will have Professor Geoff Nicholls visiting us from the University of Oxford.

**Title:** Partial order models for rank data

**Abstract:** In rank-order data, assessors give preference orders over choice sets. These can be thought of as permutations of the choice sets ordered by preference from best to worst. We call these permutations "lists". Well-known parametric models for list-data include the Mallows model and the Plackett-Luce model. These models seek a total order which is "central" to the lists provided by the assessors. Extensions model the list-data as realisations of a mixture of distributions, each centred on a total order. We give a model for list-data which is centred on a partial order. We give a prior over partial orders with several nice properties and explain how to carry out Bayesian inference for the unknown true partial order constraining the list-data. Model comparison favours the partial order model in all data sets we have looked at so far. However, evaluating the likelihood is a #P-hard computation. We give a model which admits scalable inference, and a time-series model for evolving partial orders. The time-series model was motivated by queue-data informing an evolving social hierarchy, which we model as an evolving partial order.

(This is joint work with Kate Lee, Jessie Jiang, Nicholas Karn, David Johnson, Alexis Muir-Watt and Rukuang Huang.)
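For concreteness, a list is consistent with ("constrained by") a partial order exactly when it is a linear extension of it. A minimal check of that property (illustrative code, not from the talk):

```python
def respects(partial_order, ranking):
    """Check whether a preference list (best item first) is a linear
    extension of a partial order, given as a set of (a, b) pairs
    meaning 'a is preferred to b'."""
    position = {item: i for i, item in enumerate(ranking)}
    return all(position[a] < position[b] for a, b in partial_order)

# Require a > b and a > c; b versus c is left unconstrained.
po = {("a", "b"), ("a", "c")}
# respects(po, ["a", "c", "b"]) → True
# respects(po, ["b", "a", "c"]) → False
```

Counting all linear extensions of a partial order, which the likelihood needs, is the classically #P-hard step referred to in the abstract.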

**On 8th January** we will have Professor Fabrizio Leisen visiting us from the University of Nottingham.

**Title:** TBD

**Abstract:** TBD

**On 22nd January** the talk will be given by our very own Professor Anastasia Papavasiliou.

**Title:** TBD

**Abstract:** TBD

**On 19th February** we will have Professor Terry Lyons visiting us from the University of Oxford.

**Title:** TBD

**Abstract:** TBD

**On 4th March** we will have Dr Fanghui Liu from the Department of Computer Science at the University of Warwick.

**Title:** TBD

**Abstract:** TBD

## Organisers

Ritabrata Dutta

Previously run as a regular seminar series of CRiSM.