

SPAAM Seminar Series

The Statistics, Probability, Analysis and Applied Mathematics (SPAAM) seminar series will take place on Thursdays from 3-4pm in room B3.02 and virtually on the SPAAM Microsoft Teams channel. It will host a variety of talks from PhD students involved in applied mathematics research at Warwick and invited guests from other institutions (see the bottom of this page for the talk abstracts!).

Each seminar will usually host two speakers (unless otherwise stated), with each talk taking around 15-20 minutes, followed by 5-10 minutes of questions. Speakers and committee members will hang around for some time after the talks for social tea/coffee and further questions. Please do contact one of the committee if you would like to join and be added to the MS Teams channel. Note that these talks may be recorded for later viewing on our YouTube channel, so do join with audio and video off if you don't wish to feature!

If you would like to give a talk this term, please contact Matthew Coates or Swetha Usha Lal and we will find you a slot!

Term 1
Date               | Talk 1                        | Talk 2
21st October 2021  | Yiming Ma (MathSys)           | Jimmy McKendrick (MathSys)
28th October 2021  | Rafal Szlendak (Maths)        |
4th November 2021  | Social event                  |
11th November 2021 | Jack Bara (MathSys)           | Charlie Pilgrim (MathSys)
18th November 2021 | Matthew Coates (MASDOC)       |
25th November 2021 | Giorgos Vasdekis (Stats)      |
2nd December 2021  | Social event                  |
9th December 2021  | David Itkin (Carnegie Mellon) | Matthew King (MASDOC)

Week 3 (Talk 1) - Introduction to MobileNets - Yiming Ma (MathSys)

Convolutional neural networks have flourished for years in the field of computer vision. However, most of them achieve better performance by leveraging extremely complex architectures. Although leaderboards may be swept by hundreds of models of this type, few of them have practical value: their enormous size means they cannot be deployed on mobile devices. Studying efficient models is therefore of crucial importance. In this talk, the MobileNet family of models, which are both accurate and lightweight, will be introduced.
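The abstract does not go into architectural detail, but the well-known idea behind MobileNets is to replace a standard convolution with a depthwise separable convolution (a per-channel spatial filter followed by a 1x1 pointwise convolution). A minimal sketch of the resulting parameter savings, with an illustrative 3x3, 256-to-256-channel layer:

```python
def conv_params(k, c_in, c_out):
    """Parameters of a standard k x k convolution (bias ignored)."""
    return k * k * c_in * c_out

def depthwise_separable_params(k, c_in, c_out):
    """Depthwise k x k convolution (one spatial filter per input channel)
    followed by a 1 x 1 pointwise convolution mixing the channels."""
    return k * k * c_in + c_in * c_out

# Example: a 3x3 layer mapping 256 channels to 256 channels.
standard = conv_params(3, 256, 256)                   # 589824 parameters
separable = depthwise_separable_params(3, 256, 256)   # 67840 parameters
print(standard, separable, standard / separable)
```

For typical layer sizes this gives roughly an 8-9x reduction in parameters (and a similar reduction in multiply-adds), which is what makes the models lightweight enough for mobile deployment.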

Week 3 (Talk 2) - Modelling the Seasonality in Lassa Fever Cases in Nigeria  - Jimmy McKendrick (MathSys)

Recent Lassa fever epidemics in Nigeria have followed a seasonal pattern in cases. Using vector models and Approximate Bayesian Computation, I attempt to discern the seasonal drivers of the disease.
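Approximate Bayesian Computation sidesteps an intractable likelihood by simulating from the model and keeping parameter draws whose simulated output resembles the data. A minimal rejection-ABC sketch on a toy simulator (a Gaussian with unknown mean standing in for the speaker's vector model, which is not described in the abstract):

```python
import random

def simulate(theta, n=50, rng=random):
    # Toy simulator: stands in for an epidemic/vector model.
    return [rng.gauss(theta, 1.0) for _ in range(n)]

def summary(data):
    # Summary statistic used to compare simulation and data.
    return sum(data) / len(data)

def abc_rejection(observed, prior_sample, n_draws=5000, eps=0.1):
    """Keep prior draws whose simulated summary is within eps of the data's."""
    s_obs = summary(observed)
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample()
        if abs(summary(simulate(theta)) - s_obs) < eps:
            accepted.append(theta)
    return accepted

random.seed(0)
observed = simulate(2.0)  # "data" generated with true mean 2.0
posterior = abc_rejection(observed, lambda: random.uniform(-5, 5))
print(len(posterior), sum(posterior) / len(posterior))
```

The accepted draws approximate the posterior; shrinking `eps` sharpens the approximation at the cost of a lower acceptance rate.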

Week 4 - Permutation Compressors for Provably Faster Distributed Nonconvex Optimisation - Rafal Szlendak (Maths)

We study the MARINA method of Gorbunov et al. (2021), the current state-of-the-art distributed non-convex optimization method in terms of theoretical communication complexity. The theoretical superiority of this method can be largely attributed to two sources: the use of a carefully engineered biased stochastic gradient estimator, which leads to a reduction in the number of communication rounds, and the reliance on independent stochastic communication compression operators, which leads to a reduction in the number of transmitted bits within each communication round. In this paper we i) extend the theory of MARINA to support a much wider class of potentially correlated compressors, extending the reach of the method beyond the classical independent compressors setting, ii) show that a new quantity, for which we coin the name Hessian variance, allows us to significantly refine the original analysis of MARINA without any additional assumptions, and iii) identify a special class of correlated compressors based on the idea of random permutations, for which we coin the term PermK, whose use leads to O(√n) (resp. O(1 + d/√n)) improvement in the theoretical communication complexity of MARINA in the low Hessian variance regime when d ≥ n (resp. d ≤ n), where n is the number of workers and d is the number of parameters describing the model we are learning. We corroborate our theoretical results with carefully engineered synthetic experiments on minimizing the average of nonconvex quadratics, and on autoencoder training with the MNIST dataset.
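A minimal sketch of the permutation idea, assuming the simplest case where d is a multiple of n (the general construction in the paper is more involved): a single random permutation of the d coordinates is shared by all workers and split into n disjoint blocks, and worker i transmits only its own block, scaled by n.

```python
import random

def permk_compress(grads, rng=random):
    """PermK-style sketch (assuming d is a multiple of n): a shared random
    permutation of the d coordinates is split into n disjoint blocks;
    worker i transmits only its block, scaled by n. Because the blocks are
    disjoint, averaging the compressed vectors gives an unbiased estimate
    of the average gradient over the randomness of the permutation."""
    n = len(grads)
    d = len(grads[0])
    assert d % n == 0
    perm = list(range(d))
    rng.shuffle(perm)          # shared randomness across all workers
    block = d // n
    compressed = []
    for i, g in enumerate(grads):
        out = [0.0] * d
        for j in perm[i * block:(i + 1) * block]:
            out[j] = n * g[j]  # keep d/n coordinates, scale by n
        compressed.append(out)
    return compressed

grads = [[1.0, 2.0, 3.0, 4.0], [5.0, 6.0, 7.0, 8.0]]
comp = permk_compress(grads, rng=random.Random(0))
print(comp)
```

Each worker sends only d/n coordinates per round, and the correlation (disjoint supports) is exactly what the Hessian variance analysis exploits; in particular, when all workers hold the same gradient, the averaged compressed vectors recover it exactly.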

Week 6 (Talk 1) - Cooperation in Dynamic Networks - Jack Bara (MathSys)

Tackling large (global) issues such as climate change requires cooperation at multiple scales, from individuals choosing to recycle up to international trade. When agents act negatively, they may be punished by mutual defection or by unilaterally burning bridges. In my talk I will give some insights and results on cooperative games played on dynamic networks; namely, which networks tend to form from the coevolutionary process, and the importance of timescales.
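The "burning bridges" mechanism can be illustrated with a hypothetical toy model (not the speaker's): agents on a network play a prisoner's dilemma with their neighbours, and a cooperator who is defected against may sever the link.

```python
import random

def play_round(edges, strategy, cut_prob, rng):
    """One round: every edge plays; mismatched pairs (one cooperator,
    one defector) lose the link with probability cut_prob."""
    kept = []
    for i, j in edges:
        betrayed = strategy[i] != strategy[j]
        if betrayed and rng.random() < cut_prob:
            continue  # the betrayed cooperator burns the bridge
        kept.append((i, j))
    return kept

rng = random.Random(0)
strategy = {0: "C", 1: "C", 2: "D", 3: "D"}
edges = [(0, 1), (0, 2), (1, 3), (2, 3)]
for _ in range(20):
    edges = play_round(edges, strategy, 0.5, rng)
print(edges)  # homogeneous C-C and D-D links persist; mixed links tend to vanish
```

Even this crude dynamic shows the qualitative outcome the abstract alludes to: link rewiring sorts the network so that cooperators end up connected mostly to cooperators, and the rate of rewiring relative to game play (the timescale) controls how quickly that happens.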

Week 6 (Talk 2) - Information foraging in the attention economy drives the rising entropy of English - Charlie Pilgrim (MathSys)

Over the past 200 years there have been continual advances in communications technology, characterised by increasing ease of access to ever more abundant sources of information. In the face of this abundance, people choose which information to consume and which to ignore. Media producers need attention to survive and so must adapt to this selective pressure, creating a feedback process of co-evolution similar to that seen in many ecosystems. Here, we explore a model that describes this dynamic, and show how the model outcomes agree with empirical evidence of rising word entropy in English.
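The quantity tracked empirically is word entropy, the Shannon entropy of the word-frequency distribution. A minimal sketch of the unigram version (the study's actual estimator may differ, e.g. in smoothing or in handling longer contexts):

```python
import math
from collections import Counter

def word_entropy(text):
    """Shannon entropy (bits per word) of the unigram word distribution."""
    words = text.lower().split()
    counts = Counter(words)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

print(word_entropy("the cat sat on the mat"))
```

Higher entropy means the word distribution is less predictable; the claim is that this quantity has been rising in English text over the past two centuries.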

Week 7 - Constructing Reduced Order Models of the Lithium Ion Cell - Matthew Coates (MASDOC)

The modelling of rechargeable battery cells, and in particular lithium ion cells, is of increasing industrial importance due to growing demand for products such as electric vehicles and a growing need for energy storage systems. Key to the efficient operation of such batteries is effective modelling; in particular, there is a need for reduced order models: relatively simple models of the cell that can be used to perform calculations in real time. We review some methods of constructing such real time models, and discuss questions of accuracy and improvement.
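One common example of a reduced order cell model (offered here as a generic illustration, not necessarily one of the methods reviewed in the talk, and with made-up parameter values) is the first-order equivalent-circuit (Thevenin) model: an open-circuit voltage in series with a resistor and one RC branch, cheap enough to step in real time.

```python
def simulate_rc_cell(current, dt, ocv=3.7, r0=0.05, r1=0.02, c1=2000.0):
    """First-order equivalent-circuit model, forward-Euler time stepping.
    Terminal voltage: V = OCV - I*R0 - V1, with the RC branch obeying
    dV1/dt = -V1/(R1*C1) + I/C1. Discharge current I > 0. Parameter
    values are illustrative, not fitted to any real cell."""
    v1 = 0.0
    voltages = []
    for i in current:
        v1 += dt * (-v1 / (r1 * c1) + i / c1)
        voltages.append(ocv - i * r0 - v1)
    return voltages

# 1 A constant discharge for 60 s in 1 s steps: instantaneous ohmic drop,
# then a slow relaxation with time constant R1*C1 = 40 s.
v = simulate_rc_cell([1.0] * 60, dt=1.0)
print(v[0], v[-1])
```

Richer reduced order models add more RC branches or state-of-charge-dependent parameters, trading a little accuracy against the physics-based electrochemical models they approximate.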

Week 8 - Piecewise Deterministic Markov Chain Monte Carlo and the Speed Up Zig-Zag sampler - Giorgos Vasdekis (Stats)

Piecewise Deterministic Markov Processes (PDMPs) have recently drawn the attention of the Markov Chain Monte Carlo (MCMC) community. The main reason is that these processes have a natural notion of momentum, which sometimes leads to better exploration of the state space and faster mixing. In the first half of this talk, we will give an introduction to how one can use these processes in MCMC. We will introduce the state-of-the-art PDMP algorithms and present some interesting properties that can make them useful tools in computational Bayesian statistics. In the second half, we will introduce a new PDMP algorithm called the Speed Up Zig-Zag sampler, study its properties, and explain why it can be used efficiently to target heavy-tailed distributions.
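To fix ideas, here is a minimal one-dimensional Zig-Zag sampler for a standard Gaussian target (the textbook base algorithm, not the Speed Up variant introduced in the talk). The particle moves at constant velocity v in {-1, +1} and flips v at random event times with rate max(0, v U'(x)); for U(x) = x^2/2 the event times can be drawn exactly by inverting the integrated rate.

```python
import math
import random

def zigzag_1d(n_events, rng):
    """1D Zig-Zag sampler targeting the standard Gaussian, U(x) = x^2/2.
    Along the trajectory the flip rate at elapsed time s is max(0, a + s)
    with a = v * x; event times come from exact inversion of the
    integrated rate against an Exp(1) draw."""
    x, v = 0.0, 1.0
    points = []
    for _ in range(n_events):
        e = -math.log(rng.random())    # Exp(1) draw
        a = v * x
        if a >= 0:
            tau = math.sqrt(a * a + 2.0 * e) - a
        else:
            tau = -a + math.sqrt(2.0 * e)   # rate is zero until s = -a
        x += v * tau                   # deterministic linear motion
        v = -v                         # flip velocity at the event
        # For simplicity we record the switch points; unbiased estimates
        # should really average along the whole piecewise-linear path.
        points.append(x)
    return points

pts = zigzag_1d(20000, random.Random(42))
print(sum(pts) / len(pts))
```

The momentum is visible in the dynamics: the particle keeps moving in one direction until an event fires, rather than proposing reversible local moves as in random-walk MCMC.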

Week 10 (Talk 1) - Title TBC - David Itkin (Carnegie Mellon)


Week 10 (Talk 2) - Title TBC - Matthew King (MASDOC)