# Matthew Thorpe

I am now at Carnegie Mellon University and can be found here.

I am a third-year PhD student on the MASDOC program, sponsored by Selex ES. I obtained my undergraduate degree from Warwick and also hold a master's degree from the University of New South Wales, Sydney. From January to December 2014 I was President of the Warwick chapter of SIAM; see here for details on the Warwick chapter and here for details on SIAM. Last year I was the teaching assistant for Analysis III in Term 1 and Differentiation in Term 2; this year I am the teaching assistant for Analysis I in Term 1 and Analysis II in Term 2. I have also written a blog post for the Smith Institute promoting the use of variational methods in statistical inference problems, see here.

## Research

My supervisors are Florian Theil and Adam Johansen.

My research is in multi-scale methods for statistical inference. Whilst the 1-D filtering problem has an analytical solution in the linear-Gaussian case, difficulties arise when trying to generalise to higher dimensions. The difficulty lies in the data association problem. Once it is known which observations correspond to which tracks (and how many tracks there are), the Kalman filter gives the best solution, under suitable conditions, as the unbiased estimator with minimum mean square error. Traditional approaches enumerate all possible assignments of observations to tracks and run a Kalman filter on each; this is known as the Multiple Hypothesis Tracker. The underlying data association problem is NP-complete, so the computation quickly becomes intractable. Approximations generally work by discarding unlikely data association hypotheses, but these run the risk of discarding correct hypotheses, which normally cannot be recovered.
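As a toy illustration of the linear-Gaussian case mentioned above, one predict/update cycle of a 1-D Kalman filter can be sketched as follows (the random-walk model and all parameter values are illustrative, not taken from my work):

```python
# Minimal 1-D Kalman filter for a random-walk state with Gaussian noise.
# All parameter values here are illustrative.

def kalman_step(mean, var, z, q=0.1, r=0.5):
    """One predict/update cycle.

    mean, var : current state estimate and its variance
    z         : new observation
    q, r      : process and observation noise variances
    """
    # Predict: under a random walk the mean is unchanged and the
    # variance grows by the process noise.
    pred_mean, pred_var = mean, var + q

    # Update: weight the observation by the Kalman gain.
    gain = pred_var / (pred_var + r)
    new_mean = pred_mean + gain * (z - pred_mean)
    new_var = (1.0 - gain) * pred_var
    return new_mean, new_var

mean, var = 0.0, 1.0
for z in [1.2, 0.9, 1.1, 1.0]:
    mean, var = kalman_step(mean, var, z)
```

The estimate is pulled towards the observations while its variance shrinks; the data association problem is precisely that, with multiple tracks, one does not know which `z` to feed to which filter.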

My recent work concerns the consistency of a generalised version of the k-means method. In particular, we can show that the k-means method applied to data association–smoothing problems converges in the data-rich limit.
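For reference, the classical k-means method that this framework generalises can be sketched as a bare-bones Lloyd iteration (the one-dimensional data set and initial centres below are illustrative):

```python
# Bare-bones Lloyd iteration for the classical k-means problem in 1-D,
# the special case that the generalised framework extends.

def kmeans(points, centres, iters=20):
    for _ in range(iters):
        # Assignment step: attach each point to its nearest centre.
        clusters = [[] for _ in centres]
        for x in points:
            j = min(range(len(centres)), key=lambda j: (x - centres[j]) ** 2)
            clusters[j].append(x)
        # Update step: move each centre to the mean of its cluster
        # (an empty cluster keeps its old centre).
        centres = [sum(c) / len(c) if c else centres[j]
                   for j, c in enumerate(clusters)]
    return centres

data = [0.1, 0.2, 0.15, 2.0, 2.1, 1.9]
centres = kmeans(data, [0.0, 1.0])
```

Each iteration decreases the k-means energy, the sum of squared distances from points to their assigned centres; the consistency question is what happens to the minimisers as the number of data points grows.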

My current work studies the convergence of estimators motivated by machine learning algorithms. In particular, I look at minimization problems on graphs with n vertices and study the limit as n goes to infinity.
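A toy instance of this kind of graph minimization problem is minimising a Dirichlet-type energy, the weighted sum of squared differences of labels across edges, with a few vertex labels held fixed. A sketch via coordinate descent (the graph, weights, and boundary labels are all illustrative):

```python
# Toy graph minimisation: minimise the Dirichlet energy
#     E(u) = sum over edges (i,j) of w_ij * (u_i - u_j)^2
# with the labels at two vertices held fixed. The graph and
# weights are illustrative.

edges = {(0, 1): 1.0, (1, 2): 1.0, (2, 3): 1.0}  # a path graph on 4 vertices
fixed = {0: 0.0, 3: 1.0}                          # fixed boundary labels
u = {0: 0.0, 1: 0.5, 2: 0.5, 3: 1.0}              # initial labels

def neighbours(i):
    for (a, b), w in edges.items():
        if a == i:
            yield b, w
        elif b == i:
            yield a, w

for _ in range(100):  # coordinate descent (Gauss-Seidel sweeps)
    for i in u:
        if i in fixed:
            continue
        # E is quadratic in u_i, so its minimiser over u_i alone is
        # the weighted average of the neighbouring labels.
        num = sum(w * u[j] for j, w in neighbours(i))
        den = sum(w for _, w in neighbours(i))
        u[i] = num / den
```

The minimiser is the harmonic extension of the boundary labels; the asymptotic question is how such discrete minimisers behave as the number of vertices n goes to infinity.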

My mathematical background is in differential equations, functional analysis and fluid dynamics.

## Conferences/Workshops Attended

Nov 2012, Presented a poster at the KTN Showcase, Oxford
Dec 2012, Attended workshop on Statistical Computing and Statistical Inference at the APTS Week 1, Cambridge
Mar 2013, Gave a talk on Multi Target Tracking in a GSM Network at the CCA-MASDOC Conference, Warwick
Mar 2013, Presented a poster at the Stochastic and Statistical Models at the Interface of Modern Industry and Mathematical Sciences conference, Cambridge
Jan 2014, Gave a talk on Convergence of the k-Means Method at the EQUIP Seminar, Warwick
Apr 2014, Attended conference on Atomistic to Continuum Models, L'Aquila, Italy
May 2014, Gave a talk on Convergence of the k-Means Method at the MASDOC retreat, Warwick

## Publications

[4] M. Thorpe and A. Johansen, Weak Convergence for Generalized Spline Smoothing, final version in preparation.
[3] M. Thorpe and A. Johansen, Rate of Convergence for a Smoothing Spline with Data Association Model, final version in preparation.
[2] A. Gkiokas, A. Cristea and M. Thorpe, Self-reinforced Meta Learning for Belief Generation, Research and Development in Intelligent Systems XXXI, Springer International Publishing, 2014. PDF link here.
[1] M. Thorpe, F. Theil, A. Johansen and N. Cade, Convergence of the k-Means Minimization Problem Using Gamma-Convergence, submitted to the SIAM Journal on Applied Mathematics, 2015. arXiv link here.

## Contact

I can be emailed at mthorpe [at] andrew [dot] cmu [dot] edu.