
Events


 
Tue 7 Feb, '17: YRM, Common Room (C0.06)
Thu 9 Feb, '17: Neuroimaging Statistics Reading Group, C1.06
Thu 9 Feb, '17: WCC Meeting, C0.08
Fri 10 Feb, '17: Management Group, C1.06
Fri 10 Feb, '17: Algorithms Seminar, C1.06
Fri 10 Feb, '17: OxWaSP mini-symposium, F1.07

2pm-3:30pm

Speaker: Richard Nickl (University of Cambridge)

Title: ‘On Bayes solutions of some nonlinear inverse problems’

Abstract: Bayesian methodology has recently become popular in inverse problems as algorithmic advances have made approximate computation of the posterior distribution viable — see the work by Andrew Stuart, Gareth Roberts and co-authors. A natural question to ask is whether the Bayesian algorithm 'works' in the sense that posterior-based inference gives an optimal solution of the inverse problem that is 'objective' (independent of the prior). Interesting inverse problems typically lead to infinite-dimensional parameter spaces, so this question cannot be answered by classical parametric theory and has to be tackled by tools from Bayesian nonparametrics. We will discuss recent results that establish the frequentist optimality of Bayesian solutions of a certain class of nonlinear inverse problems that arise when one observes a continuous-time stochastic process, such as a diffusion or Lévy process, at finitely many discrete times. We will address both contraction theorems and 'limiting shape' results such as Bernstein-von Mises theorems for posterior distributions.
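For readers unfamiliar with the 'limiting shape' results mentioned in the abstract, the classical (finite-dimensional) Bernstein-von Mises phenomenon can be stated informally as follows; the notation here is generic and not taken from the talk:

```latex
% Bernstein-von Mises, informal finite-dimensional statement:
% for data X^{(n)} generated under a true parameter \theta_0, the
% posterior \Pi(\cdot \mid X^{(n)}) is asymptotically Gaussian,
\left\| \Pi\left(\cdot \mid X^{(n)}\right)
  - \mathcal{N}\left(\hat\theta_n, \tfrac{1}{n} I(\theta_0)^{-1}\right)
\right\|_{\mathrm{TV}} \;\xrightarrow{\;P_{\theta_0}\;}\; 0,
% where \hat\theta_n is an efficient estimator (e.g. the MLE) and
% I(\theta_0) is the Fisher information at the true parameter.
```

The talk concerns extensions of results of this type to the infinite-dimensional parameter spaces arising in nonlinear inverse problems, where the classical statement above no longer applies directly.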

3:30-4:30

Speaker: Michael Goldstein (Durham University)

Title: Bayes linear uncertainty analysis for complex physical systems modelled by computer simulators

Abstract: Most large and complex physical systems are studied by mathematical models, implemented as high dimensional computer simulators. While all such cases differ in physical description, each analysis of a physical system based on a computer simulator involves the same underlying sources of uncertainty. There is a growing field of study which aims to quantify and synthesise all of the uncertainties involved in relating models to physical systems, within the framework of Bayesian statistics, and to use the resultant uncertainty specification to address problems of forecasting and decision making. This talk will give an overview of aspects of this emerging methodology, with illustrations from current areas of application.
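The Bayes linear approach referred to in the abstract works with expectations, variances and covariances rather than full probability distributions. A standard textbook statement of the adjusted expectation and variance (generic notation, not specific to this talk) is:

```latex
% Bayes linear adjustment of a collection of quantities B by data D:
E_D[B] = E[B] + \mathrm{Cov}[B, D]\,\mathrm{Var}[D]^{-1}\,\bigl(D - E[D]\bigr),
\qquad
\mathrm{Var}_D[B] = \mathrm{Var}[B]
  - \mathrm{Cov}[B, D]\,\mathrm{Var}[D]^{-1}\,\mathrm{Cov}[D, B].
% E_D[B] is the adjusted expectation of B given observed data D;
% \mathrm{Var}_D[B] is the corresponding adjusted variance.
```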

Fri 10 Feb, '17: SF@W, A1.01
Mon 13 Feb, '17: Machine Learning Reading Group, A1.01
Tue 14 Feb, '17: YRM, Common Room (C0.06)
Wed 15 Feb, '17: SSLC, C1.06
Thu 16 Feb, '17: Neuroimaging Statistics Reading Group, C1.06
Thu 16 Feb, '17: Warwick R User Group, Common Room (C0.06)
Fri 17 Feb, '17: Algorithms Seminar, C1.06
Fri 17 Feb, '17: SF@W, A1.01
Fri 17 Feb, '17: CRiSM Seminar, MA_B1.01

Ioannis Kosmidis

Title: Reduced-bias inference for regression models with tractable and intractable likelihoods

Abstract:

This talk focuses on a unified theoretical and algorithmic framework for reducing bias in the estimation of statistical models from a practitioner's point of view. We will briefly discuss how shortcomings of classical estimators, and of inferential procedures that depend on them, can be overcome via reduction of bias, and provide a few demonstrations stemming from current and past research on widely used statistical models with tractable likelihoods, including beta regression for bounded-domain responses and the typically small-sample setting of meta-analysis and meta-regression in the presence of heterogeneity. The large impact that bias in the estimation of the variance components can have on inference motivates delivering higher-order corrective methods for generalised linear mixed models. The challenges in doing so will be presented along with resolutions stemming from current research.
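As a toy illustration of estimator bias and a simple first-order correction (illustrative only, not the talk's methodology), consider the maximum-likelihood estimator of a normal variance, which is biased downwards by a factor of (n-1)/n:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2 = 4.0   # true variance
n = 10         # small sample, where bias is most visible
reps = 20000

mle = np.empty(reps)
corrected = np.empty(reps)
for r in range(reps):
    x = rng.normal(0.0, np.sqrt(sigma2), size=n)
    # MLE of the variance divides by n; E[mle] = (n - 1)/n * sigma2
    mle[r] = np.mean((x - x.mean()) ** 2)
    # Rescaling by n/(n - 1) removes the bias exactly in this model
    corrected[r] = mle[r] * n / (n - 1)

print(round(mle.mean(), 1))        # near (n-1)/n * 4 = 3.6
print(round(corrected.mean(), 1))  # near 4.0
```

In this toy model the bias has a closed form, so a multiplicative fix suffices; the point of the talk is that for models such as beta regression or generalised linear mixed models the bias must be characterised and reduced by more general machinery.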
Mon 20 Feb, '17: Machine Learning Reading Group, A1.01
Tue 21 Feb, '17: YRM, Common Room (C0.06)
Wed 22 Feb, '17: Teaching Committee, C1.06
Wed 22 Feb, '17: ST404 Poster Session, C0.08
Thu 23 Feb, '17: Neuroimaging Statistics Reading Group, C1.06
Fri 24 Feb, '17: Management Group, C1.06
Fri 24 Feb, '17: Algorithms Seminar, C1.06
Fri 24 Feb, '17: OxWaSP mini-symposium, F1.07

Dr Shaun Seaman (Medical Research Council Biostatistics Unit)

Relative Efficiency of Joint-Model and Full-Conditional-Specification Multiple Imputation when Conditional Models are Compatible: the General Location Model.

Abstract: Fitting a regression model of interest is often complicated by missing data on the variables in that model. Multiple imputation (MI) is commonly used to handle these missing data. Two popular methods of MI are joint model MI and full-conditional-specification (FCS) MI. These are known to yield imputed data with the same asymptotic distribution when the conditional models of FCS are compatible with the joint model. We show that this asymptotic equivalence of imputation distributions does not imply that joint model MI and FCS MI will also yield asymptotically equally efficient inference about the parameters of the model of interest, nor that they will be equally robust to misspecification of the joint model. When the conditional models used by FCS MI are linear, logistic and multinomial regressions, these are compatible with a restricted general location (RGL) joint model. We show that MI using the RGL joint model (RGL MI) can be substantially more asymptotically efficient than FCS MI, but this typically requires very strong associations between variables. When associations are weaker, the efficiency gain is small. Moreover, FCS MI is shown to be potentially much more robust than RGL MI to misspecification of the RGL model when there is substantial missingness in the outcome variable. This is joint work with Dr Rachael Hughes, University of Bristol.
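The full-conditional-specification (FCS) idea discussed in the abstract can be sketched in a few lines: cycle through the variables, regress each on the others, and redraw its missing entries from the fitted conditional model. The sketch below is a minimal single-imputation illustration with two continuous variables and linear conditionals (real MI repeats this M times and pools results; all variable names here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate two correlated variables and introduce missing values
n = 500
x = rng.normal(size=n)
y = 0.8 * x + rng.normal(scale=0.6, size=n)
data = np.column_stack([x, y])
mask = rng.random(data.shape) < 0.2   # True = missing
data[mask] = np.nan

def fcs_impute(data, mask, n_iter=10, rng=None):
    """One FCS imputation: cycle through columns, regressing each on
    the rest and redrawing its missing entries from the fitted
    linear conditional model (prediction + residual noise)."""
    rng = rng or np.random.default_rng()
    filled = data.copy()
    # Initialise missing entries at the observed column means
    for j in range(filled.shape[1]):
        filled[mask[:, j], j] = np.nanmean(data[:, j])
    for _ in range(n_iter):
        for j in range(filled.shape[1]):
            obs = ~mask[:, j]
            others = np.delete(filled, j, axis=1)
            X = np.column_stack([np.ones(len(filled)), others])
            beta, *_ = np.linalg.lstsq(X[obs], filled[obs, j], rcond=None)
            resid_sd = np.std(filled[obs, j] - X[obs] @ beta)
            # Proper imputation adds residual noise, not just the mean fit
            filled[mask[:, j], j] = (
                X[mask[:, j]] @ beta
                + rng.normal(scale=resid_sd, size=mask[:, j].sum())
            )
    return filled

imputed = fcs_impute(data, mask, rng=rng)
print(np.isnan(imputed).any())  # False: every gap has been filled
```

With linear, logistic and multinomial conditionals these cycles are compatible with a restricted general location joint model, which is exactly the setting in which the abstract compares the asymptotic efficiency of the two approaches.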

Fri 24 Feb, '17: SF@W, A1.01
Fri 24 Feb, '17: ST404 Poster Session, C0.08
Mon 27 Feb, '17: Machine Learning Reading Group, A1.01
Tue 28 Feb, '17: WCC Meeting, C1.06
Tue 28 Feb, '17: IT Committee Mtg, C0.08
Tue 28 Feb, '17: YRM, Common Room (C0.06)
Wed 1 Mar, '17: MPTS, A1.01 then B3.02

Refreshments in the Common Room at 15.30
