Events

Tue 9 Jan, '18: C1.06 - Simulation Reading Group
Tue 9 Jan, '18: YRM - 3pm, Common Room (C0.06)
Wed 10 Jan, '18: Dept Council Meeting, Radcliffe House
Wed 10 Jan, '18: SSLC, C1.06
Wed 10 Jan, '18: Probability Seminars, B3.02
Thu 11 Jan, '18: C1.06 - Machine Learning Reading Group
Fri 12 Jan, '18: Algorithms and Computationally Intensive Inference seminars, C1.06
Fri 12 Jan, '18: APTS Executive Committee, C1.06
Tue 16 Jan, '18: C1.06 - Simulation Reading Group
Wed 17 Jan, '18: Teaching Committee, C1.06
Wed 17 Jan, '18: Probability Seminars, B3.02
Thu 18 Jan, '18: C1.06 - Machine Learning Reading Group
Fri 19 Jan, '18: Algorithms and Computationally Intensive Inference seminars, C1.06
Fri 19 Jan, '18: CRiSM Seminar, MA_B1.01

Jonas Peters, Department of Mathematical Sciences, University of Copenhagen

Invariant Causal Prediction

Abstract: Why are we interested in the causal structure of a process? In classical prediction tasks such as regression, for example, it seems that no causal knowledge is required. In many situations, however, we want to understand how a system reacts under interventions, e.g., in gene knock-out experiments. Here, causal models become important because they are usually considered invariant under those changes. A causal prediction uses only direct causes of the target variable as predictors; it remains valid even if we intervene on predictor variables or change the whole experimental setting. In this talk, we show how we can exploit this invariance principle to estimate causal structure from data. We apply the methodology to data sets from biology, epidemiology, and finance. The talk does not require any knowledge about causal concepts.
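A minimal sketch of the invariance principle described in the abstract, assuming data pooled from several environments (e.g., observational and interventional regimes). The subset search and the ANOVA-style residual test below are illustrative simplifications, not the authors' implementation.

```python
# Sketch: for each candidate predictor set S, regress the target on S and test
# whether the residual distribution looks the same in every environment; the
# estimated direct causes are the intersection of all accepted ("invariant") sets.
from itertools import chain, combinations

import numpy as np
from scipy import stats


def invariant_causal_sets(X, y, env, alpha=0.05):
    """X: (n, p) predictors, y: (n,) target, env: (n,) environment labels."""
    n, p = X.shape
    accepted = []
    all_subsets = chain.from_iterable(
        combinations(range(p), k) for k in range(p + 1))
    for S in all_subsets:
        cols = list(S)
        if cols:
            beta, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
            resid = y - X[:, cols] @ beta
        else:
            resid = y - y.mean()
        # Crude invariance check: equal residual means across environments
        # (one-way ANOVA); the full method also compares residual variances.
        groups = [resid[env == e] for e in np.unique(env)]
        _, pval = stats.f_oneway(*groups)
        if pval > alpha:
            accepted.append(set(S))
    # Variables appearing in every accepted set are the conservative estimate
    # of the direct causes of the target.
    return set.intersection(*accepted) if accepted else set()
```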

David Ginsbourger, Idiap Research Institute and University of Bern, http://www.ginsbourger.ch

Quantifying and reducing uncertainties on sets under Gaussian Process priors

Abstract: Gaussian Process models have been used in a number of problems where an objective function f needs to be studied based on a drastically limited number of evaluations.

Global optimization algorithms based on Gaussian Process models have been investigated for several decades, and have become quite popular, notably in the design of computer experiments. Further classes of problems involving the estimation of sets implicitly defined by f, e.g. sets of excursion above a given threshold, have also inspired multiple research developments.

In this talk, we will give an overview of recent results and challenges pertaining to the estimation of sets under Gaussian Process priors, with a particular interest in the quantification and the sequential reduction of the associated uncertainties.

Based on a series of joint works primarily with Dario Azzimonti, François Bachoc, Julien Bect, Mickaël Binois, Clément Chevalier, Ilya Molchanov, Victor Picheny, Yann Richet and Emmanuel Vazquez.
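A minimal sketch of the kind of uncertainty quantification discussed above, assuming a Gaussian Process posterior fitted with scikit-learn. The toy objective, the threshold T, and the "most ambiguous point" rule are illustrative assumptions, not the speaker's method.

```python
# Sketch: pointwise probability that x belongs to the excursion set {f >= T}
# under a GP posterior, and a simple sequential rule for reducing uncertainty.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)


def f(x):
    # Toy objective (assumption); in practice f is an expensive black box.
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 0]


X_train = rng.uniform(0.0, 3.0, size=(8, 1))   # drastically limited evaluations
y_train = f(X_train)

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-8)
gp.fit(X_train, y_train)

T = 1.0                                         # excursion threshold (assumption)
X_grid = np.linspace(0.0, 3.0, 200).reshape(-1, 1)
mean, std = gp.predict(X_grid, return_std=True)

# Coverage function: P(f(x) >= T | data), quantifying uncertainty on the set.
coverage = norm.cdf((mean - T) / np.maximum(std, 1e-12))
# Evaluate next where set membership is most ambiguous (coverage near 0.5).
next_x = X_grid[np.argmax(coverage * (1 - coverage))]
```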

Fri 19 Jan, '18: CRiSM Seminar, A1.01
Tue 23 Jan, '18: C1.06 - Simulation Reading Group
Tue 23 Jan, '18: YRM - 3pm, Common Room (C0.06)
Wed 24 Jan, '18: Probability Seminars, B3.02
Thu 25 Jan, '18: C1.06 - Machine Learning Reading Group
Fri 26 Jan, '18: OxWaSP, C0.08

Module 5: 26 January

Organised by Jim Smith (Warwick) and François Caron (Oxford)

1400-1500: Mihaela van der Schaar (Oxford Man), AutoPrognosis

Mihaela's work uses data science and machine learning to create models that assist diagnosis and prognosis. Existing models suffer from two kinds of problems. Statistical models that are driven by theory/hypotheses are easy to apply and interpret, but they make many assumptions and often have inferior predictive accuracy. Machine learning models can be crafted to the data and often have superior predictive accuracy, but they are often hard to interpret and must be crafted for each disease … and there are a lot of diseases. In this talk I present a method (AutoPrognosis) that makes machine learning itself do both the crafting and interpreting.

For medicine, this is a complicated problem because missing data must be imputed, relevant features/covariates must be selected, and the most appropriate classifier(s) must be chosen. Moreover, there is no one "best" imputation algorithm, feature processing algorithm or classification algorithm; some imputation algorithms will work better with a particular feature processing algorithm and a particular classifier in a particular setting. To deal with these complications, we need an entire pipeline. Because there are many pipelines, we need a machine learning method for this purpose, and this is exactly what AutoPrognosis is: an automated process for creating a particular pipeline for each particular setting.

Using a variety of medical datasets, we show that AutoPrognosis achieves performance that is significantly superior to existing clinical approaches and statistical and machine learning methods.
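A hedged sketch of the pipeline idea described above, not the AutoPrognosis system itself (which automates the search rather than enumerating it). Here a small exhaustive search over imputer, feature step and classifier combinations is scored by cross-validation on a stand-in dataset.

```python
# Sketch: score every (imputer, feature step, classifier) combination by
# cross-validation and keep the best pipeline for the dataset at hand.
from itertools import product

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.impute import KNNImputer, SimpleImputer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)      # stand-in for a medical dataset

imputers = [SimpleImputer(strategy="mean"), KNNImputer(n_neighbors=5)]
feature_steps = [StandardScaler(), "passthrough"]
classifiers = [LogisticRegression(max_iter=1000),
               RandomForestClassifier(n_estimators=200),
               GradientBoostingClassifier()]

best_score, best_pipe = -float("inf"), None
for imp, feat, clf in product(imputers, feature_steps, classifiers):
    pipe = Pipeline([("impute", imp), ("features", feat), ("model", clf)])
    score = cross_val_score(pipe, X, y, cv=5, scoring="roc_auc").mean()
    if score > best_score:
        best_score, best_pipe = score, pipe

print(best_pipe, best_score)
```

In practice the space of pipelines and hyperparameters is far too large to enumerate, which is why an automated search over pipelines is needed.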

1530-1630: Jim Griffin (Kent)

Bayesian nonparametric vector autoregressive models

Vector autoregressive (VAR) models are the main workhorse model for macroeconomic forecasting, and provide a framework for the analysis of complex dynamics that are present between macroeconomic variables. Whether a classical or a Bayesian approach is adopted, most VAR models are linear with Gaussian innovations. This can limit the model's ability to explain the relationships in macroeconomic series. We propose a nonparametric VAR model that allows for nonlinearity in the conditional mean, heteroscedasticity in the conditional variance, and non-Gaussian innovations. Our approach differs from that of previous studies by modelling the stationary and transition densities using Bayesian nonparametric methods. Our Bayesian nonparametric VAR (BayesNP-VAR) model is applied to US and UK macroeconomic time series, and compared to other Bayesian VAR models. We show that BayesNP-VAR is a flexible model that is able to account for nonlinear relationships as well as heteroscedasticity in the data. In terms of short-run out-of-sample forecasts, we show that BayesNP-VAR predictively outperforms competing models.
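For context, a minimal sketch of the classical linear Gaussian VAR(p) baseline that BayesNP-VAR generalises, y_t = c + A_1 y_{t-1} + ... + A_p y_{t-p} + e_t with e_t ~ N(0, Sigma), fitted with statsmodels on simulated data; the Bayesian nonparametric model itself is not shown here, and the simulated series stand in for the US/UK macroeconomic data used in the talk.

```python
# Sketch: fit a linear Gaussian VAR to a simulated bivariate series and produce
# short-run out-of-sample forecasts, the benchmark BayesNP-VAR is compared against.
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(1)
A = np.array([[0.5, 0.1],
              [0.2, 0.4]])                      # stable VAR(1) coefficients (assumption)
y = np.zeros((300, 2))
for t in range(1, 300):
    y[t] = A @ y[t - 1] + rng.normal(scale=0.1, size=2)

results = VAR(y).fit(maxlags=4, ic="aic")       # lag order chosen by AIC
print(results.k_ar)                             # selected lag length
forecast = results.forecast(y[-results.k_ar:], steps=8)   # 8-step-ahead forecasts
```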

 

Fri 26 Jan, '18: Algorithms and Computationally Intensive Inference seminars, C1.06
Fri 26 Jan, '18: OxWaSP Mini-Symposia, MS_B3.03
Mon 29 Jan, '18: Assistant or Associate Professor Presentations, D1.07
Tue 30 Jan, '18: C1.06 - Simulation Reading Group
Tue 30 Jan, '18: YRM, Common Room (C0.06)
Wed 31 Jan, '18: PhD Open Day, Common Room (C0.06)
Wed 31 Jan, '18: Management Group, C0.08
Wed 31 Jan, '18: Probability Seminars, B3.02
Thu 1 Feb, '18: C1.06 - Machine Learning Reading Group
Fri 2 Feb, '18: Algorithms and Computationally Intensive Inference seminars, C1.06
