
Abstracts

Willy Aspinall (University of Bristol and Aspinall and Associates)

Sea level change from melting ice-sheets under global warming: eliciting and modelling tail dependences with expert judgements

Dependence modelling in high-dimensional distributions is a very complex affair. The most serious error is neglecting dependence in beliefs when such dependence is in fact present. Mathematical operations combining many variables can amplify the effects of dependence: even 'negligible' dependence becomes important when it extends over many variables. Dependence can be represented in risk analysis by using copulas.

The 'normal copula' is the dependence structure of the joint normal distribution, expressed as dependence between percentiles. However, the normal copula has tail independence; that is, given that one variable is above its u-th quantile, the probability that another variable is also above its u-th quantile tends to the independent probability 1 − u as u → 1. In particular, Pr[Y > u-th quantile | X > u-th quantile] goes to zero as u goes to 1. With the normal copula joining X and Y, the take-home message is: "if something very bad happens to X, there is no reason for extra concern about Y, even if X and Y are positively correlated". (Most commercial packages use the normal copula, and the silent assumption of tail independence has been credited with excessive risk taking on Wall Street.)

Expressing dependence in this way facilitates applying a dependence structure to other (non-normal) variables, and a tail-dependent copula shows very different behaviour. The Gumbel copula is one simple structure for capturing upper tail dependence: the conditional exceedance Pr[Y > u-th quantile | X > u-th quantile] converges to a positive constant, which parameterizes the Gumbel family. Other popular copula families are the Frank and the (reverse) Clayton: the Frank is 'more tail independent' than the normal, and the reverse Clayton is more (upper) tail dependent than the Gumbel. Learning, say, that X is above its 95th percentile now raises the probability of Y also exceeding its 95th percentile, and the message from a tail-dependent copula analysis is: if something bad happens to X, there is good reason to fear something bad will happen to Y, and the reasons get stronger as the coupling between X and Y increases and as "bad" gets "worse".

Inducing tail dependence through expert elicitation is described for the case of quantifying estimates of sea level rise due to ice sheets melting under global warming, and the associated uncertainties. The goal of dependence elicitation from experts is quite modest: avoid the egregious errors, such as neglecting dependence.
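
A minimal numerical sketch of this contrast (an editorial illustration, not part of the elicitation study): the Gumbel copula has the closed form C(u,u) = u^(2^(1/θ)), so both conditional exceedance probabilities can be computed exactly; the parameter choices ρ = 0.6 and θ = 2 are arbitrary.

```python
# Compare the conditional tail exceedance Pr[Y > q_u | X > q_u]
# under a normal copula vs a Gumbel copula.
import numpy as np
from scipy.stats import norm, multivariate_normal

def normal_copula_cond_exceedance(u, rho):
    """Pr[V > u | U > u] for the bivariate normal copula with correlation rho."""
    q = norm.ppf(u)
    mvn = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, rho], [rho, 1.0]])
    joint = 1.0 - 2.0 * u + mvn.cdf([q, q])   # Pr[U > u, V > u]
    return joint / (1.0 - u)

def gumbel_copula_cond_exceedance(u, theta):
    """Pr[V > u | U > u] for the Gumbel copula, using C(u,u) = u**(2**(1/theta))."""
    joint = 1.0 - 2.0 * u + u ** (2.0 ** (1.0 / theta))
    return joint / (1.0 - u)

for u in [0.90, 0.95, 0.99, 0.999]:
    print(f"u={u}: normal(rho=0.6) -> {normal_copula_cond_exceedance(u, 0.6):.3f}, "
          f"Gumbel(theta=2) -> {gumbel_copula_cond_exceedance(u, 2.0):.3f}")

# The normal-copula probability decays towards 0 as u -> 1, while the Gumbel
# value converges to the tail-dependence coefficient 2 - 2**(1/theta) ≈ 0.586.
```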



Jean-Philippe Bouchaud (Capital Fund Management S.A. and Ecole Polytechnique)

Tipping points and crises in simple macroeconomic models

We explore the possible types of phenomena that simple macroeconomic models with interactions, frictions and heterogeneities can reproduce. We propose a methodology, inspired by statistical physics, that characterizes a model through its "phase diagram" in the space of parameters. Through this looking glass, we investigate three stylized models (networks of interacting firms; agent-based models of firms and households; and dynamical trust networks à la Marsili et al.). In each case we find generic phase transitions (or tipping points) between a "good economy" state, where unemployment/volatility are low and confidence is high, and a "bad economy" state, where unemployment/volatility are high and confidence is low. If the parameters are such that the system is close to such a transition, any small fluctuation may be amplified, leading to a high level of endogenous volatility. This can cause monetary policy itself to trigger instabilities and be counter-productive. We identify several theoretical scenarios for synchronization and instabilities in large economies that can generate aggregate volatility and acute crises without any identifiable idiosyncratic shocks. This suggests an interesting resolution of the "small shocks, large business cycles" puzzle.
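
A toy illustration of such a tipping point (an editorial sketch, not one of the three models above): a scalar mean-field "confidence" map x ↦ tanh(βx + h) is monostable for weak coupling β < 1 and bistable for β > 1, so near the transition a small shock h can tip the aggregate between a "good" and a "bad" state.

```python
# Mean-field confidence dynamic x_{t+1} = tanh(beta * x_t + h): above the
# phase transition at beta = 1 it develops two stable states, so a transient
# shock to h can flip the economy from one branch to the other.
import numpy as np

def fixed_point(beta, h, x0, n_iter=2000):
    """Iterate the map from initial state x0 to its attracting fixed point."""
    x = x0
    for _ in range(n_iter):
        x = np.tanh(beta * x + h)
    return x

for beta in [0.5, 1.5]:  # below / above the transition at beta = 1
    up = fixed_point(beta, h=0.0, x0=+0.5)
    down = fixed_point(beta, h=0.0, x0=-0.5)
    print(f"beta={beta}: fixed points from above/below = {up:.3f}, {down:.3f}")

# For beta=0.5 both initial conditions converge to 0 (a single phase);
# for beta=1.5 they converge to distinct +/- states (bistability).
```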


Giulia Iori (City University London)

Agent based modeling approaches to Systemic Risk

The 2007-2008 Financial and Economic Crisis highlighted the importance of interconnectedness among financial institutions and markets, and the inadequacy of pre-crisis supervisory regimes that focused on micro-prudential regulation, i.e. monitoring financial stability at the level of individual financial institutions, while neglecting macro-prudential regulation, which would directly target systemic instability by focusing on the interconnectedness of the system. A strict objective of maximizing stability at the level of individual institutions can indeed have the unexpected effect of decreasing systemic stability.

In this talk I will present an overview of my research addressing the role of interbank markets in both promoting and undermining the systemic stability of the banking system. The approach is based on a combination of empirical analysis of interbank network data and the development of Agent Based Models (ABMs). ABMs have the advantage of simplifying behaviour at the individual level by assuming that agents follow given but evolving rules of thumb, which makes it possible to explore the multiplicity of agent types and their inter-connections in far greater detail. In particular, ABMs can follow the behaviour of agents in rapidly evolving dynamic settings and show how this both determines and is determined by the emergence of crises and collapses.

I will show that when banks are homogeneous in size and risk characteristics, the interbank market acts as an effective shock absorber for individual fluctuations in liquidity needs. But when banks are heterogeneous, contagion effects may arise. Direct knock-on contagion driven by creditor defaults, while increasing with connectivity, explains only a small percentage of the overall failures. Rather, simultaneous defaults arise spontaneously as the system reaches a critical state through its own intrinsic dynamics. Instability builds up as liquidity is depleted from the system, leading to funding contagion, in a fashion that resembles self-organized criticality in physical systems.
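
A minimal sketch of the direct knock-on channel (an editorial illustration with arbitrary balance-sheet numbers, not the model in the talk): banks on a random directed lending network, where a default wipes out the corresponding loans and pushes any creditor whose capital buffer is exhausted into default in turn.

```python
# Knock-on default cascades on a random interbank network.
import numpy as np

rng = np.random.default_rng(0)

N = 100                   # number of banks
p = 0.05                  # probability of a directed lending link
exposure = 1.0            # size of each interbank loan
# Heterogeneous capital buffers: banks with capital below one exposure fail
# on a single write-off, echoing the role of heterogeneity in the abstract.
capital = rng.uniform(0.5, 3.0, size=N)

# loans[i, j] = True means bank j lent 'exposure' to bank i (j is i's creditor)
loans = rng.random((N, N)) < p
np.fill_diagonal(loans, False)

def cascade(first_default):
    """Propagate knock-on defaults; return the total number of failed banks."""
    losses = np.zeros(N)
    defaulted = np.zeros(N, dtype=bool)
    defaulted[first_default] = True
    frontier = [first_default]
    while frontier:
        nxt = []
        for b in frontier:
            creditors = np.where(loans[b])[0]     # banks that lent to b
            losses[creditors] += exposure         # they write off the loan
            newly = creditors[(losses[creditors] > capital[creditors])
                              & ~defaulted[creditors]]
            defaulted[newly] = True
            nxt.extend(newly.tolist())
        frontier = nxt
    return defaulted.sum()

sizes = [cascade(b) for b in range(N)]
print(f"mean cascade size {np.mean(sizes):.2f}, max {max(sizes)} of {N}")
```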


Gordon Woo (Risk Management Solutions)

Counterfactual analysis of extreme events

A standard starting point for forecasting extreme events is a catalogue of past historical events. But from a physicist’s perspective, history is just one realization of what might have happened. The historical past is not pre-ordained, but is subject to stochastic variability. Stochastic simulation is often undertaken prospectively to generate future extreme event scenarios. Stochastic simulation can also be undertaken retrospectively to generate past extreme event scenarios. This is the basis of counterfactual disaster analysis.
Signals of hidden future disasters are embedded in the past. Historical event catalogues are generally treated deterministically, as a fixed data platform upon which future statistical analysis may be conducted. However, the sparseness of extreme event datasets severely limits the forecasting capability of such analyses. Through stochastic simulation of the past, much deeper risk insight into disaster phenomena is achievable. In particular, the dynamical perturbations that drive a system to disaster can be understood; surprising, hitherto unknown, apparently unforecastable events may be anticipated and modelled; and correlations can be better understood. Examples are given of the application of counterfactual disaster analysis to a broad and diverse range of natural and man-made disasters.
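
A hedged sketch of the idea (an editorial illustration with invented numbers, not the speaker's method): perturb a small hypothetical catalogue of event magnitudes to generate counterfactual histories, and count how many contain an exceedance of a threshold that was never observed historically.

```python
# Counterfactual re-simulation of a historical event catalogue: the observed
# record is treated as one stochastic realization of many plausible pasts.
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical historical catalogue of event magnitudes (arbitrary units).
catalogue = np.array([5.1, 5.4, 5.9, 6.2, 6.3, 6.8, 7.0, 7.4])
threshold = 7.5          # the "disaster" level never observed historically
sigma = 0.3              # assumed stochastic variability of each magnitude

n_histories = 100_000
perturbed = catalogue + rng.normal(0.0, sigma, size=(n_histories, len(catalogue)))
hit = (perturbed >= threshold).any(axis=1)

print(f"historical exceedances of {threshold}: {(catalogue >= threshold).sum()}")
print(f"fraction of counterfactual histories with an exceedance: {hit.mean():.3f}")

# Even though the observed record contains no event above the threshold, a
# non-trivial fraction of equally plausible pasts do: a near-miss signal that
# a purely deterministic reading of history would not reveal.
```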