
statement

(UPDATE: For examples of my most recent work, please use the links to project reports on the left of this page.)

(All students are well advised to put some content on their webpage. But the DTC has kept me busy and I have written nothing recently. So join me on a trip back to the last days of 2008 :) You find me living in Camden Town, bussing every day to work at a hedge fund in the City, and writing this "statement in support" for PhD applications...)

As I write, the media continue to assail us with stories of 'financial meltdown'. Perversely, it is the 'freezing' of credit lines that we are asked to blame. It does seem that the commentators reckon on some sort of phase transition. But they are less clear on its precise nature.

I should like to engage in work designed to shed light on this issue. Advances in statistical mechanics and network analysis have equipped us with a valuable set of tools for the study of complex systems, be they inside or outside the traditional domain of physics. It is extremely tempting to regard a marketplace as just another many-particle system. To what extent can an analysis along such lines aid us in understanding the market's fluctuations?

The market is a creature whose study naturally falls to economics. That subject has a rich history of argument as to its nature and as to the best means of its regulation. But events have underlined the fact that no single viewpoint in economics holds a position of unchallenged dominance. Is short-selling inherently a bad thing? Can that months-long oil price high be blamed on uncontrolled speculation? Our government is newly Keynesian, while booksellers report a resurgence of interest in Marx! The debate is reinvigorated--and we may hope that through new insight it is possible to influence its direction.

It has been many years since mathematicians and natural scientists acknowledged that complex nonlinear systems can, and frequently do, give rise to behaviour dramatically different to the range of possibilities suggested by equilibrium analysis. Moreover, since the advent of tools such as generating functional analysis that allow us to examine directly the dynamics of large systems, we are free to consider the larger class of non-equilibrium models.

My own experience of the financial world is a function of my work as a quantitative analyst in the City, generating the numbers used by traders and risk managers for the pricing and hedging of derivatives. My employer has been kind enough to sponsor me through Paul Wilmott's Certificate in Quantitative Finance (CQF) and Level I of the Chartered Financial Analyst (CFA) programme.

I have learned that industry practice allows, and even expects, that we ignore the elephant in the room--that is, that we ignore the overwhelming evidence that our most basic assumptions regarding the evolution of prices are flawed. Even the most elaborate models, of the type used to price complex correlation products, cling to the Gaussian distribution. One consequence is that recent times have been witness to such anomalous spectacles as implied correlation in excess of unity. It was only to be expected that when troubles came, the inadequacy of these models would come in for criticism from those in the know. A letter in last week's Financial Times uses a story to illustrate:

A mathematician is looking for his lost wallet under the only lamppost that is lit on a dark street. A passing man enquires: "Just where did you lose your wallet?" The mathematician points into the darkness and says: "Over there." The passer-by is puzzled and asks: "Then why are you looking for it here?" The mathematician replies: "Because here there is light."

But there is less of an excuse than ever before to confine the search in this way. To the harassed industry quant, power laws and volatility clustering may seem no more than mathematical inconveniences. But to the investigative scientist, they offer a jumping-off point into a fascinating discussion of the true mechanisms underlying market dynamics: 'Here is behaviour of a type we see near phase transitions; might we be looking at self-organised criticality?'
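(A minimal sketch for the curious reader, added to this web version rather than taken from the original statement: a toy GARCH(1,1) process in Python, with arbitrary parameter values, exhibits both stylised facts at once. Returns come out fat-tailed relative to the Gaussian, and their squares remain autocorrelated over many lags while the returns themselves do not.)

    import numpy as np

    rng = np.random.default_rng(0)
    T = 100_000
    omega, alpha, beta = 0.05, 0.10, 0.85     # arbitrary GARCH(1,1) parameters
    r = np.zeros(T)
    sigma2 = omega / (1 - alpha - beta)       # start at the stationary variance
    for t in range(1, T):
        sigma2 = omega + alpha * r[t - 1]**2 + beta * sigma2
        r[t] = np.sqrt(sigma2) * rng.standard_normal()

    def autocorr(x, lag):
        x = x - x.mean()
        return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

    kurt = np.mean(r**4) / np.mean(r**2)**2   # equals 3 for a Gaussian
    print(f"kurtosis {kurt:.2f} (Gaussian: 3)")
    for lag in (1, 5, 25):
        print(f"lag {lag:>2}: ac(r) {autocorr(r, lag):+.3f}  ac(r^2) {autocorr(r**2, lag):+.3f}")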

The market is a difficult object of study. Just as in models of physical or biological systems, it is necessary to make simplifying assumptions at the microscopic level. But in a market model, these simplifications are likely to be more dramatic by orders of magnitude, so that we run a genuine risk of losing connection to the observed object. It may be comparatively easy to work more intricacy into a model intended only to live on a computer. But while such a model may be helpful during preliminary investigations, it is likely to prove resistant to analytical treatment and therefore be limited in its contribution to real understanding. So our challenge is to develop models that incorporate sufficient detail to reproduce those features or stylised facts in which we are interested, but that remain susceptible to an analysis of depth--one through which it becomes possible to identify and to quantify both key characteristics of the model and key phenomena emergent in its operation.

Models that develop the Minority Game have promise in this regard. The basic MG does indeed represent a simplification of dramatic proportions. However, what it retains are those features that might plausibly be identified as the drivers of excess volatility. Speculative strategies of an entirely technical nature, heterogeneous and exemplifying bounded rationality, compete in a system equipped with the minority mechanism. This mechanism frustrates the rise to dominance of any single strategy--it is supposed to represent the dangers of the crowded trade, the frustration inherent in a competition for limited resources. Perhaps remarkably, the MG retains a high degree of tractability. It may be discussed in the language of correlation and response functions. Variations that build on the results of the basic model, as if in a perturbative fashion, may lead to more realistic descriptions.
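(The basic game is short enough to state in full. The Python below is my own illustrative sketch with hypothetical parameter choices, following the standard formulation: N agents each hold S fixed random strategies keyed on the last m minority outcomes, and at every step play whatever their best-scoring strategy dictates. The quantity usually studied is the volatility sigma^2/N of the attendance A.)

    import numpy as np

    rng = np.random.default_rng(1)
    N, S, m, T = 301, 2, 5, 5000                # N odd, so a strict minority always exists
    P = 2**m                                    # number of possible history strings
    strategies = rng.choice([-1, 1], size=(N, S, P))   # fixed random lookup tables
    scores = np.zeros((N, S))
    mu = rng.integers(P)                        # current history, encoded as an integer
    attendance = []
    for t in range(T):
        best = scores.argmax(axis=1)            # each agent's best strategy (ties -> first)
        a = strategies[np.arange(N), best, mu]  # -1 = sell, +1 = buy
        A = a.sum()
        attendance.append(A)
        # minority payoff: a strategy gains when its action opposes the majority sign
        scores -= strategies[:, :, mu] * np.sign(A)
        mu = ((mu << 1) | int(A < 0)) % P       # append the winning (minority) bit
    A = np.array(attendance[T // 10:])          # discard the transient
    print(f"sigma^2/N = {A.var() / N:.3f}  at alpha = P/N = {P / N:.3f}")

(With these numbers alpha = P/N is about 0.11, below the critical point near 0.34, so the run sits in the crowded, non-ergodic phase of the game.)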

I have held an interest in the science of complexity for some years, since first studying the mathematics of neural networks during my Master's degree. My dissertation, conducted under the supervision of Prof ACC Coolen, built on his work on large systems of coupled oscillators. It considered a generalisation of the Kuramoto model in which interactions between pairs of oscillators are characterised by a value α representing the preferred phase relation. Its conclusion presented an original analytic expression for the free energy for arbitrary α.
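(For a flavour of the model: the special case of a single uniform phase lag α is the Sakaguchi-Kuramoto model, which is easy to simulate; the dissertation's general case allowed α to differ from pair to pair. Below is a minimal Euler-integration sketch in Python, with hypothetical values for the coupling K and the lag, using the usual mean-field reduction of the interaction sum.)

    import numpy as np

    rng = np.random.default_rng(2)
    N, K, alpha = 1000, 2.0, 0.4          # hypothetical coupling strength and phase lag
    dt, steps = 0.01, 20_000
    omega = rng.standard_normal(N)        # natural frequencies, unit Gaussian
    theta = rng.uniform(0, 2 * np.pi, N)
    for _ in range(steps):
        z = np.exp(1j * theta).mean()     # complex order parameter r e^{i psi}
        # mean-field form of (K/N) sum_j sin(theta_j - theta_i + alpha)
        drift = omega + K * abs(z) * np.sin(np.angle(z) - theta + alpha)
        theta = (theta + dt * drift) % (2 * np.pi)
    print(f"order parameter r = {abs(np.exp(1j * theta).mean()):.3f}")

(Above the synchronisation threshold the printed r settles well away from zero; below it, r falls to the level of finite-size noise, of order N^-1/2.)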

Subsequent work on pattern-recognition tasks has allowed me to pursue this interest. At the Department of Work and Pensions and at Moneybox (a deployer of ATMs), I developed learning machines that would estimate outcomes based on large sets of characteristics--of benefits claimants in the first case and of ATM sites in the second.

Other work has seen me dabble in social networks (such as might be used to model information flows in a market economy). At the BBC I designed and implemented an immersive interface through which the user could interact with the social graph in real time. It used an open source 3D engine to visualise a virtual world of friends and associates, with a spatial structure determined by interaction strengths through a visibly ongoing process analogous to simulated annealing. Earlier, at Rebellion (a maker of computer games), I designed and built a sort of proto-MySpace for handheld consoles. Profiles and comments would diffuse through the network of Game Boy-equipped children in a series of one-to-one exchanges using the built-in infrared comms.
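(The layout idea is simple to reconstruct in miniature. The Python below is an illustrative toy, not the BBC code: it anneals a 2D layout under a weighted-stress energy, so that strongly interacting nodes settle close together, using the Metropolis acceptance rule with a slowly falling temperature. In the real interface the equivalent relaxation ran continuously, which is what made the process visible to the user.)

    import numpy as np

    rng = np.random.default_rng(3)
    n = 30
    w = rng.random((n, n))                    # symmetric interaction strengths in [0, 1]
    w = (w + w.T) / 2
    np.fill_diagonal(w, 0)
    pos = rng.uniform(-1, 1, (n, 2))          # initial random 2D placement

    def energy(p):
        d = np.linalg.norm(p[:, None] - p[None, :], axis=2)
        target = 1.0 / (0.1 + w)              # strong ties -> short preferred distance
        return np.sum(w * (d - target)**2) / 2

    E, temp = energy(pos), 1.0
    for step in range(20_000):
        trial = pos.copy()
        trial[rng.integers(n)] += rng.normal(scale=0.05, size=2)   # nudge one node
        dE = energy(trial) - E
        if dE < 0 or rng.random() < np.exp(-dE / temp):            # Metropolis rule
            pos, E = trial, E + dE
        temp *= 0.9997                        # slow geometric cooling
    print(f"final layout energy: {E:.1f}")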

In this field we are often lucky enough to have access to large volumes of relevant historical data, and I might expect to find use for my experience in handling large data sets. I have programmed in both Windows and Unix environments, and am conversant with a range of technologies such as C++, Java, SQL, Excel/VBA and J (an APL-like language).