
Full Abstracts

Friday July 11th:

  • 10.00am - 11.00am, Jesper Andreasen, Danske Bank

 "Soft Markets: Feedback Effects from Dynamic Hedging"

 

  • 11.30am - 12.30pm, Pat Hagan, JPMorgan Chase

"Practical Calibration Techniques for Exotic Options"

Mathematical models cannot be used to price exotic deals until the model parameters have been set through the calibration process. One first selects a set of vanilla instruments whose market prices are known moment by moment throughout the day. One then sets the model parameters by requiring that a) the model matches today's discount curve exactly, and b) the model's predicted prices of the vanilla instruments match their market values, either exactly or in a least-squares sense. Then the model can be used to price the exotic deal. So, in a way, the model is "interpolating" the price of the complex exotic from the vanilla instruments used in calibration. Here we investigate different strategies for choosing the calibration instruments, and how this choice affects the final price and hedges of the exotics.
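
As a minimal illustration of the least-squares leg of this process, the sketch below fits a hypothetical two-parameter smile model to a handful of vanilla quotes; the model, the parameters a and b, and the numbers are all invented for the example, and a production calibration would price with the actual model (and may also weight the instruments).

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical model: implied vol as a function of strike with two free
# parameters (level a, skew b). Stands in for a real model whose vanilla
# prices come from a pricing routine.
def model_vol(params, strikes, forward):
    a, b = params
    return a + b * np.log(strikes / forward)

# Market quotes for the chosen calibration instruments (illustrative numbers).
forward = 100.0
strikes = np.array([80.0, 90.0, 100.0, 110.0, 120.0])
market_vols = np.array([0.25, 0.22, 0.20, 0.19, 0.185])

# Least-squares calibration: choose parameters so the model reproduces the
# vanilla quotes as closely as possible (an exact fit is rarely attainable).
def residuals(params):
    return model_vol(params, strikes, forward) - market_vols

fit = least_squares(residuals, x0=[0.2, 0.0])
print("calibrated (a, b):", fit.x, "rms error:", np.sqrt(np.mean(fit.fun**2)))
```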

 

  • 2.00pm - 3.00pm, Nasir Afaf, Commerzbank

"Non-linearities in Finance"

 

 

  • 3.00pm - 4.00pm, Claudio Albanese, Level3Finance

"Long-Dated Derivatives"

Long-dated derivatives require a flexible modeling framework. The econometric challenge is to embed historical and cross-sectional estimations into derivative calibration. The engineering challenge is to structure a model-agnostic pricing engine whose performance depends only on the model size, not on the process specification. The mathematical and numerical challenge is to understand and use the smoothing mechanisms behind diffusion equations. We illustrate through examples an efficient framework of this sort, based on direct kernel manipulations and operator algebraic methods. We find that fully explicit discretization schemes provide a robust, low-noise numerical valuation method for fundamental solutions of diffusion equations and their derivatives. Path-dependent options are associated with an operator algebra and can be classified into Abelian and non-Abelian: block-diagonalizations and moment methods apply to the former and block-factorizations to the latter.

Direct kernel manipulations also allow one to correlate lattice models by means of dynamic conditioning across even hundreds of factors without incurring the curse of dimensionality. Thanks to the internal smoothing mechanisms, calculations are best executed in single-precision floating-point arithmetic, and staggering performance can be achieved by invoking BLAS Level-3 routines on massively parallel chipsets such as GPUs and the Cell BE. Examples to be discussed include the swaption volatility cube calibration, CMSs and CMS spreads, snowballs, synthetic and bespoke CDOs, long-dated equity structures and volatility derivatives. The list is long, but the model-agnostic math is just the same.
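
As a rough illustration of the fully explicit, BLAS-3-friendly kernel construction described above (not Albanese's engine; the grid, the sample sigma(x) and the absorbing grid ends are all assumptions of this sketch), one can propagate a transition kernel by repeated matrix squaring in single precision:

```python
import numpy as np

# Grid and a hypothetical local-volatility diffusion dX = sigma(x) dW.
n, dx = 200, 0.05
x = np.arange(n, dtype=np.float32) * dx
sigma = np.float32(0.3) * (1.0 + 0.5 * np.exp(-x))   # illustrative sigma(x)

# Tridiagonal generator L of the diffusion (absorbing at the grid ends).
L = np.zeros((n, n), dtype=np.float32)
d = 0.5 * sigma**2 / dx**2
for i in range(1, n - 1):
    L[i, i - 1] = d[i]
    L[i, i + 1] = d[i]
    L[i, i]     = -2.0 * d[i]

# Fully explicit elementary propagator over a small step dt; the stability
# bound dt <= dx^2 / sigma^2 keeps P a (sub)stochastic matrix.
dt = np.float32(0.5 * dx**2 / sigma.max()**2)
P = np.eye(n, dtype=np.float32) + dt * L

# Fast exponentiation: the kernel over time 2^12 * dt via repeated squaring.
# Each squaring is one BLAS Level-3 call (sgemm under the hood).
K = P.copy()
for _ in range(12):
    K = K @ K

print("row sums (should stay near 1):", K.sum(axis=1)[:5])
```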

 

  • 4.30pm - 5.00pm, Chris Kenyon, Depfa Bank

"Pricing Strongly Path-Dependent Options in Libor Market Models without Simulation"

Path-dependent options in Fixed Income are typically priced using Monte Carlo implementations of LIBOR market models (LMMs). We present an alternative to Monte Carlo that scales with the number of underlying factors driving the LMM for typical strongly path-dependent options in Fixed Income, such as target accrual redemption notes (TARNs) and Snowblades. Our method is a generalization of that developed by [HW04] for pricing CDOs without simulation, which works by conditioning on the driving factors so that the distributions of the underlying at different times are independent. Payoff distributions are built up using an iterative scheme, and the option price is then calculated by integrating over the driving factors. We extend this method to strongly path-dependent payoffs (e.g. TARNs and Snowblades) in the context of LMMs by adapting the iterative build scheme and including successive measure changes. Unlike the CDO case, TARN and Snowblade payoffs depend on the whole of the previous history. The main limitation of the extended method with the LMM is that the measure changes for the drift are given by the usual freezing-the-forwards approximation.

This extended method is particularly appropriate in the Fixed Income world because observations and payments usually occur at discrete times. Discretely observed analytic option pricing schemes from the Equity world are not appropriate because of the successive measure changes required by our scheme (and in general by Fixed Income). In the specific case of TARNs and Snowblades, the conditional coupon payoff distributions can be expressed analytically. The method can also be combined with Monte Carlo for integrating over the driving factors, when there are, say, four significant eigenvalues of the driving matrix. The advantage over straight Monte Carlo lies in the summarizing capacity of the conditional distributions and depends on the ease with which these can be calculated. We also describe an extension of the model to include volatility smiles.
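
The iterative build of the payoff distribution can be illustrated with a toy TARN: assuming the per-period coupon distributions are independent once conditioned on the driving factors, the running-coupon distribution is rolled forward period by period, knocking out once the target is hit. The flat coupon grid, all numbers, and the omission of discounting and measure changes are simplifications of this sketch:

```python
import numpy as np

# Hypothetical per-period coupon distribution, assumed independent across
# periods once conditioned on the driving factors (HW04-style conditioning).
coupon_vals  = np.array([0.00, 0.01, 0.02, 0.03])
coupon_probs = np.array([0.40, 0.30, 0.20, 0.10])

target, n_periods, step = 0.05, 8, 0.01
grid = np.arange(0.0, target + step, step)           # accrued-coupon grid
idx = {round(v, 4): k for k, v in enumerate(grid)}

dist = np.zeros(len(grid)); dist[0] = 1.0            # P(accrued = s, alive)
expected_coupons = 0.0
for _ in range(n_periods):
    new = np.zeros_like(dist)
    for k, s in enumerate(grid):
        if dist[k] == 0.0 or s >= target:            # already knocked out
            continue
        for c, p in zip(coupon_vals, coupon_probs):
            paid = min(c, target - s)                # TARN caps the final coupon
            expected_coupons += dist[k] * p * paid
            new[idx[round(s + paid, 4)]] += dist[k] * p
    dist = new

print("expected total coupons:", expected_coupons)
```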

 

  • 5.00pm - 5.30pm, Paul Schneider, Vienna University of Economics and Business Administration

"Flexing the Default Barrier"

We introduce a generalized Black & Cox-type structural model for defaultable claims in which the default barrier is allowed to vary freely with maturity. The model is an extension of the work of, e.g., Brigo and Morini (2006), who choose a specific form for the default barrier for tractability. Our paper proposes a more flexible setup featuring an arbitrary deterministic default boundary function. Our approach uses the Green function to solve the Black-Scholes pricing PDE with time-dependent coefficients and boundary conditions, and originates from the numerical valuation of barrier options (cf. Rapisarda, 2003). We develop a computationally efficient and precise numerical calibration procedure which is tailored to the CDS calibration problem and exhibits attractive numerical stability. Our method accommodates calibration of CDS consistent with both the option-implied volatility term structure and the current proportion of debt.

Since it is desirable to price consistently and calibrate sequentially across markets, this feature of our methodology presents a strong advantage over existing models. With exogenous estimates for asset volatility, dividend yield and interest rates, the implied default barrier is calibrated such that a given maturity cross-section of CDS contracts is perfectly fitted. We take our model to market data and apply our method to CDS contracts written on two exemplary obligors, yielding the following stylized findings: the level and slope of the CDS-implied default barrier are related to market perceptions of changes in the future financial situation of an obligor, and fluctuations in the implied barrier function on consecutive days are small, indicating in particular the stability of our method.

The results show that the barrier structure employed in, e.g., Brigo and Morini (2006), which implies exponential growth, is not satisfactory and could prove misleading when describing the term structure of market expectations, because the boundary is shown to take on downward-humped shapes as well, implying slower growth in conditional default probabilities over certain periods. Firm leverage emerges (through asset volatility) as a stronger influence on the default barrier than equity volatility in the short term, though it seems to have no effect in the long term. Longer maturities in the implied default barrier are primarily affected by movements in CDS spreads. Several applications of both practical and academic interest arise naturally with our model. Survival probabilities computed within our framework reflect information contained in equity volatility, interest rates, CDS premia, and capital structure. They are therefore suitable for pricing and fair-value accounting, and particularly for empirical studies concerning equity and default risk at the same time. Finally, the analytic framework together with the numerical solution algorithm developed in this paper can be extended to accommodate optimal-control problems, such as the determination of optimal debt covenants or endogenous refinancing times.
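
As a brute-force cross-check of such a flexible-barrier setup (this is a plain explicit finite-difference solver, not the paper's Green-function scheme; the barrier H(t) and all parameters are illustrative), one can compute the survival probability under a deterministic, time-varying barrier directly:

```python
import numpy as np

# Survival probability for a GBM asset V with deterministic barrier H(t):
# solve the backward PDE for u(t, x) = P(no barrier hit in [t, T] | V_t = e^x)
# on a log-asset grid, absorbing the solution below the barrier each step.
r, q, sigma, T = 0.03, 0.0, 0.25, 5.0
V0 = 100.0
H = lambda t: 60.0 * np.exp(0.02 * t)        # hypothetical growing barrier

nx, nt = 400, 6000                           # nt chosen to satisfy dt <= dx^2/sigma^2
x = np.linspace(np.log(20.0), np.log(500.0), nx)
dx, dt = x[1] - x[0], T / nt
mu = r - q - 0.5 * sigma**2                  # drift of log V under pricing measure

u = np.ones(nx)                              # u(T, x) = 1 (survived to maturity)
for step in range(nt, 0, -1):
    t = (step - 1) * dt
    lap = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
    drf = (u[2:] - u[:-2]) / (2.0 * dx)
    u[1:-1] += dt * (0.5 * sigma**2 * lap + mu * drf)
    u[0], u[-1] = 0.0, 1.0                   # hit far below, safe far above
    u[x <= np.log(H(t))] = 0.0               # absorb at the moving barrier

print("survival probability to T:", np.interp(np.log(V0), x, u))
```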

 

  • 5.30pm - 6.00pm, Anthony Neuberger, University of Warwick

"Robust Hedging of American Options"

The essential feature of American-style claims lies in the holder's right to time the exercise decision. The value of the claim depends on the information about future prices that the holder will acquire over time. Much of the literature makes restrictive assumptions about information revelation, for example that the underlying price process is Markov. This paper explores the upper bound on the price of an American option, placing no assumptions on the information structure. The analysis provides insight into the processes that make the American feature valuable, and points the way to hedging strategies for American options that are robust to model error.

Saturday July 12th:

  • 10.00am - 11.00am, Nick Webber, University of Warwick

"Implementing numerical methods for complex options"

 

  • 11.30am - 12.30pm, Uwe Wystup, MathFinance AG

"Pricing of First Generation Exotics with the Vanna-Volga Method: Pros and Cons"

The vanna-volga method, also called the "traders' rule of thumb," is an empirical procedure that can be used to infer an implied-volatility smile from three available quotes for a given maturity. It is based on the construction of locally replicating portfolios whose associated hedging costs are added to the corresponding Black-Scholes prices to produce smile-consistent values. Besides being intuitive and easy to implement, this procedure has a clear financial interpretation, which further supports its use in practice. In fact, SuperDerivatives has implemented a variant of this method in their pricing platform, as one can read in the patent that SuperDerivatives has filed.

The VV method is commonly used in foreign exchange options markets, where three main volatility quotes are typically available for a given market maturity: the delta-neutral straddle, referred to as at-the-money (ATM); the risk reversal for 25 delta call and put; and the (vega-weighted) butterfly with 25 delta wings. The application of vanna-volga pricing allows us to derive implied volatilities for any option’s delta, in particular for those outside the basic range set by the 25 delta put and call quotes.
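
A minimal sketch of the construction, assuming zero domestic and foreign rates and illustrative pivot quotes: the weights are chosen so the replicating portfolio matches the vega, vanna and volga of the target option at the ATM volatility, and the pivots' market-minus-flat-vol costs are then added to the Black-Scholes price.

```python
import numpy as np
from scipy.stats import norm

# Black-Scholes price and the three greeks used by vanna-volga.
# Zero rates are assumed throughout to keep the sketch short; the strikes
# and volatilities below are illustrative, not market data.
def bs(S, K, T, vol, cp=1):
    d1 = (np.log(S / K) + 0.5 * vol**2 * T) / (vol * np.sqrt(T))
    d2 = d1 - vol * np.sqrt(T)
    return cp * (S * norm.cdf(cp * d1) - K * norm.cdf(cp * d2))

def greeks(S, K, T, vol):
    d1 = (np.log(S / K) + 0.5 * vol**2 * T) / (vol * np.sqrt(T))
    d2 = d1 - vol * np.sqrt(T)
    vega  = S * norm.pdf(d1) * np.sqrt(T)
    vanna = -norm.pdf(d1) * d2 / vol
    volga = vega * d1 * d2 / vol
    return np.array([vega, vanna, volga])

S, T, atm_vol = 1.30, 0.5, 0.10
pivots = [(1.22, 0.112), (1.30, 0.100), (1.39, 0.105)]   # (strike, market vol)

def vanna_volga_price(K, cp=1):
    # Weights x solve: greeks(target) = sum_i x_i * greeks(pivot_i),
    # with all greeks evaluated at the ATM volatility.
    A = np.column_stack([greeks(S, Ki, T, atm_vol) for Ki, _ in pivots])
    x = np.linalg.solve(A, greeks(S, K, T, atm_vol))
    # Smile cost = market value minus flat-ATM-vol value of each pivot.
    cost = sum(xi * (bs(S, Ki, T, vi) - bs(S, Ki, T, atm_vol))
               for xi, (Ki, vi) in zip(x, pivots))
    return bs(S, K, T, atm_vol, cp) + cost

print("VV call price at K=1.35:", vanna_volga_price(1.35))
```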

In the financial literature, the vanna-volga approach was introduced by Lipton and McGhee (2002), who compare different approaches to the pricing of double-no-touch options, and by Wystup (2003), who describes its application to the valuation of one-touch options. The vanna-volga procedure is reviewed in more detail, and some important results concerning the tractability of the method and its robustness are derived, by Castagna and Mercurio (2007).

 

  • 2.00pm - 2.30pm, Antoine Jacquier, Imperial College London

"Spectral Theory for Diffusion Process: Application to Pricing and Calibration of Stochastic Volatility Models"

We are interested here in two applications of spectral theory. First, when considering time-changed Lévy processes, it is not always possible to obtain the density of the process in closed form; spectral theory does provide such formulae (at least in semi-closed form). We will in particular mention time-changed Ornstein-Uhlenbeck, CIR and CEV processes. Another very useful application of spectral theory is to obtain asymptotic results for stochastic volatility processes. More specifically, we will focus on deriving the asymptotic volatility smile within the Heston framework and show how useful this result is for calibration purposes.
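
A small example of the semi-closed densities spectral theory delivers, here for a plain Ornstein-Uhlenbeck process where the exact Gaussian kernel is available as a check (parameters and the truncation order are arbitrary): the transition density is a Hermite eigenfunction series that converges rapidly.

```python
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermeval

# Spectral (Hermite) expansion of the OU transition density, truncated at N.
# dX = -kappa * X dt + sigma * dW; the He_n are eigenfunctions of the
# generator with eigenvalues -n*kappa.
kappa, sigma, t = 1.5, 0.4, 0.5
delta2 = sigma**2 / (2.0 * kappa)           # stationary variance
delta = np.sqrt(delta2)

def He(n, z):                               # probabilists' Hermite He_n(z)
    c = np.zeros(n + 1); c[n] = 1.0
    return hermeval(z, c)

def density_spectral(x, y, N=25):
    m = np.exp(-y**2 / (2 * delta2)) / np.sqrt(2 * np.pi * delta2)
    s = sum(np.exp(-n * kappa * t) / factorial(n)
            * He(n, x / delta) * He(n, y / delta) for n in range(N))
    return m * s                            # Mehler-type kernel, truncated

def density_exact(x, y):                    # exact Gaussian OU transition density
    mean = x * np.exp(-kappa * t)
    var = delta2 * (1.0 - np.exp(-2.0 * kappa * t))
    return np.exp(-(y - mean)**2 / (2 * var)) / np.sqrt(2 * np.pi * var)

x, y = 0.3, -0.2
print("spectral:", density_spectral(x, y), " exact:", density_exact(x, y))
```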

 

  • 2.30pm - 3.00pm, Andrea Pascucci, University of Bologna

"Analytic Valuation by Parametric Approximations"

In this talk we discuss the possible application of the parametrix method to the problem of pricing and hedging derivative securities. As is well known, under standard hypotheses, the price of a European option evolves according to a parabolic PDE and can be expressed in terms of the convolution of the corresponding fundamental solution with the payoff function. As a matter of fact, fundamental solutions are explicitly known only for a rather small set of models. Among these the most relevant cases are the arithmetic and geometric Brownian motions (Gaussian and lognormal densities), the general linear case (affine models studied e.g. by Duffie, Filipovic and Schachermayer in [3]), the square-root process (Feller [4], Cox, Ingersoll and Ross [2]) and classes of models derived via transforms from these models (see e.g. Albanese [1]). In view of the paramount advantages, both in terms of understanding and computation time, given by the existence of an analytical solution, actual modeling has largely been restricted to this rather small set of diffusions. On the other hand, the analytical tractability of these models is not accompanied by good statistical properties, in the sense that the distributions implied by these models give a poor fit to actual market data. This has led to a growing interest in models whose solution can be computed only by numerical methods (deterministic or Monte Carlo based).

A major problem which severely limits the use of these models is that, while their practical relevance has been found in the valuation of exotic or very far-from-the-money vanilla options, the numerical burden implied by their use for such payoffs is still far too great to allow widespread application. Note that this burden can be excessive even in the case of standard models when applied to the computation of hedging parameters for some exotic payoffs. Even if we do not consider the numerical problem, a second relevant obstacle to the implementation of more statistically satisfactory but less tractable models is that the lack of an analytical solution severely restricts the ability of the practitioner to understand, within the reaction times allowed by the market, the implications of a given model and its possible weak points. This is relevant, in particular, when real-time position risk management is required. A third and connected problem with analytically intractable models is that they do not allow for easy valuation of the consequences of model misspecification.

In an applied milieu where model risk management is becoming a central part of the financial decision-making process, such weakness is rapidly becoming a heavy burden for flexible but intractable models. Two possible ways out of this problem can be suggested. The first is to extend the class of analytically solvable models by the use of properly chosen transforms; the second is to develop tools capable of calibrating analytically computable approximations to non-analytically computable models and of evaluating the error of these approximations. This talk is concerned with the second alternative. We modify and adapt a classical tool in PDE theory, the parametrix expansion, to build approximate fundamental solutions to a generic parabolic PDE, using as a starting point the explicit solution of a simpler parabolic PDE. The approximation can be truncated to any number of terms, and easily computable error measures are available.
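
The flavour of the method can be conveyed by its leading term alone: freezing the coefficients of a hypothetical local-volatility model at the spot yields a Gaussian "parametrix" kernel, which already prices a vanilla by simple quadrature. The higher-order corrections and error bounds of the expansion are omitted in this sketch, and all parameters are illustrative.

```python
import numpy as np

# Zeroth-order parametrix: freeze the diffusion coefficient of dX = a(X) dW
# at the spot, so the leading term of the expansion is a plain Gaussian kernel.
sigma0, beta, x0, T, K = 0.3, 0.7, 100.0, 1.0, 105.0
a = lambda x: sigma0 * (np.maximum(x, 1e-8) / x0) ** (beta - 1.0) * x  # CEV-type

# Leading-order price: convolve the payoff with the frozen-coefficient kernel.
y = np.linspace(1.0, 400.0, 4000)
dy = y[1] - y[0]
var = a(x0) ** 2 * T                          # coefficient frozen at the spot
kernel = np.exp(-(y - x0) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
price_parametrix = np.sum(np.maximum(y - K, 0.0) * kernel) * dy

# Monte Carlo benchmark under the true (non-frozen) dynamics.
rng = np.random.default_rng(0)
n_paths, n_steps = 100_000, 100
dt = T / n_steps
X = np.full(n_paths, x0)
for _ in range(n_steps):
    X = np.maximum(X + a(X) * np.sqrt(dt) * rng.standard_normal(n_paths), 0.0)
price_mc = np.maximum(X - K, 0.0).mean()

print("parametrix leading term:", price_parametrix, " MC benchmark:", price_mc)
```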

 

  • 3.00pm - 3.30pm, Linus Kaisajuntti, Stockholm School of Economics

"An n-Dimensional Markov-Functional Interest Rate Model"

This paper develops an n-dimensional Markov-functional interest rate model, i.e. a model driven by an n-dimensional state process and constructed using Markov-functional techniques. It is shown that this model is very similar to an n-factor LIBOR market model, hence allowing intuition from the LIBOR market model to be transferred to the Markov-functional model. This generalises the results of Bennett & Kennedy [1] from one-dimensional to n-dimensional driving state processes. The model is suitable for pricing certain types of exotic interest rate derivative products whose payoffs depend on the LIBORs at their setting dates. Specifically, we investigate the pricing of TARNs and find that the n-dimensional Markov-functional model is faster and can be calibrated more easily to a target correlation structure than an n-factor LIBOR market model.

 

  • 4.00pm - 4.30pm, Matthew Dixon, Stanford University

"Calibrating Spread Options using a Seasonal Commodity Forward Model"

We describe the calibration of spread options on power and natural gas forward prices whose dynamics are described by the Borovkova & Geman (2006) commodity forward price model. This two-factor model resolves the stochastic dynamics of the average value of the forward curve and the term structure of convenience yields, the latter of which incorporates seasonal effects. Capturing the seasonal volatility term structure from historical prices is essential for accurate forward curve construction and subsequent option pricing. An attractive feature of the seasonal forward model is that the covariance of the forward dynamics is quadratic in the volatility term structure and can be effectively calibrated to historical data using a non-linear, gradient-based constrained optimization algorithm. We show the comparative effect on calendar spread and heat rate option prices of resolving the full term structure of seasonal forward volatilities.
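
A toy version of such a calibration (the cosine parameterization, the synthetic "historical" variances and the bounds are all assumptions of this sketch, not the Borovkova-Geman estimator) fits a seasonal volatility term structure by constrained optimization; the objective is quadratic in the seasonal vol, as noted above:

```python
import numpy as np
from scipy.optimize import minimize

# Fit a parametric seasonal volatility term structure to sample variances of
# monthly forward returns (synthetic data standing in for historical prices).
np.random.seed(1)
months = np.arange(12)
true_vol = 0.30 + 0.12 * np.cos(2 * np.pi * (months - 0.5) / 12.0)
sample_var = (true_vol * (1 + 0.05 * np.random.randn(12))) ** 2

def model_vol(p, m):
    a, b, c = p                              # level, seasonal amplitude, phase
    return a + b * np.cos(2 * np.pi * (m - c) / 12.0)

def objective(p):
    # Model variance is quadratic in the seasonal vol term structure.
    return np.sum((model_vol(p, months) ** 2 - sample_var) ** 2)

# Constrained, gradient-based optimization: keep level and amplitude positive.
fit = minimize(objective, x0=[0.25, 0.05, 0.0],
               bounds=[(1e-4, 2.0), (0.0, 1.0), (-6.0, 6.0)], method="L-BFGS-B")
print("fitted (level, amplitude, phase):", fit.x)
```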

 

  • 4.30pm - 5.00pm, Tim Siu-Tang Leung, Princeton University

 "Employee Stock Options: Accounting for Optimal Hedging, Early Exercises and Contractual Restrictions"

Employee stock options (ESOs) have become an integral component of compensation in the U.S. In view of their significant cost to firms, the Financial Accounting Standards Board (FASB) has mandated expensing ESOs since 2004. The main difficulty of ESO valuation lies in the uncertain timing of exercises, and a number of contractual restrictions on ESOs further complicate the problem. We present a valuation framework that captures the main characteristics of ESOs. Specifically, we incorporate the holder's risk aversion, and hedging strategies that include both dynamic trading of a correlated asset and static positions in market-traded options. Their combined effect on ESO exercises and costs is evaluated along with common features like vesting periods, job termination risk and multiple exercises. This leads to the study of a joint stochastic control and optimal stopping problem. We find that ESO values are much lower than the corresponding Black-Scholes prices due to early exercises, which arise from risk aversion and job termination risk, whereas static hedges induce holders to delay exercises and increase ESO costs.
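
The cost impact of vesting and job termination risk can be sketched on a binomial tree. The exercise boundary below is an exogenous multiple-of-strike rule, a crude stand-in for the boundary that in the paper emerges from the utility-based stopping problem; all parameters are illustrative.

```python
import numpy as np
from scipy.stats import norm

# Toy ESO cost on a CRR tree with a vesting period and a job-termination
# intensity lam: termination after vesting forces exercise of any in-the-money
# option; termination during vesting forfeits it.
S0, K, r, sig, T = 10.0, 10.0, 0.05, 0.3, 10.0
vest, lam, M = 3.0, 0.10, 2.0                   # vesting, intensity, exercise multiple
n = 500
dt = T / n
u = np.exp(sig * np.sqrt(dt)); d = 1.0 / u
p = (np.exp(r * dt) - d) / (u - d)              # risk-neutral up-probability
p_term = 1.0 - np.exp(-lam * dt)                # P(job ends during this step)

S = S0 * u ** np.arange(n, -1, -1) * d ** np.arange(0, n + 1)
C = np.maximum(S - K, 0.0)                      # cost at maturity
for step in range(n - 1, -1, -1):
    t = step * dt
    S = S[:-1] / u                              # asset values at this step
    cont = np.exp(-r * dt) * (p * C[:-1] + (1 - p) * C[1:])
    ex = np.maximum(S - K, 0.0)
    if t >= vest:
        # Voluntary exercise by the boundary rule, else continue;
        # termination forces exercise of the in-the-money payoff.
        C = np.where(S >= M * K, ex, (1 - p_term) * cont + p_term * ex)
    else:
        C = (1 - p_term) * cont                 # forfeited if terminated

d1 = (np.log(S0 / K) + (r + 0.5 * sig**2) * T) / (sig * np.sqrt(T))
bs = S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d1 - sig * np.sqrt(T))
print("ESO cost:", C[0], " Black-Scholes value:", bs)
```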

 

  • 5.00pm - 5.30pm, Naoufel El Bachir, University of Reading

"Conditional Sampling for Jump Processes with Levy Copulas

I present a conditional sampling approach to simulate paths of dependent positive jump processes. A constituent process is obtained by a series of jumps ordered by decreasing magnitude. Simultaneous jumps of different constituents are drawn by successively sampling a jump in each process conditionally on the values of the jumps of the other processes already generated. I consider both the inverse Lévy method and acceptance-rejection methods. The conditional sampling depends on the Lévy copula of the multidimensional jump process. I apply the method to the class of Archimedean Lévy copulas, and in particular to four Lévy copulas of this family, and present as an application a stochastic default intensity model with dependent jumps.
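
A minimal sketch of the series construction for the Clayton Lévy copula F(u,v) = (u^-θ + v^-θ)^(-1/θ), one member of the Archimedean family (the stable-like tail integrals and the truncation level are assumptions of this example): component-1 jump levels come from a unit-rate Poisson process, and component-2 levels are obtained by inverting the conditional copula ∂F/∂u.

```python
import numpy as np

# Series-representation sampling of two dependent positive jump processes
# linked by a Clayton Levy copula. Tail integrals U_i(x) = c_i * x**(-alpha_i)
# are an illustrative choice; jumps of component 1 are generated in decreasing
# order, and component 2 conditionally on them.
rng = np.random.default_rng(42)
theta, T = 1.0, 1.0
c1, alpha1 = 1.0, 0.8
c2, alpha2 = 0.5, 0.8
tau_max = 500.0     # truncation: drops small component-1 jumps (and partners)

inv_U1 = lambda u: (u / c1) ** (-1.0 / alpha1)
inv_U2 = lambda v: (v / c2) ** (-1.0 / alpha2)

# Poisson arrival levels Gamma_1 < Gamma_2 < ... up to the truncation level.
levels, g = [], 0.0
while True:
    g += rng.exponential()
    if g > tau_max:
        break
    levels.append(g)

jumps = []
for u in levels:
    # Invert the conditional copula dF/du(u, v) = (1 + (v/u)**-theta)**(-1-1/theta)
    # at a uniform w to get the component-2 level paired with level u.
    w = rng.uniform()
    v = u * (w ** (-theta / (theta + 1.0)) - 1.0) ** (-1.0 / theta)
    t = rng.uniform(0.0, T)                 # jump times are iid uniform on [0, T]
    jumps.append((t, inv_U1(u), inv_U2(v)))

jumps.sort()                                # order by jump time
X1 = sum(x for _, x, _ in jumps)
X2 = sum(y for _, _, y in jumps)
print(len(jumps), "joint jumps; X1(T) =", X1, " X2(T) =", X2)
```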