
APTS module: Statistical Computing

Module leaders: Matteo Fasiolo and Anthony Lee

Please see the full Module Specifications for background information relating to all of the APTS modules, including how to interpret the information below.

Aim: To introduce, in a practical way, the fundamentals of numerical computation for statistics, in order to help students to write stable, fast and numerically accurate statistical programs.

Learning outcomes: After taking this module students will

  • understand the importance of stability, efficiency and accuracy in numerical computations, and how these may be promoted in practical statistical computation;
  • understand the main difficulties and other issues that arise in the topics given below;
  • be aware of standard computational libraries and other resources.

Prerequisites: In preparation for this module, students should acquire an elementary working knowledge of R. (Knowledge of a lower-level language such as C, Pascal or Fortran would also be an advantage, but is not presumed.) Preparation for this module should also (re-)establish familiarity with Taylor's theorem and with basic matrix algebra: for example, the notion of an inverse and of eigenvalues, manipulation of matrix expressions, and the numerical unsuitability of Cramer's rule for computing an inverse.
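
As a concrete illustration of that last point, the following R sketch (illustrative only, and not part of the module materials; the helper cramer_solve is made up here) shows one way in which Cramer's rule goes wrong in floating-point arithmetic: besides costing roughly n times as much as a standard solve, the determinants it relies on can underflow or overflow even when the linear system itself is perfectly well conditioned.

    cramer_solve <- function(A, b) {
      # Cramer's rule: x[i] = det(A with column i replaced by b) / det(A)
      d <- det(A)
      sapply(seq_len(ncol(A)), function(i) {
        Ai <- A
        Ai[, i] <- b
        det(Ai) / d
      })
    }

    n <- 200
    A <- diag(0.01, n)        # perfectly conditioned, but det(A) = 1e-400 underflows to 0
    b <- rep(0.01, n)         # exact solution is a vector of ones

    head(cramer_solve(A, b))  # NaN NaN NaN ... : 0 / 0 after underflow
    head(solve(A, b))         # 1 1 1 ... from R's standard LU-based solve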

Further reading:

  • Lange, K. (2010). Numerical Analysis for Statisticians, second edition, Springer.

Topics:

  • Finite-precision arithmetic; related types of error and stability (probably mostly covered, in context, as part of other topics).
  • Numerical linear algebra (with statistical applications): basic computational efficiency, Cholesky and QR decompositions, stability (e.g. solving least squares via the normal equations/Cholesky versus via QR; see the first sketch following this list), eigen and singular value decompositions. Standard libraries.
  • Optimization: Newton-type methods; other deterministic methods; stochastic methods; using methods effectively in practice; what to use when.
  • Differentiation and integration by computer: finite differencing (interval choice, cancellation and truncation errors; see the second sketch following this list); automatic differentiation; quadrature methods; stochastic integration.
  • Basics of stochastic simulation.
  • Other types of problem (e.g. sorting and matching); the pervasiveness of efficiency and stability issues; where to find out more.
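
To make the least squares comparison concrete, here is a minimal R sketch (illustrative only; the design matrix and degree of near-collinearity are made up for this example). Forming X'X roughly squares the condition number of X, so solving the normal equations via a Cholesky factorization can lose roughly twice as many digits as applying a QR decomposition to X directly, which is what lm.fit does.

    set.seed(1)
    n <- 100
    x <- runif(n)
    X <- cbind(1, x, x + rnorm(n, sd = 1e-6))  # third column nearly equal to the second
    beta <- c(1, 2, 3)
    y <- drop(X %*% beta)                      # noise-free, so the exact LS solution is beta

    # Normal equations solved via the Cholesky factor of X'X
    R <- chol(crossprod(X))
    beta_ne <- drop(backsolve(R, forwardsolve(t(R), crossprod(X, y))))

    # QR decomposition applied to X directly (as used by lm.fit)
    beta_qr <- qr.coef(qr(X), y)

    max(abs(beta_ne - beta))  # typically orders of magnitude larger than ...
    max(abs(beta_qr - beta))  # ... the error from the QR-based solve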
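
The second sketch illustrates the interval-choice trade-off in finite differencing: truncation error shrinks as the step h shrinks, but cancellation error in f(x + h) - f(x) grows, so the total error is smallest for an intermediate h (for a forward difference, typically near the square root of machine precision). Again this is illustrative only, using sin, whose derivative is known exactly.

    f  <- sin
    x0 <- 1
    h  <- 10^-(1:15)               # candidate step lengths
    fd <- (f(x0 + h) - f(x0)) / h  # forward-difference approximations to f'(x0)
    err <- abs(fd - cos(x0))       # exact derivative of sin is cos

    cbind(h = h, error = err)      # error falls with h, then rises as cancellation takes over
    h[which.min(err)]              # typically near sqrt(.Machine$double.eps), i.e. about 1e-8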

Assessment: A short project bringing together several of the topics covered, for example writing a routine to estimate a linear mixed model by (RE)ML.