
CRiSM Master Class on Sparse Regression

30 August - 1 September 2016

Registration is closed


Tentative Schedule

Tuesday 30 August
  09.30-10.30 Registration and coffee
  10.30-12.00 Lecture
  12.00-13.30 Lunch break
  13.30-15.00 Lecture
  15.00-15.30 Coffee break
  15.30-17.00 Lecture
  17.00-18.30 Evening reception and poster session

Wednesday 31 August
  10.00-10.30 Coffee
  10.30-12.00 Lecture
  12.00-13.30 Lunch break
  13.30-15.00 Lecture
  15.00-15.30 Coffee break
  15.30-17.00 Lecture
  18.00-21.00 Masterclass Dinner (Radcliffe)

Thursday 1 September
  10.00-10.30 Coffee
  10.30-12.00 Lecture
  12.00-13.30 Lunch break
  13.30-15.00 Lecture
  15.00-15.10 Closing Remarks

The registration fee includes coffee breaks and lunch on each day, and wine and refreshments during the poster session. The Masterclass Dinner is available to attendees for an additional £36.75. Accommodation is not included in the fee, but can be booked during the registration process. Parking can be arranged free of charge.

Local participants (Warwick Maths/Stats) must still register, free of charge, by sending an e-mail to Olivia dot Garcia at warwick dot ac dot uk. Please indicate whether you wish to attend the optional dinner (£36.75), along with any dietary requirements.

Bookings cancelled before 16 August 2016 incur no cancellation charge and will be refunded in full. Bookings cancelled after 16 August 2016, and no-shows, will not be refunded and full charges will apply. Refunds take 3-5 working days to process and will be credited to the account used to pay for the booking.


High-dimensional statistical inference, by Richard Samworth (Cambridge):

Lecture 1: Classical theory

Review of the linear model, geometry of orthogonal projections, hypothesis testing, examples. Ridge regression.
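
As a small illustrative sketch of the last topic (the data and penalty value below are my own, not from the course), the ridge regression estimator minimising ||y - Xb||^2 + lam*||b||^2 has the closed form (X'X + lam*I)^{-1} X'y:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 5
X = rng.standard_normal((n, p))
beta = np.array([2.0, -1.0, 0.0, 0.5, 0.0])   # illustrative true coefficients
y = X @ beta + 0.1 * rng.standard_normal(n)

lam = 1.0
# Ridge estimator: solve (X'X + lam*I) b = X'y rather than inverting explicitly
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
print(beta_ridge)
```

Unlike ordinary least squares, the system is well-posed even when X'X is singular, at the cost of shrinking the estimate towards zero.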

Lecture 2: Modern high-dimensional statistics

Model selection via, e.g., information criteria. Definition of the Lasso and its geometry. Uniqueness of Lasso solutions. Prediction and estimation properties.
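
As a pointer to the Lasso's geometry: under one common normalisation the Lasso minimises (1/(2n))||y - Xb||^2 + lam*||b||_1 (the lectures may use a different scaling), and in the special case of an orthonormal design (X'X = n*I) the solution decouples into coordinate-wise soft-thresholding of the least-squares estimate. A minimal numpy sketch of that special case, with made-up data:

```python
import numpy as np

# Soft-thresholding operator: shrinks z towards zero by t, exactly zeroing
# anything with |z| <= t -- this is where the Lasso's sparsity comes from.
def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

n = 4
X = np.sqrt(n) * np.eye(n)          # orthonormal design: X'X = n*I
y = np.array([3.0, -0.5, 0.05, 1.0])
lam = 0.1

beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
beta_lasso = soft_threshold(beta_ols, lam)   # exact Lasso solution here
print(beta_lasso)                   # small OLS coefficients become exactly zero
```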

Lecture 3: The Lasso and beyond

Selection consistency of the Lasso. Computation. Brief discussion of other penalty functions, other models, e.g. grouped variables.
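
On the computation point: one standard algorithm (not necessarily the one presented in the lecture) is cyclic coordinate descent, which repeatedly solves the one-dimensional Lasso problem for each coefficient via soft-thresholding. A self-contained sketch, assuming centred and scaled columns:

```python
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Cyclic coordinate descent for (1/(2n))||y - Xb||^2 + lam*||b||_1.
    Assumes each column of X is centred with squared norm n."""
    n, p = X.shape
    beta = np.zeros(p)
    r = y.copy()                              # current residual y - X @ beta
    for _ in range(n_iter):
        for j in range(p):
            rho = X[:, j] @ r / n + beta[j]   # OLS fit on the partial residual
            new = soft_threshold(rho, lam)
            r += X[:, j] * (beta[j] - new)    # update residual in place
            beta[j] = new
    return beta

rng = np.random.default_rng(0)
n, p = 200, 5
X = rng.standard_normal((n, p))
X = (X - X.mean(axis=0)) / X.std(axis=0)      # centre and scale columns
beta_true = np.array([2.0, -1.0, 0.0, 0.5, 0.0])
y = X @ beta_true + 0.1 * rng.standard_normal(n)
y = y - y.mean()
print(lasso_cd(X, y, lam=0.1))
```

The residual update keeps each coordinate step O(n), which is why coordinate descent scales well to large p.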

Lecture 4: Multiple testing and Complementary Pairs Stability Selection

Familywise error rate and the Bonferroni procedure; False Discovery Rate and the Benjamini-Hochberg procedure. Brief discussion of extensions. I will then use the remaining time to give an exposition of some of my own work in this area. Complementary Pairs Stability Selection is a technique for improving the performance of any existing variable selection algorithm by aggregating the results of applying it to subsamples of the data.

Sparse Statistical Learning and Inference, by Jianqing Fan (Princeton):
High-dimensionality and Big Data characterize many contemporary statistical problems, from genomics and genetics to finance and economics. We first outline a unified approach to ultra-high dimensional variable selection problems and then focus on penalized likelihood methods, which are fundamentally important building blocks for ultra-high dimensional variable selection. We will also introduce variable screening methods as well as methods for guarding against spurious discoveries. Algorithms for solving penalized likelihood problems, as well as Big Data computing, will also be introduced. Topics to be covered include
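
As a taste of the screening idea in part 2, marginal (sure independence) screening ranks predictors by their absolute correlation with the response and keeps the top d, reducing p from ultra-high to moderate before a refined method such as penalized likelihood is applied. A minimal sketch with simulated data of my own:

```python
import numpy as np

def sis(X, y, d):
    """Marginal correlation screening: rank predictors by the absolute
    sample correlation of each column of X with y, keep the top d."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    corr = np.abs(Xc.T @ yc) / (np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc))
    return np.sort(np.argsort(corr)[::-1][:d])   # indices of the d largest

rng = np.random.default_rng(1)
n, p = 200, 1000                    # p >> n: ultra-high dimensional setting
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[[0, 1, 2]] = [3.0, -2.0, 2.5]  # only three truly active predictors
y = X @ beta + rng.standard_normal(n)
print(sis(X, y, d=20))              # the kept set should contain 0, 1, 2
```

The screening step costs only O(np), which is what makes a follow-up Lasso or penalized likelihood fit on the d surviving variables feasible at this scale.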

1. Sparse Learning via penalization
1.1 Introduction
1.2 Penalized Quasi-likelihood
1.3 Properties
1.4 One-Step Estimator
1.5 Analysis of Decomposable Regularization

2. Screening and Selection
2.1 Independence Screening
2.2 Iteratively Independent Learning
2.3 Conditional Sure Independence Screening

3. Control of Algorithmic Complexity and Statistical Error
3.1 Introduction
3.2 Overview of Gradient Methods
3.3 Iterative Local Adaptive MM
3.4 Theoretical Properties
3.5 Numerical results

4. Distributed Estimation and Inference
4.1 Introduction
4.2 Summary of Ideas and Results
4.3 Distributed statistical inference
4.4 Distributed estimation
4.5 Simulation performance



Richard Samworth (University of Cambridge)

Richard Samworth obtained his PhD in Statistics from the University of Cambridge in 2004. After a Research Fellowship at St John's College, he joined the Statistical Laboratory as a Lecturer in 2005, while remaining a Fellow of St John's. He became Professor of Statistics in 2013.

Richard's main research interests are in nonparametric and high-dimensional statistics. Particular topics include shape-constrained density and other nonparametric function estimation problems, nonparametric classification, clustering and regression, Independent Component Analysis, the bootstrap and high-dimensional variable selection problems.


Jianqing Fan (Princeton)

Jianqing Fan is the Frederick L. Moore Professor of Finance, Professor of Statistics, former Chairman of the Department of Operations Research and Financial Engineering, and Director of the Committee of Statistical Studies at Princeton University, where he directs both the financial econometrics and statistics labs. He is a past president of the Institute of Mathematical Statistics and of the International Chinese Statistical Association. He was an invited speaker at the 2006 International Congress of Mathematicians and a core member of the committee selecting the invited speakers in Probability and Statistics for the 2010 International Congress of Mathematicians. He is a co-editor of the Journal of Econometrics and an associate editor of Econometrica and the Journal of the American Statistical Association, and was co-editor of The Annals of Statistics, Probability Theory and Related Fields, and the Econometrics Journal. After receiving his Ph.D. from the University of California at Berkeley, he was appointed assistant, associate, and full professor at the University of North Carolina at Chapel Hill (1989-2003), professor at the University of California at Los Angeles (1997-2000), and professor at Princeton University (2003-). His published work in statistics, economics, finance, and computational biology has been recognized by the 2000 COPSS Presidents' Award, the 2007 Morningside Gold Medal of Applied Mathematics, a Guggenheim Fellowship in 2009, the P. L. Hsu Prize in 2013, the Royal Statistical Society's Guy Medal in Silver in 2014, and election as an Academician of Academia Sinica and a fellow of the American Association for the Advancement of Science. His research interests include financial econometrics, portfolio selection, and risk management.

Please send queries to Joris Bierkens (j dot bierkens at warwick dot ac dot uk) or Olivia Garcia-Hernandez (Olivia dot Garcia at warwick dot ac dot uk).