
Summary and Slides of Talks

Summary

On the 15th and 16th of September, 2016, the CRiSM workshop on "Contemporary Issues in Hypothesis Testing" was held on the University of Warwick campus. It was made possible by the generous support of CRiSM, including administrative assistance kindly provided by Olivia Garcia. The internal organisers of the workshop were Sara Wade and Julia Brettschneider, and the external organiser was Dalia Chakrabarty.

As its title suggests, the workshop was concerned with contemporary issues in hypothesis testing. This did not, however, deter us from examining the history and evolution of the subject, from Popperism to Intrinsic Bayes Factors and beyond. That history now appears challenged, as some groups of scientists, such as String Theorists, reject the relevance of one of its crucial components: falsification upon the gathering of empirical evidence. This route is considered irrelevant to the paradigm within which String Theorists work, given the inherently unobservable nature of their theories' predictions; after all, we cannot observe multiple universes or extra dimensions. Instead, some have advanced the "elegance" of such theories as the criterion driving acceptability, a position that has not gone down well with everybody, including some Cosmologists. Such opposition notwithstanding, a general argument is sometimes framed against the falsification-upon-observation route: that it does not match actual scientific practice, since scientists do not pick a theory and seek evidence against it, but rather champion a theory (often their own) and attempt to collect evidence in its support. This argument is not foolproof, though. Bayesians appreciate that a theory can be more plausible than an alternative even when some observational indicator suggests otherwise; for them, what matters is the comparison of the evidence for one theory with that for another.

The workshop included invited talks on methodological issues in hypothesis testing and model selection, as well as talks on applications in diverse areas such as clinical trials, brain imaging, cognitive science, particle physics, and social science.

Slides

Christian Robert - slides
Nick Chater - slides
Tom Nichols - slides
Andrew Gelman - slides
David Draper - slides
Jim Berger - slides
Susan Ellenberg - slides
Anne-Laure Boulesteix - slides
Alexandra Carpentier - slides
Louis Lyons - slides
Joris Mulder - slides