DR@W Forum: Matthew Cashman (WBS, Behavioural Science Group)
Cultural evolution is changing humanity much faster than genetic evolution, but at present we lack a way to ground models empirically in a quantitative, content-agnostic way analogous to counting alleles in models of genetic evolution. Measuring what information ends up in which minds is a necessary first step to explaining how that information got there. A quantitative view of what information ends up in which minds permits modeling of the many different processes that govern its flow, from informational legacies left to descendants to sharing on social media. To address this gap, we take Shannon's classic cloze-completion game for estimating the entropy of written language and turn it on its head: instead of using minds to learn about written language, we use language to learn about minds. Entropy estimates derived from a test set drawn from a book such as Harry Potter will differ between a treatment group (Readers, people who have read Harry Potter) and a control group (Non-Readers). This difference is driven by the way reading the book has changed Readers' minds. It expresses, in bits, how much information from the book is actually stored in Readers' minds and capable of influencing behavior.
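To make the measurement concrete, here is a minimal sketch (not from the talk; the data and variable names are hypothetical) of a simplified version of Shannon's guessing-game estimator: participants guess the missing token at each cloze position, the number of guesses needed to reach the correct answer is recorded, and the plug-in entropy of that guess-count distribution gives an upper-bound estimate of per-item entropy. The gap between groups is the quantity of interest.

```python
import math
from collections import Counter

def guess_entropy_bits(guess_counts):
    """Plug-in entropy (in bits) of the 'guesses until correct'
    distribution -- a simplified Shannon-style upper bound on
    per-item entropy for one group of participants."""
    n = len(guess_counts)
    freqs = Counter(guess_counts)
    return -sum((c / n) * math.log2(c / n) for c in freqs.values())

# Hypothetical guess counts per cloze item (illustrative only).
readers     = [1, 1, 2, 1, 1, 3, 1, 2, 1, 1]   # treatment group: read the book
non_readers = [2, 4, 1, 5, 3, 2, 6, 3, 2, 4]   # control group: have not read it

h_readers = guess_entropy_bits(readers)
h_controls = guess_entropy_bits(non_readers)

# The difference, in bits per cloze item, reflects information from
# the book that is stored in Readers' minds.
print(f"Readers:     {h_readers:.2f} bits/item")
print(f"Non-Readers: {h_controls:.2f} bits/item")
print(f"Difference:  {h_controls - h_readers:.2f} bits/item")
```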