10.00 – 10.30
Informatic versus Thermodynamic Entropy Production in Active Matter
Michael Cates (Cambridge)
Stochastic thermodynamics gives the steady-state entropy production rate (EPR) of a system connected to a heat bath as the log ratio of probabilities of forward and time-reversed trajectories. Extending this result to coarse-grained models of systems much further from equilibrium yields an "informatic" EPR (IEPR) that depends only on order-parameter dynamics and is no longer connected with microscopic heat flow, but remains a spatially resolved quantifier of mesoscopic irreversibility. When the same coarse-grained models describe more microscopic processes (such as phase separation within a biological cell), a connection to heat flow should be recoverable. To achieve this we embed the coarse-grained model in a larger description that is governed by linear irreversible thermodynamics. All the active terms in the order-parameter dynamics then become off-diagonal elements of an Onsager matrix whose symmetry determines the remaining chemical couplings and thus the full heat production. This exceeds the IEPR by a chemical dissipation term that contains spatial information complementary to that in the IEPR itself.
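The log-ratio expression quoted in the abstract is standard in stochastic thermodynamics; in one common notation (a sketch only, with symbols not defined in the abstract), it reads:

```latex
% Steady-state entropy production rate as the log ratio of path probabilities:
% \mathcal{P}[\{x(s)\}] is the probability of a trajectory over an observation
% window of duration t, and the denominator is that of its time reversal.
\sigma \;=\; \lim_{t \to \infty} \frac{k_B}{t}
  \left\langle \ln
    \frac{\mathcal{P}[\{x(s)\}_{0 \le s \le t}]}
         {\mathcal{P}[\{x(t-s)\}_{0 \le s \le t}]}
  \right\rangle
```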
10.30 – 11.00
Robert MacKay (Warwick)
EPSRC were greatly impressed when Robin told them our doctoral training centre was trying to formulate laws of society. Here we address a more manageable part of that project, presenting an approach to macroeconomics based on concepts from thermodynamics.
Work has been done in this direction before, notably by Samuelson and by Georgescu-Roegen, yet we feel crucial aspects have been missed. We provide them here, following an axiomatic approach to thermodynamics by Lieb and Yngvason.
We explain how plausible assumptions lead to the definition of an economic analogue of entropy, which we interpret as an aggregate utility, and an economic temperature, whose inverse is the marginal utility of money. We obtain a second law of economics. We illustrate the theory with an economic Carnot cycle that makes money out of temperature differences. We draw some conclusions about the driving forces behind trade and manufacture, and we discuss extensions that may be required to our theory.
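For orientation, the thermodynamic relation being borrowed is the Carnot bound; in the economic reading sketched above (our gloss on the abstract, not the authors' notation), the work term becomes the surplus extracted per cycle:

```latex
% Carnot bound: W is the surplus ("money") extracted per cycle, Q_h the
% amount drawn from the subsystem at high economic temperature T_h, and
% T_c the lower economic temperature of the other subsystem.
W \;\le\; Q_h \left( 1 - \frac{T_c}{T_h} \right)
```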
Joint work with N. Chater (Warwick)
11.15 – 11.45
The Amontons-Coulomb friction laws in fibrous materials -- why clothes don't fall apart
Patrick Warren (STFC Hartree Centre)
The problem of how staple yarns transmit tension is addressed within abstract models in which the Amontons-Coulomb friction laws yield a linear programming (LP) problem for the tensions in the fiber elements. We find there is a percolation transition such that above the percolation threshold the transmitted tension is in principle unbounded. We determine that the mean slack in the LP constraints is a suitable order parameter to characterize this supercritical state. We argue the mechanism is generic, and in practical terms, it corresponds to a switch from a ductile to a brittle failure mode accompanied by a significant increase in mechanical strength.
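To make the LP structure concrete, here is a minimal sketch (not the authors' model): a single serial chain of frictional contacts, where Amontons-Coulomb caps the tension jump across each contact. The function names and the chain geometry are illustrative assumptions; the real problem lives on a fibre network and is solved as a full linear programme.

```python
# Minimal sketch: a chain of n frictional contacts along a yarn. The
# Amontons-Coulomb laws bound the tension jump across contact i by
# mu * N[i], so the tensions T_i obey the linear constraints
#   |T_{i+1} - T_i| <= mu * N[i],  with the free end clamped at T_0 = 0.
# Maximising the transmitted tension T_n under these constraints is an LP;
# for a serial chain the optimum is simply the sum of the friction caps.

def max_transmitted_tension(mu, normal_loads):
    """Upper bound on the tension a chain can transmit (clamped end at T = 0)."""
    return sum(mu * n for n in normal_loads)

def mean_slack(tensions, mu, normal_loads):
    """Mean slack in the friction constraints for a given tension profile -
    the quantity used as an order parameter for the supercritical state."""
    gaps = [mu * n - abs(t2 - t1)
            for (t1, t2), n in zip(zip(tensions, tensions[1:]), normal_loads)]
    return sum(gaps) / len(gaps)
```

For example, with mu = 0.3 and three unit normal loads the chain can transmit at most 0.9 units of tension; a tension profile that uses less than the full friction budget at each contact has positive mean slack.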
13.00 – 13.30
The role of entropy in biological liquid-liquid phase separation
Oliver Dyer (Warwick)
Animal cells organise some of their constituents via liquid-liquid phase separation, without needing the lipid bilayer membranes that enclose the cell and its organelles. The droplets formed this way vary in role and composition but share an over-abundance of proteins containing flexible ‘intrinsically disordered regions’ (IDRs). These IDRs have a large conformational entropy - a key ingredient in the separation of polymers and colloids in depletion flocculation - raising the question of whether IDRs evolved to exploit this entropy to induce droplet formation.
To investigate this, we consider an environment of biomolecules represented by flexible polymer chains and low-entropy hard spheres. The IDR-containing proteins are then modelled as ‘tadpoles’ with a polymer tail grafted to a hard sphere head. By simulating mixtures of these molecules, we show that IDRs do indeed encourage entropic phase separation and that the tadpole molecules behave as surfactants in these systems. Finally, we assess the importance of entropy in biological phase separation by comparing our molecule concentrations with those inside cells.
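The entropic mechanism invoked above can be illustrated with the textbook Asakura-Oosawa depletion potential (a standard result, not the authors' tadpole model): when the depletion zones of two hard spheres overlap, volume is freed up for the polymers, lowering the free energy in proportion to the overlap volume. All parameter values below are illustrative.

```python
import math

# Sketch of the entropic (depletion) attraction between two hard spheres of
# radius R in a bath of ideal polymers of effective radius r_p and number
# density rho_p. Each sphere excludes polymer centres from a depletion zone
# of radius a = R + r_p; overlapping zones free up volume V_overlap for the
# polymers, giving the Asakura-Oosawa pair potential U = -rho_p * kT * V_overlap.

def depletion_potential(d, R, r_p, rho_p, kT=1.0):
    """AO pair potential at centre-centre distance d (valid for d >= 2R)."""
    a = R + r_p                          # depletion-zone radius
    if d >= 2.0 * a:
        return 0.0                       # zones no longer overlap
    # lens (intersection) volume of two spheres of equal radius a
    v_overlap = (math.pi / 12.0) * (4.0 * a + d) * (2.0 * a - d) ** 2
    return -rho_p * kT * v_overlap
```

The attraction is deepest at contact (d = 2R) and vanishes once the surfaces are more than two polymer radii apart, which is why a large conformational entropy of the flexible species can, on its own, drive the hard components to phase separate.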
13.30 – 14.00
From Persistent Mutual Information to data-driven modelling: a sample path
Marina Diakonova (Banco de España)
In 2007 I joined the then newly created Complexity Science Doctoral Training Centre, doing a PhD on the theory of complexity with Robin Ball and Robert MacKay. Now, like many of my former colleagues, I am a 'data scientist'. In my talk I describe this transition, and focus on how the training in analysing complex systems made this process possible. I will also take this opportunity to reflect on the challenges of interdisciplinarity.
14.15 – 14.45
How many samples are enough? Simulated Annealing to optimise complex systems under noise
Jurgen Branke (Warwick)
Simulated Annealing is a powerful black-box optimisation heuristic that is successfully used in many application areas where exact algorithms are not applicable. In this talk we consider stochastic optimisation problems as they occur, for example, in simulation-based optimisation, where the system we would like to optimise is represented by a stochastic simulation model. There are two main challenges. First, because the evaluation of a solution is stochastic, even identifying the better of two solutions becomes difficult, let alone finding the optimal solution; in theory this can be mitigated by evaluating each solution multiple times and averaging out the noise. Second, because evaluating a solution is computationally expensive, one has to be very careful about how many samples one uses to estimate a solution's quality. This talk reviews my work with Robin on tackling this problem.
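The sampling dilemma described above can be sketched in a few lines. This is an illustrative toy only: a fixed per-candidate sample count and a geometric cooling schedule, not the adaptive sampling scheme discussed in the talk; the objective and all parameters are invented for the example.

```python
import math
import random

# Toy sketch: simulated annealing on a noisy objective. Each candidate is
# re-evaluated n_samples times and the sample mean is used in the Metropolis
# acceptance test, trading computational cost against estimation error.

def noisy_objective(x, rng):
    """Noisy evaluation of a quadratic with true optimum at x = 2."""
    return (x - 2.0) ** 2 + rng.gauss(0.0, 0.5)

def anneal(n_steps=3000, n_samples=10, t0=1.0, cooling=0.998, seed=1):
    rng = random.Random(seed)
    x = 0.0
    fx = sum(noisy_objective(x, rng) for _ in range(n_samples)) / n_samples
    temp = t0
    for _ in range(n_steps):
        y = x + rng.gauss(0.0, 0.3)          # local perturbation
        fy = sum(noisy_objective(y, rng) for _ in range(n_samples)) / n_samples
        # Metropolis rule applied to the *estimated* qualities
        if fy < fx or rng.random() < math.exp((fx - fy) / temp):
            x, fx = y, fy
        temp *= cooling                      # geometric cooling
    return x
```

With n_samples = 10 the standard error of each estimate is about 0.16, so near the optimum the acceptance decision is dominated by noise rather than by the true quality difference; raising n_samples sharpens the comparison but multiplies the cost per step, which is exactly the trade-off at issue.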
14.45 – 15.15
Stochastic synchronization induced by noisy driving
Tom Witten (Chicago)
Soft matter and biological systems often execute stable cyclic behavior. An ensemble of such oscillators can be synchronized by an external periodic perturbation. Recent studies of neural spike trains and forced colloidal particles show a less obvious form of synchronization using randomly timed, nonperiodic pulses applied to the whole ensemble. We exhibit a regime of driving that creates a generalized form of phase ordering of the ensemble that we call stochastic synchronization. This incomplete synchronization is expected to occur generally for ensembles of identical, noninteracting elements in periodic motion. We account for the approach to arbitrarily small phase disorder as the driving strength approaches a threshold.
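A toy version of this effect (our illustration, not the talk's model) drives an ensemble of identical, noninteracting phase oscillators with a common train of randomly timed pulses, each applying the same phase-resetting map. The map contracts phase differences on average, so the ensemble's phase spread shrinks even though the forcing is nonperiodic; the map, pulse statistics, and parameters below are all assumptions.

```python
import math
import random

# Toy sketch: identical phase oscillators, free rotation at rate omega,
# driven by common pulses at Poisson-random times. Each pulse applies the
# phase-resetting map theta -> theta - eps * sin(theta), whose derivative
# 1 - eps*cos(theta) has magnitude < 1 on average over the circle, so
# phase differences contract and the ensemble phase-orders.

def simulate(n_osc=50, n_pulses=400, eps=0.3, omega=1.0, seed=3):
    rng = random.Random(seed)
    thetas = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(n_osc)]
    for _ in range(n_pulses):
        dt = rng.expovariate(1.0)            # random inter-pulse interval
        thetas = [(th + omega * dt) % (2.0 * math.pi) for th in thetas]
        thetas = [th - eps * math.sin(th) for th in thetas]   # common kick
    # Kuramoto order parameter: 1 means perfect phase ordering
    re = sum(math.cos(th) for th in thetas) / n_osc
    im = sum(math.sin(th) for th in thetas) / n_osc
    return math.hypot(re, im)
```

With the kicks on (eps = 0.3) the order parameter approaches 1; with eps = 0 the phases stay as disordered as they started, which isolates the common random driving as the cause of the ordering.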
Round table discussion and closing remarks