# Research

Approximate Bayesian Computation (ABC): having observed data $x_0$ (the grey dot), we sample parameter values $\theta$ from the prior and generate observations $x$ through the model simulator; these are then accepted (green) or rejected (red) according to their distance from the observation, measured by $d(s(x_0), s(x))$ on the space of summary statistics $s(\cdot)$.
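The accept/reject scheme described above can be sketched with a toy rejection-ABC loop; the Gaussian simulator, uniform prior, sample-mean summary statistic, and tolerance value below are illustrative choices, not the models from our work:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulator(theta, n=100):
    # Toy simulator: Gaussian samples with unknown mean theta, unit variance.
    return rng.normal(theta, 1.0, size=n)

def summary(x):
    # Hand-picked summary statistic: the sample mean.
    return np.mean(x)

def rejection_abc(x_obs, n_draws=10_000, epsilon=0.1):
    """Draw theta from the prior, simulate data, and accept the draws whose
    summary statistics fall within epsilon of the observed ones."""
    s_obs = summary(x_obs)
    accepted = []
    for _ in range(n_draws):
        theta = rng.uniform(-5.0, 5.0)        # draw from a uniform prior
        s_sim = summary(simulator(theta))     # simulate and summarise
        if abs(s_sim - s_obs) < epsilon:      # distance on summary space
            accepted.append(theta)
    return np.array(accepted)

x_obs = rng.normal(1.5, 1.0, size=100)  # "observed" data with true mean 1.5
posterior_samples = rejection_abc(x_obs)
print(len(posterior_samples), posterior_samples.mean())
```

The accepted draws approximate the posterior over $\theta$; shrinking `epsilon` sharpens the approximation at the cost of a lower acceptance rate.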

Our contributions in the domain of ABC concentrate on:

• Learning summary statistics from the data using neural networks or techniques from metric learning.
• Learning the distance between summary statistics using a classification rule.
• Applications of ABC across domains of natural science: computational biology, environmental science, molecular dynamics, passenger dynamics, epidemic modelling, and meteorology.
• Development of a software ecosystem for ABC and related likelihood-free inference methods: ABCpy.

Scoring rules (SRs): a general and robust framework for statistical inference. SRs are widely used by meteorologists to assess the predictive performance of probabilistic weather models; they work by considering the empirical distribution defined by the ensemble and assessing the precision and dispersion of this distribution with respect to observations. More precisely, for a scoring rule $S$, $S(P, x)$ represents the penalty assigned to a probabilistic forecast $P$ for an observation $x$. If $x$ is drawn from a distribution $Q$, the expected scoring rule is $S(P, Q) = \mathbb{E}_{x \sim Q}[S(P, x)]$. A proper scoring rule is one for which $S(P, Q)$ is minimised in $P$ when $P = Q$; $S$ is strictly proper if this is the unique minimum. Strictly proper scoring rules therefore encourage forecasters to provide predictions close to the data-generating process.
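As a concrete example, one widely used strictly proper scoring rule for scalar forecasts is the Continuous Ranked Probability Score (CRPS), which for an $m$-member ensemble has the closed form $\frac{1}{m}\sum_i |x_i - y| - \frac{1}{2m^2}\sum_{i,j} |x_i - x_j|$. A minimal sketch (the synthetic ensembles are illustrative):

```python
import numpy as np

def crps_ensemble(ensemble, y):
    """CRPS of an ensemble forecast for a scalar observation y (lower is
    better): mean(|x_i - y|) - 0.5 * mean(|x_i - x_j|)."""
    ensemble = np.asarray(ensemble, dtype=float)
    term1 = np.mean(np.abs(ensemble - y))
    term2 = 0.5 * np.mean(np.abs(ensemble[:, None] - ensemble[None, :]))
    return term1 - term2

rng = np.random.default_rng(1)
y = 0.0
calibrated = crps_ensemble(rng.normal(0.0, 1.0, 200), y)  # forecast centred on y
biased = crps_ensemble(rng.normal(3.0, 1.0, 200), y)      # badly biased forecast
print(calibrated, biased)
```

The biased ensemble receives a larger penalty, illustrating how the score rewards forecasts close to the data-generating process.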

Our contributions:

• A frequentist framework to train generative networks by minimising scoring rules.
• A Bayesian framework for generalised likelihood-free Bayesian inference using scoring rule estimators.
• Inference of amortised posterior distributions using scoring rules.
• Applications to weather forecasting.

Probabilistic downscaling under climate-change scenarios. Weather/climate variables (e.g. temperature, air velocity, air pressure) can be seen as a vector-valued variable $W$ evolving over three-dimensional space $x$ and time $t$. Due to the significant cost of global climate model (GCM) simulations, climate predictions by GCMs are made on a coarse-resolution grid. Since these coarse-grained predictions of $W$ are only available at the $n$ grid points of the coarse grid on Earth's surface at each time point, assessing effects at a regional level requires predicting the values of $W$ at a finer spatial resolution, based on a vector-valued predictor function $p(x)$ (e.g. topography, tree cover, water bodies) and the $W$ predicted by GCMs at the coarse grid points.
Our contributions:

• Prediction of rainfall ($Y$) at fine grid points given GCM-predicted weather/climate variables on a coarse grid.
• To do so, we have developed neural-network-based models and Bayesian spatio-temporal statistical models for probabilistic modelling of rainfall.
• In future, our research targets the crucial task of predicting and assessing the impact of extreme weather events with return periods of 100 years or more, with regional accuracy, to give civil authorities the best opportunity to prepare emergency response plans.

Using ABCpy and the ABC techniques we have developed, you can infer the approximate posterior distribution of the parameters of models with intractable likelihood functions.