
Juan Ungredda

About me

I am a postgraduate researcher working on Bayesian optimisation methods in collaboration with ESTECO, a software provider specialising in numerical optimisation for engineering design problems. I am supervised by Prof. Jürgen Branke in Operational Research at Warwick Business School. I studied Industrial Engineering at Universidad Católica Andrés Bello and obtained an MSc in Mathematics of Systems (with Distinction) from the University of Warwick.

My current work involves real-world optimisation problems that are often very expensive (in money and time) to evaluate. It is therefore important to collect data efficiently, so that a good design can be found from a small number of simulation runs. This is achieved by using Gaussian processes as surrogate models for model-based optimisation. In academia, this approach has been applied across a wide range of areas, including interactive user interfaces, robotics, environmental monitoring, and sensor networks.
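As a toy illustration of this approach, the sketch below fits a Gaussian process surrogate to a few evaluations of a cheap stand-in objective and sequentially selects new points by expected improvement. The objective function, kernel, and budget are illustrative assumptions and not part of my actual research code.

```python
# A minimal sketch of model-based optimisation with a Gaussian process surrogate.
# The objective and all settings here are illustrative placeholders.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def expensive_simulation(x):
    # Stand-in for a costly simulator: in practice each call may take hours.
    return np.sin(3 * x) + 0.5 * x

rng = np.random.default_rng(0)
X = rng.uniform(0, 2, size=(5, 1))          # small initial design
y = expensive_simulation(X).ravel()

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

for _ in range(10):                          # sequential design loop
    gp.fit(X, y)
    grid = np.linspace(0, 2, 200).reshape(-1, 1)
    mu, sigma = gp.predict(grid, return_std=True)
    best = y.max()
    # Expected improvement balances exploiting the surrogate mean and
    # exploring regions of high predictive uncertainty.
    z = (mu - best) / np.maximum(sigma, 1e-9)
    ei = (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)
    x_next = grid[np.argmax(ei)].reshape(1, -1)
    X = np.vstack([X, x_next])
    y = np.append(y, expensive_simulation(x_next).ravel())

print("Best design found:", X[np.argmax(y)], "value:", y.max())
```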

Email: J.Ungredda@warwick.ac.uk

Phone: +44 07716 688233

Github: https://github.com/JuanUngredda

Conferences, Workshops and Activities

1. IMA and OR Society Conference on Mathematics of Operational Research. Speaker, "Bayesian Optimisation with Input Uncertainty Reduction" (slides)

2. Come join the Bayesian optimisation and Machine Learning reading group and speaker series at the Mathematics of Systems department, Wednesdays at 12pm. I am currently a collaborator.

3. Three-month internship (April to June 2019) at ESTECO in Trieste, Italy.

    My PhD Work

    Simulation optimisation, i.e., the search for a design or solution that optimises some output of a simulation
    model, makes it possible to automate the design of complex systems and has many real-world applications. Several
    difficulties arise when dealing with real systems, however, especially long simulation run times and stochastic outputs.
    Stochastic simulations combine probabilistic assumptions with the system logic to produce random outputs whose
    expectations must be estimated. When constructing the simulation model, the decision maker therefore often faces the
    challenge of defining input distributions (e.g. the mean of an arrival-time distribution). In particular, if multiple
    candidate distributions fit the input data reasonably well, any performance analysis is subject to input error,
    variability or uncertainty, represented by the posterior \mathbb{P}[A \mid \text{Data}] over the input parameters A.

    Moreover, if running additional simulations to learn about the output landscape \mu(X, A) and collecting more data to reduce the input uncertainty \mathbb{P}[A \mid \text{Data}] are both expensive, then it is important to evaluate the trade-off between them. Devoting too much effort to data collection (left image) may not leave sufficient time for optimisation, while devoting too little effort to data collection forces us to search for a robust solution that performs well across the possible input distributions but may not be best for the true input parameters (right image).

    [Figures: left, reducing input uncertainty by collecting more source data; right, collecting more simulation data to update the surrogate model]
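    As a hedged illustration of this notation, the sketch below treats A as the unknown arrival rate of a simple queue, places a conjugate Gamma posterior \mathbb{P}[A \mid \text{Data}] over it from a handful of observed inter-arrival times, and estimates the expected performance of a candidate design X by averaging the simulation output \mu(X, A) over posterior samples of A. The queueing model, prior, and sample sizes are all illustrative assumptions rather than the model used in my PhD work.

```python
# Propagating input uncertainty P[A | Data] through a stochastic simulation.
# The M/M/1 queue and the Gamma posterior below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)

# Suppose A is the unknown arrival rate and we observed a few exponential
# inter-arrival times. With a Gamma(1, 1) prior, the posterior over A is
# again a Gamma distribution (conjugacy).
observed_interarrivals = rng.exponential(scale=1 / 1.5, size=20)   # true rate 1.5
post_shape = 1 + len(observed_interarrivals)
post_rate = 1 + observed_interarrivals.sum()

def simulate_mean_wait(service_rate_x, arrival_rate_a, n_customers=500):
    """Crude M/M/1 simulation: returns the average waiting time, i.e. mu(X, A)."""
    arrivals = np.cumsum(rng.exponential(1 / arrival_rate_a, n_customers))
    services = rng.exponential(1 / service_rate_x, n_customers)
    start = np.zeros(n_customers)
    finish = np.zeros(n_customers)
    for i in range(n_customers):
        start[i] = max(arrivals[i], finish[i - 1] if i > 0 else 0.0)
        finish[i] = start[i] + services[i]
    return np.mean(start - arrivals)

# Estimate E_A[ mu(X, A) ] for a candidate design X (here, the service rate)
# by averaging simulation outputs over posterior samples of A.
design_x = 2.0
posterior_samples = rng.gamma(post_shape, 1 / post_rate, size=50)
expected_wait = np.mean([simulate_mean_wait(design_x, a) for a in posterior_samples])
print(f"Estimated expected waiting time for X={design_x}: {expected_wait:.3f}")
```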

    MSc Group Project

    This project was a collaboration with ESTECO to extend their software to work well with stochastic simulators (such as the discrete-event simulations used in social science or epidemiology). The main challenges in this domain are:

    1. There are often multiple optimisation criteria. In such cases, it is usually not possible to find one solution that is best according to all criteria, so the algorithm instead searches for a set of so-called “Pareto-optimal” solutions with different trade-offs. The goal is to generate a representative subset of such solutions for a decision maker to choose from. Evolutionary algorithms are particularly suitable for this task, as they work with a population of solutions and can generate a set of Pareto-optimal solutions in one run.

    2. Simulations are computationally expensive, so only a few solutions can be evaluated. To reduce the number of simulations, surrogate models can be constructed and used to approximate solution quality.

    Stochastic outputs pose an additional challenge to optimisation, as a solution’s objective function values are now noisy, making it difficult to decide whether an observed quality difference is genuine or simply due to random noise. The goal is thus to develop novel strategies to handle noisy multi-objective simulation optimisation problems.
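    To make the Pareto idea concrete, the sketch below averages a few noisy replications of two conflicting objectives for a population of candidate designs and then extracts the estimated non-dominated (Pareto-optimal) subset. The objectives, noise level, and number of replications are illustrative assumptions and do not reflect ESTECO's implementation.

```python
# Extracting the estimated Pareto front from a population of solutions with
# two minimisation objectives observed under simulation noise.
import numpy as np

rng = np.random.default_rng(2)

def noisy_objectives(x, replications=5):
    """Two conflicting objectives observed with noise; replications are averaged."""
    f1 = x[:, 0] ** 2 + rng.normal(0, 0.05, (replications, len(x)))
    f2 = (x[:, 0] - 2) ** 2 + rng.normal(0, 0.05, (replications, len(x)))
    return np.column_stack([f1.mean(axis=0), f2.mean(axis=0)])

def pareto_front(objectives):
    """Return a boolean mask of non-dominated points (both objectives minimised)."""
    n = len(objectives)
    nondominated = np.ones(n, dtype=bool)
    for i in range(n):
        for j in range(n):
            # j dominates i if it is no worse in all objectives and better in at least one.
            if i != j and np.all(objectives[j] <= objectives[i]) and np.any(objectives[j] < objectives[i]):
                nondominated[i] = False
                break
    return nondominated

population = rng.uniform(-1, 3, size=(30, 1))        # candidate designs
estimates = noisy_objectives(population)
mask = pareto_front(estimates)
print(f"{mask.sum()} of {len(population)} solutions are estimated to be Pareto-optimal")
```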

    Application: The problem has applications in many areas, including traffic-light control, epidemiology, scheduling and call centres, to name just a few. In this study group we proposed working on a Supply Chain Management problem in the Chemical Process Industry.

    Report

    Final BEng Project at Universidad Católica Andrés Bello, Venezuela.

    Propuesta De Mejoras Para Los Procesos Productivos De Una Empresa Productora De Gabinetes De Uso Doméstico, En Venezuela (Proposed Improvements to the Production Processes of a Company Producing Household Cabinets in Venezuela).

    Abstract: This work applies discrete-event simulation to increase the utilisation of production capacity in a cabinet factory. There were frequent delays due to incomplete inputs to the process and inadequate production scheduling methods, which caused high levels of work in process and missed delivery dates. Process improvement was achieved by reducing process variability. Candidate improvements were generated with a mathematical programming model, and their usefulness was verified by simulating the improved process. Results showed that mean capacity increased by 74% under the new production scheduling rules; system variability was reduced and demand was met with no increase in capital investment or workforce.

    Published in Tekhne, Vol. 20, No. 1 (2017).