Edoardo Gabriele Barp
Academic Work and Achievements
A numerical study of the 3D random interchange and random loop models [IOPScience] [Arxiv]
My main role in the publication was developing, in C, the simulation environment that ran all the simulations used for the empirical analysis.
BSc Thesis: Manifold learning applied to topology preservation [PDF]
During this project, we reviewed methods from the field of non-linear dimensionality reduction; more precisely, we looked at a family of methods called manifold learning, such as Locally Linear Embedding, Isomap, and Laplacian Eigenmaps.
CS342 - Machine Learning Report [PDF]
Built a machine learning model capable of classifying pictures of fish from CCTV-like cameras on board fishing boats. My final model combined several methods, including pre-processing, non-linear dimensionality reduction (NLDR), and a neural network as the final classifier.
CS918 Natural Language Processing Revision Summary [PDF]
Summary notes that I wrote to prepare for the examination.
CS255 - Artificial Intelligence Report [PDF]
The aim was to build an AI for tanks in a 2D environment. AIs were assessed on ingenuity and battlefield performance. I decided to use a Genetic Algorithm to evolve an intelligent tank.
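The evolutionary loop behind such an approach can be sketched as below. This is a toy illustration only: the genome layout, population size, and fitness function are assumptions, since in the real project fitness came from simulated battlefield performance.

```python
import random

GENOME_LEN = 5    # assumed: a small vector of behaviour parameters
POP_SIZE = 20
GENERATIONS = 40

def fitness(genome):
    # Placeholder fitness: the real project scored tanks in simulated
    # battles; here we simply reward genomes close to 0.5 in every gene.
    return -sum((g - 0.5) ** 2 for g in genome)

def mutate(genome, rate=0.1):
    # Perturb each gene with small Gaussian noise, with probability `rate`.
    return [g + random.gauss(0, 0.1) if random.random() < rate else g
            for g in genome]

def crossover(a, b):
    # Single-point crossover: splice a prefix of one parent onto the other.
    cut = random.randrange(1, GENOME_LEN)
    return a[:cut] + b[cut:]

# Start from a random population.
population = [[random.random() for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]

for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    parents = population[:POP_SIZE // 2]   # selection: keep the fittest half
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children        # elitism: parents survive unchanged

best = max(population, key=fitness)
```

Because the fittest half survives each generation unchanged, the best fitness in the population can never decrease, which makes even this minimal loop converge on the toy problem.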
Genetic Algorithm: A Survey [PDF]
As part of the BSc in physics, I had to write a review article on a topic of my choice.
Prize for best Philosophy student
Won the prize for the best student in Philosophy in my year at the European School of Luxembourg.
Won Warwick 1st Edition Hackathon
Wrote software that allowed users to build structures in a 3D environment, starting from base shapes such as spheres, cubes and other polyhedra. The whole environment was controlled purely through hand gestures, and it supported virtual reality, to fully immerse the user in the environment and let them visualise the structure.
Terrible Digit Recognition ANN widget!
The neural network was trained separately using Keras with a TensorFlow backend. Its structure consists of 3 hidden layers using, respectively, ReLU, Tanh and Tanh activations, with a softmax output. It was trained on the notorious MNIST dataset for 50 epochs using SGD as the optimizer, cross-entropy as the loss, and dropout layers between the hidden layers.
It attains an accuracy of 95% and a loss of 0.15 on the test data.
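For reference, the architecture described above can be sketched in Keras roughly as follows. The hidden-layer sizes and the dropout rate are assumptions (they are not stated here), but the activations, optimizer, and loss match the description.

```python
# Sketch of the described network; layer widths and dropout rate are
# illustrative assumptions, not the exact values used.
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    # Input: a flattened 28x28 MNIST image.
    layers.Dense(256, activation="relu", input_shape=(784,)),
    layers.Dropout(0.2),
    layers.Dense(128, activation="tanh"),
    layers.Dropout(0.2),
    layers.Dense(64, activation="tanh"),
    layers.Dropout(0.2),
    # Output: one probability per digit class 0-9.
    layers.Dense(10, activation="softmax"),
])

model.compile(optimizer="sgd",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
```

Training would then be a single `model.fit(x_train, y_train, epochs=50)` call on the flattened, one-hot-encoded MNIST data.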
It's interesting to see that, even though the network scores very well on the MNIST dataset, it easily gets numbers confused here. Neural networks (especially small ones like this) tend to fit the training data quite rigidly, which in many cases (as you can try for yourself below) produces large jumps in prediction from rather small strokes, often strokes that would not matter to a human (e.g. a slightly longer or shorter stroke on a 1).
In theory, this could be addressed by adding many more layers in an attempt to gain "abstraction" and therefore learn "patterns" instead of raw similarity, but new issues would arise from this, such as training time, data requirements and overfitting. This is where CNNs excel, thanks to their very different structure.
I hope you enjoy the little fun this may bring to your life.
EDIT: Unfortunately, the ITS changed something about the website, and now the canvas does not work anymore, so you can admire a nice table of 0.000 probabilities instead.
0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 |
---|---|---|---|---|---|---|---|---|---|
0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 |