Virtual Baby Feeding
Interactive virtual environments are becoming increasingly important in science, industry and the health field. In many cases, watching a video or listening to instruction is not enough to enhance a person's learning; a good medium in which to interact with the subject material is needed. Feeding problems, and the associated crying of the infant, are a major cause of anxiety for parents. Dealing with this anxiety is estimated to cost the NHS approximately £70 million a year. Feeding problems may be symptoms of difficulties in caregiver–infant relationships. Understanding a child's response to food is a key part of effective feeding and a central issue in early childhood development. This project, in collaboration with the Department of Psychology, therefore observed the interaction between mother and child and developed a novel, effective immersive experience for interactive therapy within a high-fidelity virtual environment. The environment stimulates human senses such as vision and hearing, both of which may affect the quality of interaction during feeding.
Urban design and human wellbeing
Partners: Steve Walker, Arup
Funders: Arup, EPSRC
This project developed high-fidelity multi-modal virtual environments of built urban environments to investigate the role of individual design features in perceived comfort levels in these built environments.
Interactive highly realistic virtual reality for analysis and management of paranoid thinking
Partners: Professor Swaran Singh, WMS; Dr Matthew Broome, WMS; Professor Max Birchwood, University of Birmingham
This project combined the disciplines of Visualisation and Cognitive Psychology to undertake a detailed feasibility study into how a VR-CBT (Virtual Reality Cognitive Behaviour Therapy) system could be developed for the effective treatment of schizophrenia.
A Comparative Study of the High-Fidelity Computer Reconstruction of Byzantine Art in Cyprus in the Past and Present
Partners: Dr Mark Horton, University of Bristol; Brightside
Computer reconstructions of heritage sites provide us with a means of visualising past environments, allowing us a glimpse of the past that might otherwise be difficult to appreciate. However, it is essential that these reconstructions incorporate all the physical evidence for a site; otherwise there is a very real danger of misrepresenting the past. The goal of this project was to determine whether there is a significant difference between the way people view Byzantine art today and the way it may have appeared in the past, when the works were displayed in their original environments and illuminated by candlelight, oil lamps and daylight. The results from this project provided new insights into how Byzantine art may have been viewed in the past, together with guidelines for future high-fidelity computer reconstructions of cultural heritage artefacts.
Perceptually Realistic Environments for Architectural Planning and Visual Impact Assessment
Partners: Professor Roger Hubbold, University of Manchester; Napper Architects
Computer-generated high-fidelity augmented reality visualisations of proposed architectural developments can play a significant role in helping a viewer understand the visual impact on the environment, especially in sensitive locations. Such interactive visualisations have the potential to greatly improve both the initial design process, and the subsequent planning application and public consultation processes.
The project developed novel techniques in the areas of automatic reconstruction from wide-baseline images, image-based lighting, and real-time global illumination, and was scoped by the demands of a real architectural application with our industrial partner Napper Architects. Techniques were informed by the stringent fidelity requirements of a mixed-reality application suited to providing visual impact assessment for sensitive developments.
High Dynamic Range for High Fidelity Image Synthesis of Real Scenes
Partners: Dr Marina Bloj, University of Bradford; DSTL; Insys; Dolby Canada
The computer graphics industry, and in particular those involved with films, games, simulation, virtual reality and military applications, continues to demand more realistic computer-generated images, that is, computed images that more accurately match the real scenes they are intended to represent. This is particularly challenging for images of the natural world, which presents our visual system with a wide range of colours and intensities. In many real scenes, for example looking from inside a house towards a window, the ratio between the darkest areas (e.g. inside the room) and the brightest areas (outside the window), the so-called contrast ratio, can be many thousands to one. This project used novel HDR displays to evaluate how well existing Tone Mapping Operators (TMOs) preserve the appearance of real scenes, and used the insights gained to develop new, more accurate TMOs for existing computer monitors and HDR displays. A framework was also produced that provides a straightforward, objective way of comparing real and synthetic images.
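To illustrate what a tone mapping operator does (this is a standard textbook example, not one of the TMOs developed in this project), the widely used global operator of Reinhard et al. compresses scene luminances with contrast ratios of thousands to one into the [0, 1) range of a conventional display. A minimal sketch in Python:

```python
import numpy as np

def reinhard_tmo(luminance, key=0.18, eps=1e-6):
    """Compress HDR scene luminance into [0, 1) for a standard display.

    key: target mid-grey; 0.18 is the conventional "average" scene key.
    eps: guard against log(0) for black pixels.
    """
    # Log-average (geometric mean) luminance characterises overall scene brightness
    log_avg = np.exp(np.mean(np.log(luminance + eps)))
    # Scale so the scene's log-average maps to the chosen key value
    scaled = key * luminance / log_avg
    # Compress: bright values approach 1, dark values remain near-linear
    return scaled / (1.0 + scaled)

# A scene spanning a million-to-one contrast ratio, e.g. a dim
# interior seen against a sunlit window
hdr = np.array([0.01, 1.0, 100.0, 10000.0])
ldr = reinhard_tmo(hdr)
```

The operator is monotonic, so relative brightness ordering is preserved even though the absolute contrast ratio is drastically reduced; it is this kind of appearance-versus-accuracy trade-off that the project's HDR-display evaluations were designed to measure.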
...towards Real Virtuality
Partners: Professor David Howard, University of York; Christopher Moir, WMG; Arup; IBM; SpheronVR
We rely on our senses to interact with the world around us. Do we actually need to travel somewhere to experience it fully? This project developed a "virtual cocoon" through which people can interact naturally with the world without travelling, or without being put in a particular, potentially dangerous, real situation. All five senses are stimulated to provide a rich, multi-sensory "real virtuality" experience. A key feature of this project is the attention paid to the degree of naturalness the user perceives in the virtual world. The virtual cocoon delivers low-cost, high-confidence, high-quality multi-sensory experiences directly to the user, wherever they happen to be.