
The Making Of Money and Medals VR

A collaboration between the Digital Arts and Humanities Lab, the Classics Department, and the Warwick Museum Studies Network.

On 19th February 2026 the Digital Arts and Humanities Lab ran a successful VR activity with the Classics Department, as part of a workshop for museum curators. This included two activities. In the main section, participants wore Quest headsets to explore an augmented reality exhibition of Roman coins relating to the Emperor Augustus, arranged in a timeline. It began with an intro video, and then for each coin we displayed the obverse and reverse in 3D, along with a text label explaining the significance of details on the coin. The text could also be played as an audio narration. A view from one headset was displayed on a big screen in the room. The exhibition was situated in a Roman-style room (built by Alexander O'Toole). Unfortunately we couldn't fit all of the virtual room into the physical space, which was an unusual shape, but we could display the coins against a backdrop of two walls and a Roman floor.

A second activity accompanied this, in which coins and a vase could be manipulated collaboratively (moved, resized, annotated) by two participants in VR headsets. The vase was given bouncy physics, so that it could be dropped and would bounce. Two coins were given gravity only, and were placed resting on a real physical table (mapped in the headset). A third coin was set to hover. This activity was also displayed on an iPad for observers to view.

Half an hour before the start, five students were given training in how to facilitate the exhibition, and did their jobs brilliantly.

This is a video of the first part of the exhibition: the intro video is started, then paused, before moving on to look at the coins.

How this was made

The augmented reality exhibition was created and hosted in the app Figmin XR, running on Meta Quest Pro headsets. It will also run on Meta Quest 2, 3 and 3S headsets, on iPad and iPhone, and on Windows via Steam.

Figmin is used to create and arrange virtual artefacts overlaid onto a view of the real world, in 3D space. We can add images, videos, audio, web pages, and 3D objects. Sketching, sculpting, and building in 3D are easy with the app. Objects can be locked into place, or made interactive. Objects that have been created or scanned and uploaded to the Sketchfab platform can be downloaded and added to the scene.

In this exhibition, coins were scanned and uploaded to Sketchfab by Madeline Robinson in Sydney. These very high resolution scans were then processed for use in Figmin by Robert using MeshLab 2 (reducing the number of polygons and smoothing holes). The backdrop with decorated walls and a marble floor was created in Blender by Alexander O'Toole. The text labels, video, and audio recordings used in the exhibition were created as web pages on this website, with JavaScript programming for interactivity by Robert and Alexander.

Videos can be included from YouTube. Web browser windows can be placed in the 3D space, meaning we can include interactive web content. In this exhibition the web pages have audio and video embedded, with buttons controlling the media through JavaScript.

The texts for the audio were written by Classics students as web pages, and then converted into .mp3 audio files using the Eleven Labs text-to-speech AI. This is not perfect, and needs some fine-tuning so that the AI interprets Latin words and Roman names correctly. The web pages are embedded into the Figmin virtual space as "web apps": stand-alone web page views that can be positioned anywhere. We resized each to portrait dimensions, and set the text to small so as to fit the whole page in the view. Each window was then enlarged so that the text would be easy to read from a distance. Play, pause, and restart buttons were embedded into the pages, with the control logic written in JavaScript (with Alexander O'Toole). Participants could then control each media item themselves. Note that audio embedded into web apps is not spatial (the audio doesn't seem to play from the direction of the web app, and doesn't get louder as the participant moves towards it). Audio embedded in YouTube videos added to the virtual space, by contrast, is spatial by default.
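The control logic for these buttons can be sketched as follows. This is a minimal illustration rather than the project's actual code: `makeAudioControls` is a hypothetical helper, and in a real page the returned handlers would be wired to the embedded page's buttons and its HTML `<audio>` element.

```javascript
// A minimal sketch of play / pause / restart control logic for an embedded
// narration. `audio` can be any object with the HTMLAudioElement-style
// interface used here: play(), pause(), and a writable currentTime.
function makeAudioControls(audio) {
  return {
    play() { audio.play(); },
    pause() { audio.pause(); },
    restart() {
      audio.currentTime = 0;   // jump back to the start...
      audio.play();            // ...and play again from there
    },
  };
}

// In a browser page the handlers would be wired to buttons, e.g.:
//   const controls = makeAudioControls(document.querySelector('audio'));
//   document.getElementById('play').onclick = controls.play;
//   document.getElementById('restart').onclick = controls.restart;
```

The same pattern extends to the embedded videos, since HTML video elements expose the same `play()`/`pause()`/`currentTime` interface.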

The exhibition was then built and replicated onto headsets. It can be "popped up" in any physical space. The physical space we used was a bit challenging, with an area at the back open to the whole building. We also had to alter the position of spotlights (with a big stick) to prevent glare. A closed and uncluttered space is best.

All objects were locked into place, and Figmin was set to kiosk mode on the headsets, hiding all of the editing tools.

Just a couple of weeks before we started on this, the Figmin XR developer released a new scripting API, meaning that it is now possible to write code that runs in the 3D space to control interactions. We will be able to use this to add pop-up messages on hotspots on the coins. In the longer term we will be able to build much more sophisticated interactions, including elements of gamification. During the making of the exhibition, Alexander O'Toole experimented with the new API, manipulating coins through coded interactions.

The "play area" part of the experience was also hosted in Figmin XR, with physics properties added to the 3D artefacts. Two headsets were used for this, set to Edit mode. Participants also explored the sketching and annotation tools. In this case, we created a shared space in which both headsets operated, so that the participants could see each other's virtual avatars and manipulate objects together. This required some trial-and-error experimentation to get the scene in the two headsets lined up, so that the physical locations of the participants matched their virtual locations. The system may also be used with remote participants, so that people anywhere in the world can collaborate in a single virtual space.
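The lining-up problem can be illustrated with a small sketch. This is not how Figmin works internally (our calibration was done by hand, as described above); it just shows the underlying maths, under the assumption that both participants could mark the same two physical points on the floor plane in their own coordinate frames.

```javascript
// Hypothetical sketch: aligning two headsets' coordinate frames on the
// floor plane. If both participants mark the same two physical points
// (say, two corners of the table), we can compute the rotation and
// translation that maps headset B's coordinates into headset A's frame.
// Points are {x, z} pairs on the floor plane.
function alignFrames(a1, a2, b1, b2) {
  // Difference in heading of the segment between the two marked points.
  const theta = Math.atan2(a2.z - a1.z, a2.x - a1.x)
              - Math.atan2(b2.z - b1.z, b2.x - b1.x);
  const cos = Math.cos(theta), sin = Math.sin(theta);
  // Translation that carries the rotated b1 onto a1.
  const tx = a1.x - (cos * b1.x - sin * b1.z);
  const tz = a1.z - (sin * b1.x + cos * b1.z);
  // Return a function mapping any point from frame B into frame A.
  return (p) => ({ x: cos * p.x - sin * p.z + tx,
                   z: sin * p.x + cos * p.z + tz });
}
```

By construction the returned mapping carries B's view of the two marked points exactly onto A's, so any object placed relative to one frame can be expressed in the other.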

Costs

Quest 3 headsets cost approximately £400. The cheaper but still adequate Quest 3S can be used, costing around £300. Headsets can often be bought for much less. We are using older Quest Pro headsets (cost £1500), which are not quite as good as the newer Quest 3.

Figmin XR costs £15 per headset, and £9 per iPad/iPhone. Professional developers often use the Windows/Steam version, for more precise designing.

Blender, Sketchfab, and MeshLab are free.

There are many free-to-use 3D models on Sketchfab.

Credits

Organised by Professor Clare Rowan, Murray Silk, Jemima Gurney, Emily Cornish, Margot Cowling, and Issy Harper.

With thanks to Dr Robert O'Toole and Alexander O'Toole for setting up the Virtual Roman-ity of this exhibition.

This is a Digital Arts and Humanities Lab production.

The materials used for the walls and floor were created by wolfgar74 and downloaded from Sketchfab. The roof texture was made by CATholic.

The Roman Army music in the intro video was made by The Fealdo Project and downloaded from Pixabay.
