
The Microscope Sings / Text to Music

 

This is an ongoing project aimed at making music directly from other data or information. The latest phase of the project (Text to Music) is based on work by three University of Warwick Physics students, Alex Simpson, John Bamping and Joe Marsh Rossney. They explored the generation of music directly from text, using phonemes (the basic sounds of a natural language) as an intermediary. As far as we know, this approach has not been tried before. The project was supported by IATL via the Science of Music interdisciplinary module. Some samples of work produced in this phase of the project are available on Soundcloud.
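As a rough illustration of the phoneme idea only (not the mapping actually used by the students), a sketch might assign each phoneme a pitch so that a word's phoneme sequence becomes a short melody. The phoneme subset and note numbers below are invented for this example.

    # Illustrative sketch: map a few ARPAbet-style phonemes to MIDI note
    # numbers. The phoneme set and pitch assignments are invented for this
    # example and are not the project's actual mapping.
    PHONEME_PITCH = {
        "AA": 57, "EH": 60, "IY": 64, "OW": 67, "UW": 69,   # vowels -> scale tones
        "K": 48, "T": 50, "S": 52, "M": 55, "N": 59,        # consonants -> lower notes
    }

    def phonemes_to_notes(phonemes):
        """Turn a phoneme sequence into a list of MIDI note numbers."""
        return [PHONEME_PITCH[p] for p in phonemes if p in PHONEME_PITCH]

    # "music" rendered as a rough phoneme sequence
    print(phonemes_to_notes(["M", "IY", "UW", "Z", "IH", "K"]))

In a fuller version, the phoneme sequence would be extracted automatically from the input text and the pitch mapping chosen to suit a musical scale or style.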

The first phase of the project (The Microscope Sings) was based on work carried out over two summers by three University of Warwick students and supervised by Dr. Gavin Bell. It has been generously supported by the university's URSS scheme, the Nuffield Foundation and the Department of Physics. We explored the generation of music from images. The project has yielded a framework for the sonification process, named SAMI, and a wide variety of scripts which can be used within the framework to generate music.

What does "sonify" mean? What is "sonification"?

Sonification is the process of turning data into a non-speech audible signal to help convey information or process a data stream. In this project we have concentrated on turning images into music, but it is possible to sonify any data set, as several other projects have shown. For instance, NASA used sonification to study the solar wind. There is also a lot of interest in algorithmic composition, i.e. the production of music without human intervention. For example, one can use evolutionary algorithms to mutate a random sequence towards a particular style of music: Melomics is a nice example. Of course, the "evolutionary fitness" of each generation is defined against a human standard (genre patterns).
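As a minimal example of what sonification means in practice, the sketch below (using only the Python standard library, and an arbitrary pitch mapping that is not tied to any project mentioned here) turns a one-dimensional data series into a short sequence of tones written to a WAV file.

    # Minimal sonification sketch: a 1-D data series is mapped to pitch,
    # producing a short WAV file. The scaling choices (frequency range,
    # note length) are arbitrary and purely illustrative.
    import math, struct, wave

    RATE = 44100          # samples per second
    NOTE_SECONDS = 0.25   # duration of each data point

    def sonify(data, filename="sonified.wav"):
        lo, hi = min(data), max(data)
        frames = bytearray()
        for value in data:
            # map the value linearly onto a 220-880 Hz frequency range
            frac = (value - lo) / (hi - lo) if hi != lo else 0.5
            freq = 220.0 + frac * 660.0
            for n in range(int(RATE * NOTE_SECONDS)):
                sample = int(32000 * math.sin(2 * math.pi * freq * n / RATE))
                frames += struct.pack("<h", sample)
        with wave.open(filename, "wb") as wav:
            wav.setnchannels(1)
            wav.setsampwidth(2)
            wav.setframerate(RATE)
            wav.writeframes(bytes(frames))

    sonify([3, 1, 4, 1, 5, 9, 2, 6])   # any numeric series will do

Here higher data values simply become higher pitches; more interesting mappings (to rhythm, timbre or harmony) are what distinguish musical sonification from a plain audio readout.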

What is SAMI?

SAMI Ain't a Musical Instrument. SAMI is a framework for sonification, written by Jack Dobinson and Simon StJohn-Green in the summer of 2010. It allows an image file to be loaded and facilitates the passing of variables between Python scripts. A selection of Python sonification scripts has been written for use with SAMI.
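SAMI's actual interface is described in its README. Purely to illustrate the kind of image-to-music mapping such a script might perform, the standalone sketch below (using the Pillow imaging library and a hypothetical input file name) maps the average brightness of each image column to a MIDI note number.

    # Rough illustration of image sonification, independent of SAMI's own
    # interface: the average brightness of each image column becomes a
    # MIDI note number, read left to right like a score.
    from PIL import Image   # Pillow; any image loader would do

    def image_to_notes(path, n_notes=32, low=36, high=84):
        img = Image.open(path).convert("L")           # greyscale
        img = img.resize((n_notes, img.size[1]))      # one column per note
        width, height = img.size
        pixels = list(img.getdata())
        notes = []
        for x in range(width):
            column = [pixels[y * width + x] for y in range(height)]
            brightness = sum(column) / len(column) / 255.0
            notes.append(int(low + brightness * (high - low)))
        return notes

    print(image_to_notes("micrograph.png"))   # hypothetical input file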

Who are we?

The concept for the project was conceived by Dr. Gavin Bell. Jack Dobinson and Simon StJohn-Green started the project in the summer of 2010, creating SAMI and a number of Python scripts. The project was then continued by Robert Wilson the following summer, during which the focus was solely on writing new scripts.

The SAMI Team!


Downloads:

SAMI (SAMI Ain't a Musical Instrument), our image sonification framework. This one requires a bit of effort to compile, but full instructions and explanations are included in the README.

(SAMI uses the MIB Ossigeno KDE icon theme and Qt, and as such is released under the GPL; the Python scripts require mxm's MIDI library.)