The History of Analogue Computing

It has become commonplace to apply the terms analogue and digital to various technologies. During the twentieth century, this classification was applied to computing technology, giving rise to two quite separate technical cultures.

From the perspective of twenty-first-century technology, analogue computing is often assumed to be a technology based on a continuous representation of state. However, evidence suggests an older understanding of analogue computing, one that draws more directly on the term's shared root with analogy.

This project investigates the emergence of the distinction between the technologies we now know as analogue and digital, and how the computing community's understanding of this distinction developed over time. The research should provide a deeper understanding of what it means to compute by analogy, and begin to explore how analogue views on the nature of computation relate to the more algorithmic definition common in Computer Science.

Investigator: Charles Care (PhD thesis)