During my Creative Automata course in grad school, we explored digital signals and automata theory. For the final project, we needed to create something that demonstrated everything we’d learned throughout the semester. I decided to push the concept to its extreme: what if you could create music using nothing but logic gates?

(Note: I’ve reconstructed this from pictures and files I had saved - unfortunately, I no longer have the original project files.)

Introduction

At its core, playing music is executing a sequence of instructions at specific times. Press this key now, hold that note for two beats, rest for one measure. It’s fundamentally algorithmic. Computers, at their lowest level, are just logic gates processing signals - yet they can execute incredibly complex instruction sets. This got me thinking: could I design a system that plays music using only logic gates and signals?

The concept was to map input pulses directly to musical notes. I had a guitar readily available, so I started there, but the principle should translate to any instrument - we could theoretically even design new instruments optimized specifically for logic gate control.

Development and Challenges

The initial development was straightforward, but I quickly discovered several factors I hadn’t accounted for in early planning:

Editor Limitations: I chose Unity as the development platform since it’s highly customizable, but building a circuit editor from scratch meant doing most of the heavy lifting myself. Unity provided the canvas, but I had to create all the circuit-specific functionality.

Save/Load System: Embarrassingly, I didn’t realize until late in development that users would need to save and load their circuit designs. This fundamental feature didn’t make it into the initial release - a classic case of being too focused on the core mechanics to think about basic usability.

Timing System Constraints: The pulse node timing system I designed was overly simplistic. It could only handle basic patterns with regular intervals, severely limiting the complexity of music that could be played. Only repetitive beats worked well - anything with varied timing or syncopation was effectively impossible.

How It Works

The system uses a visual circuit editor where each black node generates pulses. When a pulse fires, it’s interpreted as a TRUE signal; between pulses, the signal is FALSE. Each node has a configurable start delay and repeat interval. Users build their circuit, press the “Start” button, and the timer system activates, sending pulses at the appropriate times to trigger notes.
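The original C# is long gone (see the note above), but the timing model is simple enough to sketch. Here is a minimal Python reconstruction - the names `PulseNode` and `fires_at` are illustrative, not from the project:

```python
class PulseNode:
    """A pulse source with a configurable start delay and repeat interval."""

    def __init__(self, start_delay, interval):
        self.start_delay = start_delay  # beats before the first pulse
        self.interval = interval        # beats between subsequent pulses

    def fires_at(self, beat):
        """TRUE exactly on a pulse; FALSE between pulses."""
        if beat < self.start_delay:
            return False
        return (beat - self.start_delay) % self.interval == 0


# A node that first fires on beat 3 and repeats every 3 beats:
node = PulseNode(start_delay=3, interval=3)
print([b for b in range(1, 10) if node.fires_at(b)])  # [3, 6, 9]
```

Pressing "Start" amounts to running a clock over the beats and asking each node this question in turn.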

Designing “Smoke on the Water”

For my benchmark test, I wanted to recreate the iconic opening riff of “Smoke on the Water” by Deep Purple. Since no documented approach existed for this kind of system, finding an optimal circuit design required significant experimentation.

The naive approach - one node per note - would work but would create massive, unwieldy circuits. A more elegant solution emerged: use four pulse nodes like four fingers playing a guitar, mapping to the four notes needed for the riff.

The timing breakdown for the riff (notes: A, C, D – A, C, D#, D – A, C, D, C, A):

  • A appears at beats 1, 4, 7, 11 → Offset: 1, Delay: 3 (with special handling for the final note)
  • C appears at beats 2, 5, 8, 10 → Offset: 2, Delay: 3 (with special handling for the final note)
  • D appears at beats 3, 6, 9 → Offset: 3, Delay: 3 (with an exception for the final note)
  • D# appears at beat 5.5 → Offset: 5.5, Delay: 11 (occurs between regular beats)
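Under this offset-and-delay model, a node's fire beats are just offset + k·delay. A quick Python check (a reconstruction - the helper name is mine, not from the project) shows how close the regular pattern gets, and where the special cases come in:

```python
def fire_beats(offset, delay, last_beat):
    """Beats at which a node with the given offset and repeat delay fires."""
    beats, b = [], offset
    while b <= last_beat:
        beats.append(b)
        b += delay
    return beats

# Offset/delay settings for the four nodes over the 11-beat riff:
print(fire_beats(1, 3, 11))     # A:  [1, 4, 7, 10]  -> the riff wants 11, not 10
print(fire_beats(2, 3, 11))     # C:  [2, 5, 8, 11]  -> the riff wants 10, not 11
print(fire_beats(3, 3, 9))      # D:  [3, 6, 9]      -> matches exactly
print(fire_beats(5.5, 11, 11))  # D#: [5.5]          -> one pulse per loop
```

The regular pattern would swap the final C and A, which is exactly the kind of irregularity the periodic model couldn't express without special handling.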

This approach worked, but the edge cases and the special handling they required highlighted the limitations of my simple timing system.

Technical Implementation

This project used a very early version of Unity where the 2D framework was still in its infancy. The Line Renderer component didn’t support 2D projects yet, so I had to implement custom OpenGL code to draw connections between circuit nodes. Given two points (x1, y1) and (x2, y2), I needed to render lines connecting them - something that would be trivial in modern Unity but required low-level graphics programming back then.

Software used:

  • Unity 3D
  • C#
  • Photoshop
  • Audacity

Future Development Ideas

Looking back, several extensions could significantly expand the system’s capabilities:

Multi-Instrument Support: The circuit is currently mapped to guitar, but it would be fascinating to explore how circuit designs would differ across instruments. A piano circuit might look completely different from a drum kit circuit, and the optimal logic gate arrangements could vary dramatically.

Enhanced Timing System: The biggest limitation was the simplistic offset-and-delay timing model. A more sophisticated approach would use a configuration file that explicitly defines when each node fires, rather than relying on periodic intervals. This would unlock the ability to play complex rhythms and varied musical patterns, taking the system from a proof-of-concept to a genuinely useful musical tool.
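As a sketch of that idea (assuming a JSON config, which the project never actually had), each node would carry an explicit list of fire beats instead of an offset and delay:

```python
import json

# Hypothetical config: each node lists exactly when it fires, so irregular
# rhythms need no special-case logic in the timing system.
config = json.loads("""
{
  "A":  [1, 4, 7, 11],
  "C":  [2, 5, 8, 10],
  "D":  [3, 6, 9],
  "D#": [5.5]
}
""")

def notes_at(beat):
    """All notes whose schedule includes this beat."""
    return [note for note, beats in config.items() if beat in beats]

print(notes_at(5.5))  # ['D#']
print(notes_at(10))   # ['C']
```

The trade-off is verbosity - every beat is written out by hand - but the expressive power covers syncopation and varied timing that the periodic model could never reach.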