Web Audio API · Interactive Course

Sound
From Scratch

Build a synthesizer in the browser, piece by piece. Each chapter adds one concept — and one working instrument you can play right now. No libraries. No frameworks. Just the browser.

AudioContext OscillatorNode GainNode ADSR Envelopes BiquadFilterNode DelayNode AnalyserNode
A Note From Me

Why The Browser Wins

I'm a producer first. I've spent years making music in DAWs — shaping envelopes, automating filter sweeps, designing kicks from the oscillator up. So when I started getting serious about building my own audio tools as a developer, the natural instinct was to reach for the "proper" solution: JUCE, the C++ framework that powers most serious audio software. I'm going to save you some time — it's an abominable mess. You spend three hours configuring a build system before you hear a single sample. The documentation assumes you already have a PhD in signal processing. And when something breaks, you're debugging across five abstraction layers with no feedback loop whatsoever. It's not that JUCE is bad — it's that it was built for a world where the only people writing audio software were audio engineers with decades of experience. That world doesn't need to exist anymore.

The Web Audio API changed my thinking completely. It lives in the browser — the most widely distributed runtime on the planet. The latency is surprisingly good. The node graph model maps almost one-to-one onto the kind of signal routing any producer already understands intuitively. You write a function, you hear the result, you tweak it, you hear it again. The feedback loop is instant. More importantly, what you build is immediately deployable and immediately accessible to anyone with a browser — no downloads, no installs, no platform targets. That's not a compromise. That's a superpower. This is what Fenelite Pro runs on. This is what my web audio tools are built with. And this is what I'm going to walk you through — from the first oscillator all the way to a full synthesizer with filter, envelope, delay and keyboard, built entirely in the browser.

Each chapter here adds exactly one concept and one demo you can play with immediately. By the end, you'll understand the full signal chain — and you'll have built it with your own hands. No library required.

Chapters
01

The AudioContext

What is it?

The browser has a built-in audio engine called the Web Audio API. Before you can make any sound, you need to create an AudioContext — think of it as your DAW project. It controls the clock, the output routing, and the sample rate.

From the context, you create audio nodes — generators, processors, analyzers — and wire them together like patch cables in a modular synth. The signal flows from source → processing → destination.

Browsers require a user gesture before allowing audio — a button click, keypress, or touch all work. An AudioContext created before any interaction starts out suspended; calling resume() inside a gesture handler unlocks it.

// Create the context (once, globally)
const ctx = new AudioContext();

// Create an oscillator node
const osc = ctx.createOscillator();

// Create a gain node (volume control)
const gain = ctx.createGain();
gain.gain.setValueAtTime(0.3, ctx.currentTime);

// Wire them up: osc → gain → speakers
osc.connect(gain);
gain.connect(ctx.destination);

// Start and stop
osc.start();
osc.stop(ctx.currentTime + 1.5);
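Because of the gesture requirement, a common pattern is to resume the context inside a click handler — a minimal sketch, assuming a start button with id `start` (the id is hypothetical; any gesture works):

```javascript
// Hypothetical start button; ctx is the global AudioContext from above
document.querySelector('#start').addEventListener('click', async () => {
  if (ctx.state === 'suspended') {
    await ctx.resume();   // unlock audio inside the user gesture
  }
});
```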
OscillatorNode GainNode AudioDestinationNode
demo — first sound

Hit the button. That's a 440Hz sine wave — the A above middle C.

02

Pitch & Frequency

Hz = Pitch

Frequency is measured in Hertz (Hz) — cycles per second. The higher the frequency, the higher the pitch. The A above middle C (A4) is 440Hz. An octave up is exactly double: 880Hz. An octave down is half: 220Hz.

The Web Audio API gives you full control over an oscillator's frequency parameter in real time — including smooth glides and precise scheduling.

const osc = ctx.createOscillator();

// Set frequency directly
osc.frequency.setValueAtTime(440, ctx.currentTime);

// Or smoothly glide to a new pitch over 0.1s
osc.frequency.linearRampToValueAtTime(880, ctx.currentTime + 0.1);
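One caveat: a linear ramp in Hz doesn't sound linear to the ear, because pitch perception is logarithmic. For glides, exponentialRampToValueAtTime usually sounds more even — a sketch (note the target must be greater than zero):

```javascript
// Exponential ramps move at a constant rate per octave,
// matching how we hear pitch. The target must be non-zero.
osc.frequency.setValueAtTime(440, ctx.currentTime);
osc.frequency.exponentialRampToValueAtTime(880, ctx.currentTime + 0.1);
```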
demo — pitch playground
Frequency 440 Hz
03

Waveforms

Shape = Timbre

A waveform describes how air pressure changes over time. Different shapes produce different timbres — the texture of a sound. The oscillator's type property controls this.

Sine — pure fundamental only, smooth and soft. Square — hollow and buzzy, odd harmonics only. Sawtooth — bright and harsh, all harmonics present. Triangle — odd harmonics only, like the square, but fading off much faster, so it sits between sine and square.

osc.type = 'sine';      // smooth, pure
osc.type = 'square';    // hollow, buzzy
osc.type = 'sawtooth';  // bright, harsh
osc.type = 'triangle';  // soft, muted
demo — waveforms + oscilloscope
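An oscilloscope like the one in this demo can be driven by an AnalyserNode — a sketch, assuming the osc and ctx from earlier, with the canvas drawing left as a comment:

```javascript
const analyser = ctx.createAnalyser();
analyser.fftSize = 2048;                 // samples per snapshot
osc.connect(analyser);                   // tap the signal (doesn't affect audio)

const buf = new Float32Array(analyser.fftSize);
function draw() {
  analyser.getFloatTimeDomainData(buf);  // raw waveform samples, -1..1
  // …plot buf onto a <canvas> here…
  requestAnimationFrame(draw);
}
draw();
```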
04

Gain & ADSR Envelopes

Shaping Volume Over Time

A GainNode multiplies the amplitude of any signal flowing through it. But the real power is in automating that gain — scheduling precise changes over time to sculpt how a sound evolves.

ADSR (Attack, Decay, Sustain, Release) is the classic envelope shape. It's what makes a plucked string sound different from a held pad, even at the same pitch. You apply it by scheduling gain ramps around a note's trigger and release.

function triggerNote(freq, { attack, decay, sustain, release }) {
  const osc  = ctx.createOscillator();
  const env  = ctx.createGain();
  const now  = ctx.currentTime;

  osc.connect(env).connect(ctx.destination);
  osc.frequency.value = freq;

  // Attack → Decay → Sustain on the gain parameter
  env.gain.setValueAtTime(0, now);
  env.gain.linearRampToValueAtTime(0.8, now + attack);
  env.gain.linearRampToValueAtTime(sustain, now + attack + decay);

  osc.start(now);

  // The note holds at the sustain level until you call this:
  return function noteOff() {
    const t = ctx.currentTime;
    env.gain.cancelScheduledValues(t);
    env.gain.setValueAtTime(env.gain.value, t); // release from wherever we are
    env.gain.linearRampToValueAtTime(0, t + release);
    osc.stop(t + release + 0.05);
  };
}
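Linear ramps work, but release stages often sound more natural with an exponential shape. One option is setTargetAtTime, which approaches a target asymptotically with a time constant — a sketch, reusing the env gain node from above:

```javascript
// Exponential decay toward zero: a pluck-like release.
// The third argument is the time constant in seconds — the curve
// covers ~63% of the remaining distance every timeConstant.
env.gain.setValueAtTime(0.8, ctx.currentTime);
env.gain.setTargetAtTime(0, ctx.currentTime, 0.15);
```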
demo — adsr envelope
Attack 0.01s
Decay 0.1s
Sustain 0.6
Release 0.3s
05

The Keyboard

Notes are just frequencies

Every piano key maps to a specific frequency via the equal temperament formula. MIDI note 69 is A4 (440Hz). Every semitone multiplies by the twelfth root of two (~1.0595).

// MIDI note number → frequency
function midiToFreq(midi) {
  return 440 * Math.pow(2, (midi - 69) / 12);
}

midiToFreq(60); // → 261.63  (middle C)
midiToFreq(69); // → 440.00  (A4)
midiToFreq(81); // → 880.00  (A5)

Keyboard mapping: use the bottom row of your keyboard (A S D F G H J K L ;) for white keys starting at C4, and the top row (W E T Y U O P) for black keys.
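That mapping can be wired up with a plain keydown listener — a sketch covering the first octave; playNote is a placeholder for whatever trigger function your synth uses:

```javascript
// midiToFreq from above, repeated so this sketch is self-contained
const midiToFreq = (m) => 440 * Math.pow(2, (m - 69) / 12);

// One octave of the mapping above: bottom-row white keys and
// top-row black keys interleaved, starting at C4 = MIDI 60
const KEYS = { a: 60, w: 61, s: 62, e: 63, d: 64, f: 65, t: 66,
               g: 67, y: 68, h: 69, u: 70, j: 71, k: 72 };

// playNote is a placeholder for your own trigger function
function onKeyDown(e, playNote) {
  if (e.repeat) return;            // ignore held-key auto-repeat
  const midi = KEYS[e.key];
  if (midi !== undefined) playNote(midiToFreq(midi));
}

// In the browser:
// window.addEventListener('keydown', e => onKeyDown(e, playNote));
```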

demo — play me
idle — click a key or use your keyboard
06

Filters

Sculpting the Frequency Spectrum

A BiquadFilterNode removes or boosts specific frequency ranges. A lowpass filter cuts everything above the cutoff frequency — turning a harsh sawtooth into something warm and rounded.

The Q (resonance) parameter boosts frequencies right at the cutoff, creating that classic synth sweep sound. High Q values produce a sharp, whistling resonance peak.

const filter = ctx.createBiquadFilter();
filter.type = 'lowpass';
filter.frequency.value = 800;  // cutoff in Hz
filter.Q.value = 5;            // resonance

// Insert between oscillator and gain:
osc.connect(filter);
filter.connect(gain);
gain.connect(ctx.destination);
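The classic sweep described above is just automation on the cutoff parameter — for example (a sketch, using the filter from above):

```javascript
// Sweep the cutoff from 200 Hz up to 4 kHz over two seconds.
// An exponential ramp suits cutoff, which we also hear logarithmically.
filter.frequency.setValueAtTime(200, ctx.currentTime);
filter.frequency.exponentialRampToValueAtTime(4000, ctx.currentTime + 2);
```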
OscillatorNode BiquadFilterNode GainNode AudioDestinationNode
demo — filtered keyboard
Cutoff 1200 Hz
Resonance (Q) 1.0
07

The Full Patch

Everything connected

Now we wire it all together. Oscillator → Filter → Envelope → Delay → Output. Each node you've learned becomes a module in the chain. This is fundamentally how every synthesizer — hardware or software — is built.

OscillatorNode BiquadFilterNode GainNode (env) DelayNode Destination
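The one node in this chain that hasn't appeared in code yet is the DelayNode. A feedback delay routes the delayed signal back into itself through a gain below 1, so each echo comes back quieter — a sketch, assuming env is the envelope GainNode from chapter 04:

```javascript
const delay = ctx.createDelay(2.0);   // max delay time in seconds
delay.delayTime.value = 0.3;

const feedback = ctx.createGain();
feedback.gain.value = 0.3;            // echo decay per repeat; keep below 1!

const wet = ctx.createGain();
wet.gain.value = 0.3;                 // delay mix level

delay.connect(feedback);
feedback.connect(delay);              // the feedback loop

env.connect(ctx.destination);         // dry path straight through
env.connect(delay);                   // wet path through the delay
delay.connect(wet);
wet.connect(ctx.destination);
```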
demo — full synth
Oscillator
Filter
Cutoff 2000
Resonance 1.0
Envelope
A 0.01s
D 0.1s
S 0.6
R 0.3s
Delay
Time 0.3s
Feedback 0.3
Mix 0.3