Tone.js

Tone.js is a Web Audio framework for creating interactive music in the browser. The architecture of Tone.js aims to be familiar to both musicians and audio programmers looking to create web-based audio applications. At a high level, Tone offers common DAW (digital audio workstation) features like a global transport for scheduling events and prebuilt synths and effects. For signal-processing programmers (coming from languages like Max/MSP), Tone provides a wealth of high-performance building blocks to create your own synthesizers, effects, and complex control signals.

API

Examples

Demos

Installation

  • download
  • npm install tone
  • dev -> npm install tone@next

Importing

You can import the entire library:

import * as Tone from "tone";

or individual modules:

import { Synth } from "tone";

Hello Tone

//create a synth and connect it to the master output (your speakers)
const synth = new Tone.Synth().toDestination();

//play a middle 'C' for the duration of an 8th note
synth.triggerAttackRelease("C4", "8n");

Tone.Synth

Tone.Synth is a basic synthesizer with a single oscillator and an ADSR envelope.

triggerAttackRelease

triggerAttackRelease is a combination of two methods: triggerAttack, when the amplitude is rising (for example from a 'key down' or 'note on' event), and triggerRelease, when the amplitude is going back to 0 ('key up' / 'note off').

The first argument to triggerAttackRelease is the frequency, which can be either a number (like 440) or a string in "pitch-octave" notation (like "D#2"). The second argument is the duration that the note is held. This value can be either in seconds or a tempo-relative value. The third (optional) argument is when along the AudioContext's timeline the note should play; it can be used to schedule events in the future.
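
For example, the note could be scheduled slightly in the future, something like this (Tone.now() returns the current AudioContext time):

//play "D#2" for a quarter-note, starting half a second from now
synth.triggerAttackRelease("D#2", "4n", Tone.now() + 0.5);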

Time

Tone.js abstracts away the AudioContext time. Instead of requiring all values to be defined in seconds, any method which takes time as an argument can accept a number or a string. For example "4n" is a quarter-note, "8t" is an eighth-note triplet, and "1m" is one measure.
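
As a quick sketch, a tempo-relative value can be converted to seconds with Tone.Time (assuming the default transport tempo of 120 BPM):

//"4n" is a quarter-note: 0.5 seconds at 120 BPM
console.log(Tone.Time("4n").toSeconds());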

Read about Time encodings.

Starting Audio

Browsers will not play any audio until a user clicks something (like a play button) and the AudioContext has had a chance to start. Run your Tone.js code only after calling Tone.start() from an event listener which is triggered by a user action such as "click" or "keydown".

Tone.start() returns a promise; the audio will be ready only after that promise resolves. Scheduling or playing audio before the AudioContext is running will result in silence or incorrect scheduling.

//attach a click listener to a play button
document.querySelector('button').addEventListener('click', async () => {
	await Tone.start()
	console.log('audio is ready')
})

Scheduling

Transport

Tone.Transport is the master timekeeper, allowing for application-wide synchronization and scheduling of sources, signals and events along a shared timeline. Time expressions (like the ones above) are evaluated against the Transport's BPM, which can be set like this: Tone.Transport.bpm.value = 120.
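
The Transport can also schedule one-off callbacks along its timeline; a minimal sketch:

//set the tempo and log a message one measure after the Transport starts
Tone.Transport.bpm.value = 120;
Tone.Transport.schedule(time => {
	//the sample-accurate time is passed into the callback
	console.log("one measure in", time);
}, "1m");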

Loops

Tone.js provides higher-level abstractions for scheduling events. Tone.Loop is a simple way to create a looped callback that can be scheduled to start and stop.

const synth = new Tone.Synth().toDestination();
//play a note every quarter-note
const loop = new Tone.Loop(time => {
	synth.triggerAttackRelease("C2", "8n", time);
}, "4n");

Since JavaScript callbacks are not precisely timed, the sample-accurate time of the event is passed into the callback function. Use this time value to schedule the events.

You can then start and stop the loop along the Transport's timeline.

//loop between the first and fourth measures of the Transport's timeline
loop.start("1m").stop("4m");

Then start the Transport to hear the loop:

Tone.Transport.start();

Read about Tone.js' Event classes and scheduling events with the Transport.

Instruments

//pass in some initial values for the oscillator and envelope
const synth = new Tone.Synth({
	oscillator : {
		type : "pwm",
		modulationFrequency : 0.2
	},
	envelope : {
		attack : 0.02,
		decay : 0.1,
		sustain : 0.2,
		release : 0.9,
	}
}).toDestination();

//start the note "D3" one second from now
synth.triggerAttack("D3", "+1");

All instruments are monophonic (one voice), but any instrument can be made polyphonic by passing its constructor as the first argument to Tone.PolySynth.

//a 4 voice Synth
const polySynth = new Tone.PolySynth(Tone.Synth).toDestination();
//play a chord
polySynth.triggerAttackRelease(["C4", "E4", "G4", "B4"], "2n");

Read more about Instruments.

Effects

In the above examples, the synthesizer was always connected directly to the speakers, but the output of the synth could also be routed through one (or more) effects before going to the speakers.

//create a distortion effect
const distortion = new Tone.Distortion(0.4).toDestination();
//connect a synth to the distortion
synth.connect(distortion);
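
The output can also be routed through several effects in series, for example with chain() (the effects below are chosen purely for illustration):

//route the synth through a delay and then a chorus before the speakers
const delay = new Tone.FeedbackDelay("8n", 0.5);
const chorus = new Tone.Chorus().toDestination().start();
synth.chain(delay, chorus);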

Read more about Effects.

Sources

Tone has a few basic audio sources: Tone.Oscillator, which supports sine, square, triangle, and sawtooth waveforms; a buffer player (Tone.Player); a noise generator (Tone.Noise); a few additional oscillator types (pwm, pulse, fat, fm); and external audio input (when WebRTC is supported).

//a pwm oscillator which is connected to the speaker and started right away
const pwm = new Tone.PWMOscillator("Bb3").toDestination().start();
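
The buffer player, for instance, can load and play back a sound file (the URL below is just a placeholder):

//load a sample and play it as soon as the buffer is ready
const player = new Tone.Player("path/to/sample.mp3").toDestination();
player.autostart = true;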

Read more.

Signals

Like the underlying Web Audio API, Tone.js is built with audio-rate signal control over nearly everything. This is a powerful feature which allows for sample-accurate synchronization and scheduling of parameters.
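
For example, signal-valued parameters like an oscillator's frequency can be ramped smoothly, something like:

//glide the frequency from C4 up to C5 over 2 seconds
const osc = new Tone.Oscillator("C4").toDestination().start();
osc.frequency.rampTo("C5", 2);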

Read more.

AudioContext

Tone.js creates an AudioContext when it loads and shims it for maximum browser compatibility using standardized-audio-context. The AudioContext can be accessed at Tone.context, or you can set your own AudioContext using Tone.setContext(audioContext).
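
For example, to hand Tone.js a context created elsewhere in an application (a minimal sketch):

//set the context before creating any Tone.js nodes
const audioContext = new AudioContext();
Tone.setContext(audioContext);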

MIDI

To use MIDI files, you'll first need to convert them with Midi into a JSON format which Tone.js can understand.
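
As a rough sketch, the converted data could then be scheduled on a synth (this assumes the separate @tonejs/midi package; the file path is a placeholder):

import { Midi } from "@tonejs/midi";

//load and parse a MIDI file, then schedule its first track on a PolySynth
const midi = await Midi.fromUrl("path/to/file.mid");
const midiSynth = new Tone.PolySynth(Tone.Synth).toDestination();
const now = Tone.now();
midi.tracks[0].notes.forEach(note => {
	midiSynth.triggerAttackRelease(note.name, note.duration, note.time + now, note.velocity);
});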

Performance

Tone.js makes extensive use of the native Web Audio Nodes such as the GainNode and WaveShaperNode for all signal processing, which enables Tone.js to work well on both desktop and mobile browsers.

This wiki article has some best practices and suggestions related to performance.

Testing

Tone.js runs an extensive test suite using mocha and chai with nearly 100% coverage. Each commit and pull request is run on Travis-CI across browsers and versions. Passing builds on the 'dev' branch are published on npm as tone@next.

Contributing

There are many ways to contribute to Tone.js. Check out this wiki if you're interested.

If you have questions (or answers) that are not necessarily bugs/issues, please post them to the forum.

References and Inspiration