Web Audio API

Summary

The Web Audio API is a high-level JavaScript API for processing and synthesizing audio in web applications.

AudioBuffer
This interface represents a memory-resident audio asset, primarily for one-shot sounds and other short audio clips. Its format is non-interleaved IEEE 32-bit linear PCM with a nominal range of −1 to +1. It can contain one or more channels.
AudioListener
This interface represents the position and orientation of the person listening to the audio scene. All PannerNode objects spatialize in relation to the AudioContext's listener.
ChannelMergerNode
Represents an AudioNode for combining channels from multiple audio streams into a single audio stream. Often used in conjunction with ChannelSplitterNode (see the sketch after this list).
ChannelSplitterNode
Represents an AudioNode for accessing the individual channels of an audio stream in the routing graph. Often used in conjunction with ChannelMergerNode.
ConvolverNode
This interface represents a processing node which applies a linear convolution effect given an impulse response.
GainNode
Changing the gain of an audio signal is a fundamental operation in audio applications. The GainNode is one of the building blocks for creating mixers. This interface is an AudioNode with a single input and single output, which multiplies the input audio signal by the (possibly time-varying) gain attribute, copying the result to the output.
OscillatorNode
OscillatorNode represents an audio source generating a periodic waveform. It can be set to a few commonly used waveforms. Additionally, it can be set to an arbitrary periodic waveform through the use of a WaveTable object. Oscillators are common foundational building blocks in audio synthesis.
WaveTable
WaveTable represents an arbitrary periodic waveform to be used with an OscillatorNode.
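
These interfaces are designed to be wired together into a processing graph. As a rough illustration (a sketch, not text from the specification), the following example attenuates only the left channel of a stereo clip: a ChannelSplitterNode separates the channels, a GainNode scales the left one, and a ChannelMergerNode recombines them. The variable stereoBuffer is a placeholder for an AudioBuffer you have already decoded.

    // Sketch: attenuate only the left channel of a stereo clip.
    var context = new AudioContext();

    var source = context.createBufferSource();
    source.buffer = stereoBuffer;                     // placeholder: a decoded stereo AudioBuffer

    var splitter = context.createChannelSplitter(2);  // stereo stream -> two mono outputs
    var leftGain = context.createGain();
    leftGain.gain.value = 0.5;                        // halve the left channel
    var merger = context.createChannelMerger(2);      // two mono inputs -> one stereo stream

    source.connect(splitter);
    splitter.connect(leftGain, 0);       // splitter output 0 (left) -> gain
    leftGain.connect(merger, 0, 0);      // gain -> merger input 0 (left)
    splitter.connect(merger, 1, 1);      // right channel passes through unchanged
    merger.connect(context.destination);

    source.start();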

Usage

This specification describes a high-level JavaScript API for processing and synthesizing audio in web applications. The primary paradigm is that of an audio routing graph, where a number of AudioNode objects are connected together to define the overall audio rendering. The actual processing will primarily take place in the underlying implementation (typically optimized Assembly / C / C++ code), but direct JavaScript processing and synthesis is also supported.
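
For illustration, here is a minimal sketch of such a routing graph: an OscillatorNode feeding a GainNode, which in turn feeds the AudioContext's destination. The node types are defined by the API; the specific frequency and gain values are arbitrary.

    var context = new AudioContext();

    var oscillator = context.createOscillator();
    oscillator.type = 'sine';
    oscillator.frequency.value = 440;       // A above middle C

    var gain = context.createGain();
    gain.gain.value = 0.25;                 // the gain attribute may also be automated over time

    // Build the graph: oscillator -> gain -> destination (the speakers).
    oscillator.connect(gain);
    gain.connect(context.destination);

    oscillator.start();
    oscillator.stop(context.currentTime + 2);   // play for two seconds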

This API is designed to be used in conjunction with other APIs and elements on the web platform, notably XMLHttpRequest (using its responseType and response attributes). For games and interactive applications, it is anticipated to be used with the canvas 2D and WebGL 3D graphics APIs.
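
As a sketch of that pattern, the following example fetches a short clip with XMLHttpRequest, decodes it with decodeAudioData, and plays it through an AudioBufferSourceNode. The URL clip.wav is a placeholder.

    var audioContext = new AudioContext();

    var request = new XMLHttpRequest();
    request.open('GET', 'clip.wav', true);      // placeholder URL
    request.responseType = 'arraybuffer';       // receive the file as binary data

    request.onload = function () {
        audioContext.decodeAudioData(request.response, function (decodedBuffer) {
            var source = audioContext.createBufferSource();
            source.buffer = decodedBuffer;
            source.connect(audioContext.destination);
            source.start();
        }, function (error) {
            console.error('decodeAudioData failed', error);
        });
    };

    request.send();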

See also

Related articles

Audio