
AudioContext

Summary

The AudioContext represents a set of AudioNode objects and their connections. It allows for arbitrary routing of signals to the AudioDestinationNode (what the user ultimately hears). Nodes are created from the context and are then connected together. In most use cases, only a single AudioContext is used per document.
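A minimal sketch of the pattern described above, assuming a browser that exposes an unprefixed AudioContext constructor (older implementations used webkitAudioContext); the node and variable names are illustrative:

  var context = new AudioContext();            // typically one per document
  var oscillator = context.createOscillator(); // a source node created from the context
  var gain = context.createGain();             // an intermediate processing node

  oscillator.connect(gain);                    // source -> gain
  gain.connect(context.destination);           // gain -> what the user hears
  oscillator.start(0);                         // begin playback immediately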

Properties

activeSourceCount
The number of AudioBufferSourceNodes that are currently playing. Out of date; removed from spec. See http://webaudio.github.io/web-audio-api/.
currentTime
This is a time in seconds that starts at zero when the context is created and increases in real time. All scheduled times are relative to it (see the sketch after this list). This is not a transport time that can be started, paused, and repositioned; it is always moving forward. A GarageBand-like timeline transport system can easily be built on top of this in JavaScript. This time corresponds to an ever-increasing hardware timestamp.
destination
An AudioDestinationNode with a single input representing the final destination for all audio (to be rendered to the audio hardware, i.e., speakers). All AudioNodes actively rendering audio will directly or indirectly connect to the destination node.
listener
An AudioListener, used for 3D spatialization.
sampleRate
The sample rate, in sample-frames per second, at which the AudioContext handles audio. All AudioNodes in the context are assumed to run at this rate; because of this assumption, sample-rate converters and varispeed processors are not supported in real-time processing.
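A short sketch showing the properties above in use, again assuming an unprefixed AudioContext constructor; the logged values are illustrative:

  var context = new AudioContext();
  console.log(context.sampleRate);    // e.g. 44100 sample-frames per second
  console.log(context.currentTime);   // seconds elapsed since the context was created

  var oscillator = context.createOscillator();
  oscillator.connect(context.destination);     // destination is the final output

  // All scheduling is expressed relative to currentTime:
  oscillator.start(context.currentTime + 1);   // start in one second
  oscillator.stop(context.currentTime + 2);    // stop one second later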

Methods

createAnalyser

Creates an AnalyserNode.

createBiquadFilter

Creates a BiquadFilterNode representing a second order filter which can be configured as one of several common filter types.
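A sketch of configuring one of those filter types, assuming the string-valued type attribute (early implementations used numeric constants) and a previously created source node named source:

  var filter = context.createBiquadFilter();
  filter.type = 'lowpass';         // one of the common filter types
  filter.frequency.value = 1000;   // cutoff frequency in Hz
  filter.Q.value = 1;              // resonance

  source.connect(filter);          // source is any previously created AudioNode
  filter.connect(context.destination);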

createBuffer

Creates an AudioBuffer of the given size. The audio data in the buffer will be zero-initialized (silent). An exception will be thrown if numberOfChannels or sampleRate is out of bounds.
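For example, a sketch that creates a one-second, two-channel buffer at the context's own sample rate and fills one channel with noise (the buffer starts out silent):

  // Arguments: numberOfChannels, length in sample-frames, sampleRate
  var buffer = context.createBuffer(2, context.sampleRate, context.sampleRate);

  // The data is zero-initialized; overwrite channel 0 with white noise.
  var channelData = buffer.getChannelData(0);
  for (var i = 0; i < channelData.length; i++) {
    channelData[i] = Math.random() * 2 - 1;
  }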

createBufferSource

Creates an AudioBufferSourceNode that can be used to play audio data contained within an AudioBuffer object…
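A sketch of playing such a buffer, assuming buffer is an AudioBuffer obtained from createBuffer() or decodeAudioData():

  var source = context.createBufferSource();
  source.buffer = buffer;                  // the AudioBuffer to play
  source.connect(context.destination);
  source.start(0);                         // older implementations used noteOn(0)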

createChannelMerger

Creates a ChannelMergerNode representing a channel merger. An exception will be thrown for invalid parameter values.

createChannelSplitter

Creates a ChannelSplitterNode representing a channel splitter. An exception will be thrown for invalid parameter values.

createConvolver

Creates a ConvolverNode, commonly used to add reverb to audio.

createDelay

Creates a DelayNode representing a variable delay line. Default delay is 0 seconds.

createDynamicsCompressor

Creates a DynamicsCompressorNode, used to apply compression to audio.

createGain

Creates a GainNode, used to control the volume of audio.
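For instance, a sketch that attenuates a source and schedules a fade-out against currentTime; source stands for any previously created source node:

  var gain = context.createGain();
  gain.gain.value = 0.5;                   // halve the volume

  source.connect(gain);
  gain.connect(context.destination);

  // Fade to silence over two seconds, scheduled relative to currentTime.
  gain.gain.setValueAtTime(0.5, context.currentTime);
  gain.gain.linearRampToValueAtTime(0, context.currentTime + 2);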

createMediaElementSource

Creates a MediaElementAudioSourceNode, given an HTMLMediaElement. As a consequence of calling this method, audio playback from the HTMLMediaElement will be re-routed into the processing graph of the AudioContext.
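A sketch of routing an existing audio element through the graph; the element selector is illustrative:

  var audioElement = document.querySelector('audio');
  var mediaSource = context.createMediaElementSource(audioElement);

  // The element's output now flows through the graph rather than straight
  // to the speakers, so it must be connected to be heard.
  mediaSource.connect(context.destination);
  audioElement.play();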

createMediaStreamSource

Creates a MediaStreamAudioSourceNode, given a MediaStream. As a consequence of calling this method, audio playback from the MediaStream will be re-routed into the processing graph of the AudioContext.

createOscillator

Creates an OscillatorNode, a source representing a periodic waveform. It basically generates a constant tone…
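A sketch of generating a one-second tone, assuming the string-valued type attribute (early implementations used numeric constants and noteOn/noteOff):

  var oscillator = context.createOscillator();
  oscillator.type = 'sine';                  // also 'square', 'sawtooth', 'triangle'
  oscillator.frequency.value = 440;          // A4, in Hz
  oscillator.connect(context.destination);

  oscillator.start(0);
  oscillator.stop(context.currentTime + 1);  // play for one second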

createPanner

Creates a PannerNode, used to spatialize an incoming audio stream in 3D space…

createScriptProcessor

Creates a ScriptProcessorNode for direct audio processing using JavaScript. An exception will be thrown if bufferSize, numberOfInputChannels, or numberOfOutputChannels is outside the valid range.
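A sketch of processing samples directly in JavaScript; the buffer size and channel counts are illustrative, and source stands for any previously created node:

  // Arguments: bufferSize, numberOfInputChannels, numberOfOutputChannels
  var processor = context.createScriptProcessor(4096, 1, 1);

  processor.onaudioprocess = function (event) {
    var input = event.inputBuffer.getChannelData(0);
    var output = event.outputBuffer.getChannelData(0);
    for (var i = 0; i < input.length; i++) {
      output[i] = input[i] * 0.5;            // simple attenuation
    }
  };

  source.connect(processor);
  processor.connect(context.destination);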

createWaveShaper

Creates a WaveShaperNode, used to apply a distortion effect to audio.

createWaveTable

Creates a WaveTable representing a waveform containing arbitrary harmonic content. The real and imag parameters must be Float32Arrays of equal length, greater than zero and less than or equal to 4096, or an exception will be thrown. These parameters specify the Fourier coefficients of a Fourier series representing the partials of a periodic waveform. The created WaveTable will be used with an OscillatorNode and will represent a normalized time-domain waveform with a maximum absolute peak value of 1. Another way of saying this is that the generated waveform of an OscillatorNode will have its maximum peak value at 0 dBFS, which conveniently corresponds to the full range of signal values used by the Web Audio API. Because the WaveTable is normalized on creation, the real and imag parameters represent relative values.

Out of date; removed from spec. See http://webaudio.github.io/web-audio-api/.

decodeAudioData

Asynchronously decodes the audio file data contained in the ArrayBuffer. The ArrayBuffer can, for example, be loaded from an XMLHttpRequest with the new responseType and response attributes. Audio file data can be in any of the formats supported by the audio element.

The decodeAudioData() method is preferred over the form of createBuffer() that takes an ArrayBuffer, because it is asynchronous and does not block the main JavaScript thread.
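A sketch of the pattern described above; the URL is a placeholder, and the callback form shown here matches the original specification (newer drafts also return a promise):

  var request = new XMLHttpRequest();
  request.open('GET', 'sound.mp3', true);    // placeholder URL
  request.responseType = 'arraybuffer';      // the responseType mentioned above

  request.onload = function () {
    context.decodeAudioData(request.response, function (decodedBuffer) {
      var source = context.createBufferSource();
      source.buffer = decodedBuffer;
      source.connect(context.destination);
      source.start(0);
    }, function () {
      // decoding failed
    });
  };
  request.send();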

Events

No events.

Related specifications

W3C Web Audio API
W3C Editor’s Draft