OfflineContext

Wrapper around the native OfflineAudioContext. Instead of playing through the device's output, it renders audio faster than real time into a buffer.


// generate a single channel, 0.5 second buffer
const context = new Tone.OfflineContext(1, 0.5, 44100);
const osc = new Tone.Oscillator({ context });
// connect and start the oscillator so the rendered buffer is not silent
osc.toDestination().start(0);
context.render().then(buffer => {
	console.log(buffer.numberOfChannels, buffer.duration);
});

Hierarchy

Context → OfflineContext

Constructor

new OfflineContext (
channels:number ,

The number of channels to render

duration:Seconds ,

The duration to render in seconds

sampleRate:number

the sample rate to render at

) => OfflineContext
new OfflineContext (
context:OfflineAudioContext
) => OfflineContext

Properties

clockSource #

TickerClockSource

What the source of the clock is, either "worker" (default), "timeout", or "offline" (none).

currentTime #

readonly Seconds

Same as this.now()

debug #

boolean

Set this debug flag to log all events that happen in this class.

destination #

Destination

A reference to the Context's destination node.

disposed #

readonly boolean

Indicates if the instance was disposed. 'Disposing' an instance means that all of the Web Audio nodes that were created for the instance are disconnected and freed for garbage collection.

draw #

Draw

This is the Draw object for the context which is useful for synchronizing the draw frame with the Tone.js clock.

latencyHint #

readonly ContextLatencyHint | Seconds

The type of playback, which affects tradeoffs between audio output latency and responsiveness. In addition to a value in seconds, latencyHint accepts the strings "interactive" (prioritizes low latency), "playback" (prioritizes sustained playback), and "balanced" (balances latency and performance).


// prioritize sustained playback
const context = new Tone.Context({ latencyHint: "playback" });
// set this context as the global Context
Tone.setContext(context);
// the global context is gettable with Tone.getContext()
console.log(Tone.getContext().latencyHint);

listener #

Listener

The listener

lookAhead #

Seconds

The amount of time into the future events are scheduled. Giving Web Audio a short amount of time into the future to schedule events can reduce clicks and improve performance. This value can be set to 0 to get the lowest latency.

rawContext #

readonly AnyAudioContext

The unwrapped AudioContext or OfflineAudioContext

sampleRate #

readonly number

The sample rate, in samples per second, of the AudioContext.

state #

readonly AudioContextState

The current state of the AudioContext, either "suspended", "running", or "closed".

transport #

Transport

There is only one Transport per Context. It is created on initialization.

updateInterval #

Seconds

How often the interval callback is invoked. This number corresponds to how responsive the scheduling can be. context.updateInterval + context.lookAhead gives you the total latency between scheduling an event and hearing it.
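
The latency arithmetic described above can be sketched in plain JavaScript. The numbers below are illustrative assumptions, not guaranteed Tone.js defaults:

```javascript
// Sketch of the scheduling-latency arithmetic described above.
// The values are illustrative assumptions, not Tone.js defaults.
const lookAhead = 0.1;       // seconds of events scheduled ahead of time
const updateInterval = 0.05; // seconds between clock callback invocations

// Worst-case delay between scheduling an event and hearing it:
const totalLatency = updateInterval + lookAhead;
console.log(totalLatency.toFixed(2)); // "0.15"
```

Lowering either value makes scheduling more responsive at the cost of more frequent callbacks or a higher risk of audible glitches.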

static version #

string

The version number, in semver format.

Methods

addAudioWorkletModule #

Add an AudioWorkletProcessor module

addAudioWorkletModule (
url:string ,

The url of the module

name:string

The name of the module

) => Promise<void >

clearInterval #

Clear the function scheduled by setInterval

clearInterval (
id:number
) => this

clearTimeout #

Clears a previously scheduled timeout with Tone.context.setTimeout

clearTimeout (
id:number

The ID returned from setTimeout

) => this

close #

Close the context

close ( ) => Promise<void >

createAnalyser #

createAnalyser ( ) => AnalyserNode

createAudioWorkletNode #

Create an audio worklet node from a name and options. The module must first be loaded using addAudioWorkletModule.

createAudioWorkletNode (
name:string ,
options?:Partial<AudioWorkletNodeOptions >
) => AudioWorkletNode

createBiquadFilter #

createBiquadFilter ( ) => BiquadFilterNode

createBuffer #

createBuffer (
numberOfChannels:number ,
length:number ,
sampleRate:number
) => AudioBuffer
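
Note that the `length` argument counts sample frames, not seconds. A small sketch of the conversion (`secondsToFrames` is a hypothetical helper, not part of Tone.js):

```javascript
// createBuffer's length argument counts sample frames, not seconds.
// secondsToFrames is a hypothetical helper, not part of Tone.js.
function secondsToFrames(duration, sampleRate) {
  return Math.round(duration * sampleRate);
}

const frames = secondsToFrames(0.5, 44100);
console.log(frames); // 22050
// e.g. context.createBuffer(1, frames, 44100) gives a 0.5 second mono buffer
```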

createBufferSource #

createBufferSource ( ) => AudioBufferSourceNode

createChannelMerger #

createChannelMerger (
numberOfInputs?:number | undefined
) => ChannelMergerNode

createChannelSplitter #

createChannelSplitter (
numberOfOutputs?:number | undefined
) => ChannelSplitterNode

createConstantSource #

createConstantSource ( ) => ConstantSourceNode

createConvolver #

createConvolver ( ) => ConvolverNode

createDelay #

createDelay (
maxDelayTime?:number | undefined
) => DelayNode

createDynamicsCompressor #

createDynamicsCompressor ( ) => DynamicsCompressorNode

createGain #

createGain ( ) => GainNode

createIIRFilter #

createIIRFilter (
feedForward:number [] | Float32Array ,
feedback:number [] | Float32Array
) => IIRFilterNode

createMediaStreamDestination #

createMediaStreamDestination ( ) => MediaStreamAudioDestinationNode

createMediaStreamSource #

createMediaStreamSource (
stream:MediaStream
) => MediaStreamAudioSourceNode

createOscillator #

createOscillator ( ) => OscillatorNode

createPanner #

createPanner ( ) => PannerNode

createPeriodicWave #

createPeriodicWave (
real:number [] | Float32Array ,
imag:number [] | Float32Array ,
constraints?:PeriodicWaveConstraints | undefined
) => PeriodicWave

createStereoPanner #

createStereoPanner ( ) => StereoPannerNode

createWaveShaper #

createWaveShaper ( ) => WaveShaperNode

decodeAudioData #

decodeAudioData (
audioData:ArrayBuffer
) => Promise<AudioBuffer >

dispose #

Clean up. Also closes the audio context.

dispose ( ) => this

emit #

Invoke all of the callbacks bound to the event with any arguments passed in.

emit (
event:any ,

The name of the event.

...args:any []

The arguments to pass to the functions listening.

) => this

getConstant #

Internal. Generates a looped buffer that outputs a constant value.

getConstant (
val:number
) => AudioBufferSourceNode

static getDefaults #

Returns all of the default options belonging to the class.

getDefaults ( ) => ContextOptions

immediate #

The current audio context time without the lookAhead. In most cases it is better to use now instead of immediate, since now applies the lookAhead equally to all components (including internal ones), ensuring that everything is scheduled in sync. Mixing now and immediate can cause timing issues. If no lookAhead is desired, set the lookAhead to 0.

immediate ( ) => Seconds
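
The relationship between now and immediate can be sketched like this (plain JavaScript with an assumed lookAhead value; not the actual implementation):

```javascript
// Sketch of the now()/immediate() relationship described above.
// rawContextTime stands in for the underlying AudioContext's currentTime;
// the lookAhead value is an assumption for illustration.
const lookAhead = 0.1;
const rawContextTime = 2.0;

const immediate = () => rawContextTime;       // no lookAhead applied
const now = () => rawContextTime + lookAhead; // lookAhead applied

console.log(immediate()); // 2
console.log(now());       // 2.1
```

Scheduling everything against now keeps internal and user-scheduled events offset by the same amount.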

static mixin #

Add Emitter functions (on/off/emit) to the object

mixin (
constr:any
) => void

now #

Override the now method to point to the internal clock time

now ( ) => Seconds

off #

Remove the event listener.

off (
event:"statechange" | "tick" ,

The event to stop listening to.

callback?:undefined | (args) => void

The callback which was bound to the event with Emitter.on. If no callback is given, all callbacks bound to the event are removed.

) => this

on #

Bind a callback to a specific event.

on (
event:"statechange" | "tick" ,

The name of the event to listen for.

callback:(args) => void

The callback to invoke when the event is emitted

) => this

once #

Bind a callback which is only invoked once

once (
event:"statechange" | "tick" ,

The name of the event to listen for.

callback:(args) => void

The callback to invoke when the event is emitted

) => this
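
The on/once/off/emit methods follow the standard emitter pattern. A minimal self-contained sketch of that pattern (illustrative only, not the Tone.js Emitter source):

```javascript
// Minimal sketch of the emitter pattern behind on/once/off/emit.
class MiniEmitter {
  constructor() { this._events = {}; }
  on(event, callback) {
    (this._events[event] = this._events[event] || []).push(callback);
    return this;
  }
  once(event, callback) {
    // wrap the callback so it unregisters itself on first invocation
    const wrapped = (...args) => { this.off(event, wrapped); callback(...args); };
    return this.on(event, wrapped);
  }
  off(event, callback) {
    const list = this._events[event] || [];
    // with no callback given, remove every listener for the event
    this._events[event] = callback ? list.filter(cb => cb !== callback) : [];
    return this;
  }
  emit(event, ...args) {
    // iterate a copy so listeners can unregister during emission
    (this._events[event] || []).slice().forEach(cb => cb(...args));
    return this;
  }
}

const emitter = new MiniEmitter();
let count = 0;
emitter.once("statechange", () => count++);
emitter.emit("statechange").emit("statechange");
console.log(count); // 1 — the once() listener only fired once
```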

render #

Render the output of the OfflineContext

render (
asynchronous:boolean = true

Whether the clock should be rendered asynchronously, which does not block the main thread but is slightly slower.

) => Promise<ToneAudioBuffer >

resume #

Starts the audio context from a suspended state. This is required to initially start the AudioContext. See Tone.start.

resume ( ) => Promise<void >

setInterval #

Adds a repeating event to the context's callback clock

setInterval (
fn:(args) => void ,
interval:Seconds
) => number

setTimeout #

A setTimeout which is guaranteed by the clock source. Also runs in the offline context.

setTimeout (
fn:(args) => void ,

The callback to invoke

timeout:Seconds

The timeout in seconds

) => number
ID to use when invoking Context.clearTimeout
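
The ID-based bookkeeping these two methods imply can be sketched in plain JavaScript (a simplified model, not the actual Tone.js clock implementation):

```javascript
// Sketch of ID-based timeout bookkeeping like setTimeout/clearTimeout.
// Simplified model only; not the actual Tone.js clock implementation.
let nextId = 0;
const timeouts = new Map(); // id -> { fn, time }

function schedule(fn, timeout, currentTime) {
  const id = nextId++;
  timeouts.set(id, { fn, time: currentTime + timeout });
  return id; // the ID later passed to clear()
}
function clear(id) {
  timeouts.delete(id);
}
function tick(currentTime) {
  // invoke and remove every timeout whose time has been reached
  for (const [id, event] of timeouts) {
    if (event.time <= currentTime) { timeouts.delete(id); event.fn(); }
  }
}

const fired = [];
const a = schedule(() => fired.push("a"), 0.5, 0);
schedule(() => fired.push("b"), 1.0, 0);
clear(a); // "a" never fires
tick(2);  // the clock advances past both scheduled times
console.log(fired); // ["b"]
```

Because the clock source drives `tick`, the same mechanism works both in real time and when rendering offline.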

toString #

Convert the class to a string


const osc = new Tone.Oscillator();
console.log(osc.toString());

toString ( ) => string