How to Think About the Node Graph

In a real-world recording studio, you route audio signals by connecting microphones and other sound sources to a sound mixer. The sound mixer is configured with its own routing scheme, which allows access to equalizers, dynamics processors, and other effects. The Web Audio API node graph is designed to mirror the characteristics of a real-world sound mixer. This is done by connecting input sources, such as oscillators and audio buffers, to other objects that manipulate the sonic characteristics of those input sources in some way. The various objects (including the input sources) that make up the signal chain are called nodes, and they are connected to one another using a method named connect(). You can think of connect() as a virtual audio cable used to chain the output of one node to the input of another node. The final endpoint connection for any Web Audio application is always going to be the audioContext.destination. You can think of audioContext.destination as the speakers of your application. This collection of connections is what is referred to as the node graph, shown in the figure below.
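As a minimal sketch of the idea above, the following code wires an oscillator (an input source) through a gain node (a processing node) to the audioContext.destination using connect(). The helper function buildGraph and the gain value of 0.5 are illustrative choices, not part of the original text; the snippet assumes a browser environment where AudioContext is available.

```javascript
// Build a simple node graph: oscillator -> gain -> destination.
function buildGraph(audioContext) {
  const osc = audioContext.createOscillator(); // input source node
  const gain = audioContext.createGain();      // processing node
  gain.gain.value = 0.5;                       // halve the volume (illustrative)

  osc.connect(gain);                           // "virtual audio cable" #1
  gain.connect(audioContext.destination);      // final endpoint: the "speakers"

  return { osc, gain };
}

// In a browser, you would start the sound like this:
if (typeof AudioContext !== "undefined") {
  const ctx = new AudioContext();
  const { osc } = buildGraph(ctx);
  osc.start(); // begin producing sound through the graph
}
```

Note that every chain here terminates at audioContext.destination; nodes that are never connected (directly or indirectly) to the destination produce no audible output.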