p5 is a great library for graphics and sound on the web. RNBO is a patching environment in Max whose patches can be exported to the web. They are both great in their own way, but what if we want to combine them? Say you have your sound elements set up directly in p5 but want to add some effects on top from RNBO.
This is what we’re going to look at in this write-up. We’ll design a simple sketch in p5 that also emits sound using the p5.sound library, and then we’ll run that sound through a reverb from one of the RNBO Guitar Pedals patches. Here’s the end result:
https://p5rnbo.superblob.studio/
Setting up p5
First things first, let’s set up our sketch and include p5. I usually prefer simple setups for creative coding (so no Webpack, no import statements, and so on).
For this, we’ll need to grab the links to the libraries we want to include. I usually search on Google or cdnjs for these. Once you’ve found the links for the libraries we need, include them in your index.html file like this:
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<script type="text/javascript" src="https://cdn.jsdelivr.net/npm/p5@1.7.0/lib/p5.js"></script>
<script type="text/javascript" src="https://cdnjs.cloudflare.com/ajax/libs/p5.js/1.8.0/addons/p5.sound.js"></script>
<script type="text/javascript" src="https://cdn.cycling74.com/rnbo/latest/rnbo.min.js"></script>
<script type="text/javascript" src="script.js"></script>
<title>p5 & RNBO</title>
</head>
<body></body>
</html>
Great, let’s then test that we can draw something with p5. Create a script.js file in the same directory and add the following:
let canvas, w, h

// this gets called once during initialization
function setup() {
  w = 500 // 500px width
  h = 500 // 500px height
  // create a canvas for drawing, with dimensions 500x500px
  canvas = createCanvas(w, h)
  // make the background of the canvas yellow
  background('yellow')
  // fill any shapes that you draw on screen with red
  fill('red')
  // don't add any strokes or outlines to shapes
  // by default they have a black stroke
  noStroke()
  // draw an ellipse at position x: 100, y: 100, with a diameter of 50px
  ellipse(100, 100, 50)
}

// this gets called every render frame
// (which is usually 60 times per second)
function draw() {
}
I’ve added some explanations in the code above as to what each line does, but the core workflow of p5 revolves around two functions:
setup: this is where you set initial conditions and deal with things that don’t change over time
draw: this is where you deal with things that are dynamic and can change over time. This function gets called every render frame (usually 60 times per second, unless you set a different frame rate)
The above code should get you a static output looking like this. I put my drawing code under setup() because everything is static, for now.
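If you’d like a feel for what draw() is for before we move on, here’s a tiny standalone sketch (separate from our project, purely illustrative) where the ellipse is animated every frame:

function setup() {
  createCanvas(500, 500)
  fill('red')
  noStroke()
}

function draw() {
  background('yellow')
  // frameCount increases by 1 every frame, so the
  // ellipse drifts across the canvas over time
  ellipse(frameCount % width, 250, 50)
}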
To view this in a browser you need to create a local server or use something like https://glitch.com/ for hosting.
For creating a local server I recommend getting Visual Studio Code, which is a free code editor, and then installing the following extension: Live Server.
Once installed, a “Go Live” option should be displayed in the bottom right corner of the editor. Clicking on this will launch the local server that serves the code in that directory - the text will change to display the port used.
Open a browser and enter the URL localhost:port and it will open your sketch. So, if your port is 5500, go to localhost:5500.
Adding sound
Let’s add some sound and create a synth in p5 that we can trigger when the mouse is pressed. First, we need to get the p5 audio context, and since browsers only allow audio to start after a user interaction, we need one of those too.
For the user interaction, let’s add a button - we can do this via HTML, or we can do it in p5 as well. We’ll also create a new variable, sketchStarted, to keep track of when the audio has started, so we don’t trigger anything before that.
The p5 audio context is retrieved in the setup() function. If you’re not familiar with the language of p5, their documentation is great - I always search there if I need to find something.
let canvas, w, h, sketchStarted = false, context, startButton

// this gets called once during initialization
function setup() {
  w = 500 // 500px width
  h = 500 // 500px height
  // create a canvas for drawing, with dimensions 500x500px
  canvas = createCanvas(w, h)
  // make the background of the canvas yellow
  background('yellow')
  // fill any shapes that you draw on screen with red
  fill('red')
  // don't add any strokes/outlines to shapes
  // by default they have a black stroke
  noStroke()
  // draw an ellipse at position x: 100, y: 100, with a diameter of 50px
  ellipse(100, 100, 50)
  // create button - the text inside the function call
  // is the text displayed on screen
  startButton = createButton('Start Sketch')
  // position the button at the center of the screen
  startButton.position(w/2, h/2)
  // tell the button what function to call when it is pressed
  startButton.mousePressed(resumeAudio)
  context = getAudioContext() // get p5 audio context
}

// function that will be called when startButton is pressed
function resumeAudio() {
  sketchStarted = true // audio is now started
  // change CSS of button to hide it
  // since we don't need it anymore
  startButton.style('opacity', '0')
  // resume the audio context if it's not running already
  if (context.state !== 'running') {
    context.resume()
  }
}

// this gets called every render frame
// (which is usually 60 times per second)
function draw() {
}
Cool, so now we’re ready to add some sound. Let’s create a synth and trigger it when the mouse is pressed. Notice the new variable at the top to track the synth and the creation of the synth at the end of the setup() function.
p5 has a built-in function, mousePressed(), that is called whenever the mouse is pressed. I added an if statement to check that the audio has started, so we don’t trigger the synth before that.
let canvas, w, h, sketchStarted = false, context, synth, startButton

// this gets called once during initialization
function setup() {
  w = 500 // 500px width
  h = 500 // 500px height
  // create a canvas for drawing, with dimensions 500x500px
  canvas = createCanvas(w, h)
  // make the background of the canvas yellow
  background('yellow')
  // fill any shapes that you draw on screen with red
  fill('red')
  // don't add any strokes/outlines to shapes
  // by default they have a black stroke
  noStroke()
  // draw an ellipse at position x: 100, y: 100, with a diameter of 50px
  ellipse(100, 100, 50)
  // create button - the text inside the function call
  // is the text displayed on screen
  startButton = createButton('Start Sketch')
  // position the button at the center of the screen
  startButton.position(w/2, h/2)
  // tell the button what function to call when it is pressed
  startButton.mousePressed(resumeAudio)
  context = getAudioContext() // get p5 audio context
  synth = new p5.MonoSynth() // create a synth
  synth.setADSR(10, 1, 1, 5) // set an envelope
  synth.amp(0.1) // set a lower amplitude to be careful with volumes
}

// built-in p5 function that is called when the mouse is pressed
function mousePressed() {
  // check that the audio is started
  if (sketchStarted == true) {
    // trigger the synth at 400 Hz, with 90 velocity,
    // right now, for 0.1 seconds
    // the duration gets compounded with the envelope
    synth.play(400, 90, 0, 0.1)
  }
}

// function that will be called when startButton is pressed
function resumeAudio() {
  sketchStarted = true // audio is now started
  // change CSS of button to hide it
  // since we don't need it anymore
  startButton.style('opacity', '0')
  // resume the audio context if it's not running already
  if (context.state !== 'running') {
    context.resume()
  }
}

// this gets called every render frame
// (which is usually 60 times per second)
function draw() {
}
Alright, we have some sounds coming in right now. They’re at a single frequency though, so let’s make the frequency dependent on the X position of the mouse.
For this, we need to map the X-coordinate of the mouse press to a MIDI note (a value between 0 and 127). We know that the sketch is 500px wide, so the X-coordinate can be anywhere between 0 and 500 - we’ll use p5’s map() function to create this mapping.
We’ll also need to convert the note to a frequency in Hz, since that’s what the synth expects in the play() call. p5 has a built-in function for that too: midiToFreq().
Also, for the note mapping, I am not going to use the full range, but something like 12 to 108, which runs from C0 to C8.
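As a quick sanity check of the mapping (assuming the 500px-wide canvas), a click right in the middle lands on middle C:

// mouseX = 250, halfway across a 500px-wide canvas
let note = map(250, 0, 500, 12, 108) // -> 60, i.e. C4 (middle C)
let freq = midiToFreq(note)          // -> ~261.63 Hz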
let canvas, w, h, sketchStarted = false, context, synth, startButton

// this gets called once during initialization
function setup() {
  w = 500 // 500px width
  h = 500 // 500px height
  // create a canvas for drawing, with dimensions 500x500px
  canvas = createCanvas(w, h)
  // make the background of the canvas yellow
  background('yellow')
  // fill any shapes that you draw on screen with red
  fill('red')
  // don't add any strokes/outlines to shapes
  // by default they have a black stroke
  noStroke()
  // draw an ellipse at position x: 100, y: 100, with a diameter of 50px
  ellipse(100, 100, 50)
  // create button - the text inside the function call
  // is the text displayed on screen
  startButton = createButton('Start Sketch')
  // position the button at the center of the screen
  startButton.position(w/2, h/2)
  // tell the button what function to call when it is pressed
  startButton.mousePressed(resumeAudio)
  context = getAudioContext() // get p5 audio context
  synth = new p5.MonoSynth() // create a synth
  synth.setADSR(10, 1, 1, 5) // set an envelope
  synth.amp(0.1) // set a lower amplitude to be careful with volumes
}

// built-in p5 function that is called when the mouse is pressed
function mousePressed() {
  // check that the audio is started
  if (sketchStarted == true) {
    // mouseX gets the X-coordinate of the mouse press
    // and map() scales the value from the range 0 - 500
    // to 12 (C0) - 108 (C8)
    let note = map(mouseX, 0, w, 12, 108)
    // play the note above, with 90 velocity,
    // right now, for 0.1 seconds
    // the duration gets compounded with the envelope
    synth.play(midiToFreq(note), 90, 0, 0.1)
  }
}

// function that will be called when startButton is pressed
function resumeAudio() {
  sketchStarted = true // audio is now started
  // change CSS of button to hide it
  // since we don't need it anymore
  startButton.style('opacity', '0')
  // resume the audio context if it's not running already
  if (context.state !== 'running') {
    context.resume()
  }
}

// this gets called every render frame
// (which is usually 60 times per second)
function draw() {
}
If you want this to work on the browser’s full width and height, replace the hardcoded numbers we gave to the w and h variables with the innerWidth and innerHeight properties of the window object. Since the note mapping uses w, this also gives you a larger area of X-coordinates to play with.
w = window.innerWidth // width of the browser window
h = window.innerHeight // height of the browser window
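Optionally, if you also want the canvas to follow the window when it is resized, p5 provides a windowResized() hook. A minimal sketch of that (not required for the rest of this write-up) could look like this:

// optional: keep the canvas in sync with the browser window
function windowResized() {
  w = window.innerWidth
  h = window.innerHeight
  resizeCanvas(w, h)
  background('yellow') // repaint the background after resizing
}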
Adding some new visuals
Instead of drawing that ellipse in setup(), we could draw one on every mouse press, at the position of the press. To make things fun, let’s give each ellipse a random size. Remove the ellipse line from setup() and add the ellipse-drawing code inside mousePressed():
let canvas, w, h, sketchStarted = false, context, synth, startButton

// this gets called once during initialization
function setup() {
  w = window.innerWidth // width of the browser window
  h = window.innerHeight // height of the browser window
  // create a canvas for drawing that covers the browser window
  canvas = createCanvas(w, h)
  // make the background of the canvas yellow
  background('yellow')
  // fill any shapes that you draw on screen with red
  fill('red')
  // don't add any strokes/outlines to shapes
  // by default they have a black stroke
  noStroke()
  // create button - the text inside the function call
  // is the text displayed on screen
  startButton = createButton('Start Sketch')
  // position the button at the center of the screen
  startButton.position(w/2, h/2)
  // tell the button what function to call when it is pressed
  startButton.mousePressed(resumeAudio)
  context = getAudioContext() // get p5 audio context
  synth = new p5.MonoSynth() // create a synth
  synth.setADSR(10, 1, 1, 5) // set an envelope
  synth.amp(0.1) // set a lower amplitude to be careful with volumes
}

// built-in p5 function that is called when the mouse is pressed
function mousePressed() {
  // check that the audio is started
  if (sketchStarted == true) {
    // mouseX gets the X-coordinate of the mouse press
    // and map() scales the value from the range 0 - w
    // to 12 (C0) - 108 (C8)
    let note = map(mouseX, 0, w, 12, 108)
    // play the note above, with 90 velocity,
    // right now, for 0.1 seconds
    // the duration gets compounded with the envelope
    synth.play(midiToFreq(note), 90, 0, 0.1)
    // draw an ellipse at the mouse X and Y coordinates
    // with a random diameter between 0 and 200px
    ellipse(mouseX, mouseY, random(200))
  }
}

// function that will be called when startButton is pressed
function resumeAudio() {
  sketchStarted = true // audio is now started
  // change CSS of button to hide it
  // since we don't need it anymore
  startButton.style('opacity', '0')
  // resume the audio context if it's not running already
  if (context.state !== 'running') {
    context.resume()
  }
}

// this gets called every render frame
// (which is usually 60 times per second)
function draw() {
}
Let’s also put that draw() function to use. We can create a cool effect by re-drawing the background on each frame, but with a very low opacity - this makes existing shapes slowly fade out. Add the following to the draw() function:
function draw() {
  // this will re-draw the background each frame (60 times per second)
  // with a very low opacity
  background('rgba(255, 255, 0, 0.05)')
}
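The same fading effect can also be written with p5’s numeric color arguments, where the alpha channel runs from 0 to 255 instead of 0 to 1:

function draw() {
  // equivalent fading trail using numeric RGBA values
  // (an alpha of 12 out of 255 is roughly 0.05)
  background(255, 255, 0, 12)
}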
Integrating with RNBO
Cool, so now we’ve got something going in p5, but the sound from that sole synth is quite rough. We can add some effects from RNBO to make it more appealing.
Exporting the RNBO Patch
First, install the RNBO Guitar Pedals package in Max. To do this, open the Package Manager (for me it’s the first icon in the left sidebar).
Search for and install “RNBO Guitar Pedals”.
Once installed, click on it and select “Launch” - it will show you the available patches.
I am going to use the “ShimmeRev” patch for this exercise.
Double-click on it and it should open a Max patch that shows how the RNBO patch is used.
We want to get inside the RNBO patch, so double-click again on the yellow block that starts with “rnbo~”. Now we’re inside the RNBO patch and we can export it for the web.
Select the export option on the right and choose Web Export. Set the output directory and hit export (it’s the icon in the bottom right). Note that this patch has no sample dependencies, so we don’t need to “Copy Sample Dependencies”.
Drop the exported files into an /export folder in your website’s directory. Depending on how you set up your development environment, you may have to upload these somewhere (like on glitch.com).
Connecting the RNBO Patch
RNBO has some placeholder code for connecting to the audio context and using it on the web. It looks something like this:
async function rnboSetup() {
  // create an audio context
  const WAContext = window.AudioContext || window.webkitAudioContext
  const context = new WAContext()
  // create the output node
  const outputNode = context.createGain()
  outputNode.connect(context.destination)
  // load your patch and create a device from it
  const response = await fetch("export/yourpatch.export.json")
  const yourPatch = await response.json()
  const yourDevice = await RNBO.createDevice({ context, patcher: yourPatch })
  // connect the device to the output
  yourDevice.node.connect(outputNode)
  context.suspend()
}
Since p5 already has an audio context, we don’t need to create another one - we can pass p5’s context to RNBO instead. Besides the output, we’ll also have two elements in our signal chain that we need to connect: the synth and the reverb patch.
p5 Synth → Reverb Patch → Output
The rnboSetup() function will look something like this now:
async function rnboSetup(context) { // pass in the context from p5
  // create the output node and wire it to the speakers
  const outputNode = context.createGain()
  outputNode.connect(context.destination)
  // load the reverb patch
  const response = await fetch("export/rnbo.shimmerev.json")
  const reverbPatcher = await response.json()
  const reverbDevice = await RNBO.createDevice({ context, patcher: reverbPatcher })
  // establish the signal chain: p5 Synth → Reverb Patch → Output
  // connect the synth to the reverb patch
  synth.connect(reverbDevice.node)
  // connect the reverb patch to the output
  reverbDevice.node.connect(outputNode)
  // keep the context suspended until the user presses Start
  context.suspend()
}
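As a side note, if you later want to tweak the pedal from JavaScript, RNBO devices expose their parameters. The “decay” id below is hypothetical - log the real parameter ids of your export first. Something like this could go inside rnboSetup() after the device is created:

// optional: inspect the RNBO device's parameters
reverbDevice.parameters.forEach((param) => {
  // print each parameter's id and current value
  console.log(param.id, param.value)
})
// "decay" is a made-up id - use one of the ids logged above
const decayParam = reverbDevice.parametersById.get("decay")
if (decayParam) decayParam.value = 0.8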
Our sketch is going to look something like this right now:
let canvas, w, h, sketchStarted = false, context, synth, startButton

// setup RNBO and connect it to the p5 audio context
async function rnboSetup(context) { // pass in the context from p5
  // create the output node and wire it to the speakers
  const outputNode = context.createGain()
  outputNode.connect(context.destination)
  // load the reverb patch
  const response = await fetch("export/rnbo.shimmerev.json")
  const reverbPatcher = await response.json()
  const reverbDevice = await RNBO.createDevice({ context, patcher: reverbPatcher })
  // establish the signal chain: p5 Synth → Reverb Patch → Output
  // connect the synth to the reverb patch
  synth.connect(reverbDevice.node)
  // connect the reverb patch to the output
  reverbDevice.node.connect(outputNode)
  // keep the context suspended until the user presses Start
  context.suspend()
}

// this gets called once during initialization
function setup() {
  w = window.innerWidth // width of the browser window
  h = window.innerHeight // height of the browser window
  // create a canvas for drawing that covers the browser window
  canvas = createCanvas(w, h)
  // make the background of the canvas yellow
  background('yellow')
  // fill any shapes that you draw on screen with red
  fill('red')
  // don't add any strokes/outlines to shapes
  // by default they have a black stroke
  noStroke()
  // create button - the text inside the function call
  // is the text displayed on screen
  startButton = createButton('Start Sketch')
  // position the button at the center of the screen
  startButton.position(w/2, h/2)
  // tell the button what function to call when it is pressed
  startButton.mousePressed(resumeAudio)
  context = getAudioContext() // get p5 audio context
  synth = new p5.MonoSynth() // create a synth
  synth.setADSR(10, 1, 1, 5) // set an envelope
  synth.amp(0.1) // set a lower amplitude to be careful with volumes
  rnboSetup(context) // call the RNBO setup function and pass in the context
}

// built-in p5 function that is called when the mouse is pressed
function mousePressed() {
  // check that the audio is started
  if (sketchStarted == true) {
    // mouseX gets the X-coordinate of the mouse press
    // and map() scales the value from the range 0 - w
    // to 12 (C0) - 108 (C8)
    let note = map(mouseX, 0, w, 12, 108)
    // play the note above, with 90 velocity,
    // right now, for 0.1 seconds
    // the duration gets compounded with the envelope
    synth.play(midiToFreq(note), 90, 0, 0.1)
    // draw an ellipse at the mouse X and Y coordinates
    // with a random diameter between 0 and 200px
    ellipse(mouseX, mouseY, random(200))
  }
}

// function that will be called when startButton is pressed
function resumeAudio() {
  sketchStarted = true // audio is now started
  // change CSS of button to hide it
  // since we don't need it anymore
  startButton.style('opacity', '0')
  // resume the audio context if it's not running already
  if (context.state !== 'running') {
    context.resume()
  }
}

// this gets called every render frame
// (which is usually 60 times per second)
function draw() {
  // this will re-draw the background each frame (60 times per second)
  // with a very low opacity
  background('rgba(255, 255, 0, 0.05)')
}
So that’s about it - hope this was helpful. One last note: you can chain multiple nodes in this way, whether they come from p5 or from RNBO.
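As a rough sketch of what that chaining could look like, here’s a hypothetical second RNBO device inserted between the reverb and the output, inside rnboSetup() (the file name is made up - your own export will differ):

// hypothetical: a second RNBO device after the reverb
const secondResponse = await fetch("export/your.second.export.json") // made-up path
const secondPatcher = await secondResponse.json()
const secondDevice = await RNBO.createDevice({ context, patcher: secondPatcher })
// signal chain: p5 Synth → Reverb Patch → Second Device → Output
synth.connect(reverbDevice.node)
reverbDevice.node.connect(secondDevice.node)
secondDevice.node.connect(outputNode)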