Using the Event-related classes Tone.Pattern, Tone.Part, and Tone.Sequence, it is possible to create polyphonic textures in which each part has its own instrument sound.
Playing a round (or canon) with Tone.js is quite easy. For this example we'll use 'Hey Ho', a 6 measure round structured as three two-bar phrases and usually performed by three different voices. The entry points of the voices are offset by two measures. We'll use a different sound for each voice of the round so the parts are easier to distinguish. The only differences in setting up the three parts are the instrument each one triggers, its .start() offset, and its volume setting.
If you don't set individual loop values for each part and instead set Tone.Transport.loopEnd = '6:0', you'll hear that when the Transport reaches the end of the sixth bar it repeats from the beginning: heyHoPart1 restarts correctly, but heyHoPart2 stops and waits until bar two to start again, and heyHoPart3, which has barely begun, also stops at the repeat. Setting loopEnd on each individual part instead allows every part to complete the full melody before it starts its repeat loop, while each part's entrance stays offset via the .start() value set earlier. For this example each part's loop attribute is set to 3, meaning each part plays 3 times and then stops. A loop value of true would cause it to loop continuously until the stop button is clicked.
(click the stop button between plays)
function HeyHo() {
    var HeyHoNotes = ["D4","C4","D4","D4","D4","A3", "D4","D4","E4","E4","F4","F4","F4","F4","E4", "A4","G4","A4","G4","A4","G4","A4","G4","F4","E4"];
    var HeyHoDurations = ["2n","2n","4n","8n","8n","2n", "4n","4n","4n","4n","8n","8n","8n","8n","2n", "4n+8n","8n","4n+8n","8n","4n+8n","8n","8n","8n","8n","8n"];
    var HeyHoVelocity = [0.9,0.9,0.9,0.7,0.7,0.9, 0.9,0.7,0.9,0.7,0.9,0.7,0.7,0.7,0.9, 0.9,0.7,0.9,0.7,0.9,0.7,0.9,0.7,0.7,0.7];
    var HeyHoMelody = Rhythm.mergeDurationVelocityAndPitch(HeyHoDurations, HeyHoNotes, HeyHoVelocity);

    // voice 1: electric cello, enters immediately
    var heyHoPart1 = new Tone.Part(function(time, value) {
        instruments.electricCello.triggerAttackRelease(value.note, value.duration, time, value.velocity);
    }, HeyHoMelody).start(0);
    instruments.electricCello.volume.value = -5;

    // voice 2: piano, offset 2 bars
    var heyHoPart2 = new Tone.Part(function(time, value) {
        instruments.piano.triggerAttackRelease(value.note, value.duration, time, value.velocity);
    }, HeyHoMelody).start("2*1m");

    // voice 3: steel pan, offset 4 bars
    var heyHoPart3 = new Tone.Part(function(time, value) {
        instruments.steelPan.triggerAttackRelease(value.note, value.duration, time, value.velocity);
    }, HeyHoMelody).start("4*1m");
    instruments.steelPan.volume.value = -10;

    // TRANSPORT
    // each part loops its own full 6 bars, 3 times through
    heyHoPart1.loopStart = "0";
    heyHoPart1.loopEnd = "6:0";
    heyHoPart1.loop = 3;
    // still plays 6 bars (but starts 2 bars late)
    heyHoPart2.loopStart = "0";
    heyHoPart2.loopEnd = "6:0";
    heyHoPart2.loop = 3;
    // still plays 6 bars (but starts 4 bars late)
    heyHoPart3.loopStart = "0";
    heyHoPart3.loopEnd = "6:0";
    heyHoPart3.loop = 3;

    Tone.Transport.bpm.value = 170;
    Tone.Transport.start("+0.1");
}
A common texture for the blues is a bassline, some chordal accompaniment, and a lead line. We'll try a steel drum sound for the lead melody. For this example we'll use Tone.Pattern() and Tone.Part() to assemble an arrangement. Our key is A, and the arrangement starts with a 4 bar intro on the V7#9 (E7#9) chord. Next the main 12 bar bass pattern begins alone. On the first repeat of the 12 bar bassPattern, an accompaniment enters using tritones (the 3rd and 7th of each chord) from the dominant 7th chords (A7, D7, E7) used in the 12 bar progression. During the next 12 bars the lead line comes in, using a pickup measure that is common in the blues style. After one time through the 12 bars with all three parts playing together, the accompaniment drops out and the bass and lead line play together for the next 12 bars. Finally the bass and lead fade out on the turnaround. Other than the intro, the bass part uses steady quarter notes, so for those sections I'm using Tone.Pattern(); for all other sections I'm using Tone.Part(). The arrangement is composed of the following Tone.js classes:
The different Parts and Patterns are assembled by setting the .start() value for the different event classes so that they come in at the correct time. Note that the .loop attribute is used with the Tone.Part() class to set the number of times through the full part. Both the lead and comping parts play twice through, so the setting is .loop = 2.
However, Tone.Pattern() doesn't have a .loop attribute; instead you need to set the .iterations attribute (default is Infinity, i.e. loop forever). We want 4 times through the 12 bar bassPattern, so we need to set a value for .iterations. But the counting is different for .iterations than for .loop values. The .iterations count is incremented every time a note is played, and for the bassPattern it takes 48 iterations to complete one 12 bar section (12 bars * 4 notes per bar). Since in this arrangement the bassPattern plays 4 times through, bassPattern.iterations is set to 48 * 4 (4 times through the 12 bars).
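The iteration arithmetic can be sketched in plain JavaScript (the 12-bar and 4-notes-per-bar figures come from the text above; the variable names are mine):

```javascript
// One pass of the bassPattern: 12 bars of steady quarter notes in 4/4.
var bars = 12;
var notesPerBar = 4;   // quarter notes per bar
var passes = 4;        // the bass plays the 12 bar form 4 times

// .iterations counts every note played, not every loop of the Part
var iterationsPerPass = bars * notesPerBar;        // 48
var totalIterations = iterationsPerPass * passes;  // 192

console.log(totalIterations); // 192, i.e. bassPattern.iterations = 192
```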
The behavior of the .mute attribute for Tone.Pattern() is also different from what I expected. When you enable mute during playback, the pattern stops, and when you unmute it resumes from that stopped location. Tone.Part() mute behavior is different: the part continues but is silent, so when you unmute you hear the part at its new location, as it continued forward silently. To make Tone.Pattern().mute work like Tone.Part().mute we'll have to:
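One possible way to get Part-like mute behavior from a Pattern (my own sketch, not necessarily the exact steps used here) is to leave the Pattern's .mute alone and silence the instrument's output instead, so the Pattern keeps advancing inaudibly. The helper below is hypothetical; it assumes a Tone.js-style instrument with a `volume.value` in dB:

```javascript
// Returns a toggle that silences/restores an instrument's output
// without touching the Pattern, so the Pattern keeps running.
function makeMuteToggle(instr) {
    var savedVolume = null;
    return function (muted) {
        if (muted && savedVolume === null) {
            savedVolume = instr.volume.value;
            instr.volume.value = -Infinity; // silent, but still advancing
        } else if (!muted && savedVolume !== null) {
            instr.volume.value = savedVolume;
            savedVolume = null;
        }
    };
}
```

Usage would look like `var muteBass = makeMuteToggle(instruments.bass); muteBass(true);` and later `muteBass(false);` to bring the bass back at its current position.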
So after learning of these differences between Tone.Pattern and Tone.Part with regard to loops and muting, I think a better approach would be to use Tone.Part exclusively for this type of arrangement. But now I know some of the differences between the Tone.Pattern() and Tone.Part() classes.
I often find it difficult to program music that doesn't sound robotic on the rhythmic level. Tone.Event() and all of its descendants (Tone.Pattern() and Tone.Part() among others) have a .humanize attribute which, if set to true, applies a small (+/- 0.02 second) random variation to the callback time. If the value is given as a time, it randomizes by that amount instead. I'm applying that feature (in different degrees) to all of the parts and patterns. Plus I've added a cool factor that delays the comping and lead parts by a slight amount to add a laid-back quality. All this in a quest to make it a little less precise.
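As a sketch of the humanize settings: `leadPart` and `compPart` stand in for the Tone.Part instances in the arrangement, and the amounts are illustrative, not necessarily the ones used here:

```javascript
// Apply timing looseness to two parts (assumed to be Tone.Part-like
// objects that accept a .humanize attribute).
function applyFeel(leadPart, compPart) {
    leadPart.humanize = true;  // +/- 0.02 s random variation (default amount)
    compPart.humanize = "32n"; // randomize by up to a 32nd note instead
}
```

The "laid back" delay mentioned above would then come from nudging a part's .start() time slightly later than its scheduled bar.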
I've found that in order to make the fadeOut work consistently, I had to use Tone.Transport.schedule() to place the synth.volume.linearRampToValueAtTime() call on the Transport timeline. By itself, .linearRampToValueAtTime() isn't part of the Transport timeline, but you can use .schedule() to add it. When I used .linearRampToValueAtTime() without Tone.Transport.schedule(), changing the volume or toggling mute during playback broke the programmed fadeout. Once I added those function calls to the Transport timeline via .schedule(), the fadeout worked consistently even when muting and volume changes were made during playback.
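A minimal sketch of that scheduling, assuming an `instruments.bass` like the one above; the "46m" position and the 4 second ramp are hypothetical placeholders for wherever the turnaround falls in the real arrangement:

```javascript
// Put the fade on the Transport timeline so it survives volume/mute
// changes made during playback. The callback receives the precise
// Transport time at which the scheduled event fires.
Tone.Transport.schedule(function (time) {
    // ramp the bass volume down to -60 dB over the next 4 seconds
    instruments.bass.volume.linearRampToValueAtTime(-60, time + 4);
}, "46m"); // hypothetical start of the turnaround
```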
And just a little reverb added to the mix for the finishing touch.
As mentioned frequently, the best practice is to wrap all of this code up into a JavaScript module so that no unnecessary global vars leak out. But for educational purposes I've put the code in script tags on this page. View Source for details.