midi-to-lsdj development update
I’ve just released v1.5.0 of midi-to-lsdj, my little library for converting MIDI files into the commands needed to reproduce a song in Little Sound DJ (LSDJ) on the Game Boy.
What’s new
- MIDI file parsing has moved from midi-file to @tonejs/midi
- Moved to using tonalJS for working with notes
- Support for tempo changes has been implemented
- Support for chords has been implemented
- Support for pitch bends via sweep commands has been implemented
- Support for drums has been implemented
- I’ve defined a command prioritisation that the library will use
Why the move from midi-file?
As @tonejs/midi uses midi-file under the hood I think I’d call it more of an abstraction than a move, but essentially I found a library that solved a lot of the issues I was having with my v1.0.0 implementation, namely the need to resolve midi-file’s delta ticks to absolute ticks.
@tonejs/midi takes care of this and creates a nice little structure for holding the song data, which makes it far simpler to build up the phrase, chain & table structures and removes the need to keep that MIDI parsing logic in the codebase.
An example of these quality of life upgrades is dealing with pitch bends: midi-file returns an int based on the MIDI implementation of pitch bends, but @tonejs/midi returns the pitch bend as a float value between -2 and 2, which represents the 2 octave range that a pitch bend can have in either direction.
What does tonalJS offer?
Similar to @tonejs/midi, using @tonaljs/midi and @tonaljs/core gives me some nice abstractions over working with MIDI and music theory (which I suck at) respectively.
I’m using @tonaljs/midi to convert the MIDI note values to named notes (and sometimes back, if I have to sort an array of notes like with chords), which means I can do away with the logic I had for doing this myself.
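A rough sketch of that conversion in both directions (based on my understanding of the @tonaljs/midi API):
import { midiToNoteName, toMidi } from '@tonaljs/midi'

midiToNoteName(60) // 'C4' - MIDI note number to a named note
toMidi('C4') // 60 - and back again, e.g. when sorting chord notes numerically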
I’m getting the biggest usage out of @tonaljs/core, which I use for comparing intervals and distances between notes. This is really useful for things like programming tuplet tables and chords.
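For example (a small sketch, assuming the @tonaljs/core distance and interval helpers), getting the semitone distance between two notes looks roughly like this:
import { distance, interval } from '@tonaljs/core'

const name = distance('C4', 'E4') // '3M' - the interval between the two notes
const semitones = interval(name).semitones // 4 - useful for chord and table programming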
Tempo change support
Tempo changes in LSDJ are achieved by using the T command, which takes a 00-FF hex value to change the tempo anywhere between 40 and 295 BPM (assuming a 6 ticks per step groove).
Now that I’m using @tonejs/midi I no longer have to collect the setTempo events myself and can instead just iterate over a tempos array that the library creates. Better still, because the library also resolves all events to absolute ticks, it’s now a lot easier to map which notes the tempo change falls on.
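A minimal sketch of what that looks like, assuming a MIDI file read from disk (the file name here is just a placeholder):
import { Midi } from '@tonejs/midi'
import { readFileSync } from 'fs'

const midi = new Midi(readFileSync('song.mid')) // placeholder file name

// each tempo event carries the BPM and the absolute tick it starts on
midi.header.tempos.forEach(({ bpm, ticks }) => {
  console.log(`tempo change to ${bpm} BPM at tick ${ticks}`)
})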
The only tricky part of implementing tempo changes was the way that LSDJ maps the BPM to hex. LSDJ maps 40-255 BPM as the numbers themselves (28-FF hex) but 00-27 hex are mapped to 256-295 BPM. It’s not too hard to work around, but it needs some extra logic to ensure the correct value is returned.
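For illustration, a sketch of that extra logic (bpmToLsdjHex is a hypothetical helper, not necessarily the exact code in the library):
// 40-255 BPM map directly to 28-FF hex, 256-295 BPM wrap around to 00-27 hex
function bpmToLsdjHex(bpm) {
  const clamped = Math.min(Math.max(Math.round(bpm), 40), 295)
  const value = clamped <= 255 ? clamped : clamped - 256
  return value.toString(16).toUpperCase().padStart(2, '0')
}

bpmToLsdjHex(120) // '78'
bpmToLsdjHex(280) // '18' (280 - 256 = 24)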
In a later version I’ll also be working on adding support for tempos outside of that 40-295 range by changing the ticks per step in a groove, as theoretically setting a groove with fewer ticks per step should result in a faster tempo, but there’s some crazy maths that my brain isn’t up to understanding just yet.
Chord support
Chords in LSDJ are achieved by using the C command, which takes two 0-F hex values, each one representing the interval between the root note and another note in the chord, and then plays an arpeggio between them.
As per the LSDJ docs
4.3 C: Chord
Runs an arpeggio that extends the base note with the given semitones. The
speed may be slowed down using CMD/RATE in instrument screen.
C37 plays a minor chord: 0, 3, 7, 0, 3, 7, 0, 3, 7, . . .
C47 plays a major chord: 0, 4, 7, 0, 4, 7, 0, 4, 7, . . .
C0C plays 0, 0, C, 0, 0, C, 0, 0, C, . . .
CC0 plays 0, C, 0, C, 0, C, . . .
CCC plays 0, C, C, 0, C, C, 0, C, C, . . .
C00 resets chord
To implement this in midi-to-lsdj I sort the notes at that tick using their MIDI values (as those are numbers), find the lowest note (which will be used as the root note), then find the interval between it and the rest of the notes and convert those intervals to hex.
As each hex value is capped at F I also have to cap the interval value, so for instance if there’s a crazy chord that for whatever reason adds a note 3 octaves up, that will be capped at 15 semitones from the root note.
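Roughly, the approach looks like this (a sketch rather than the exact library code, with getChordCommand as a hypothetical helper):
import { toMidi } from '@tonaljs/midi'

function getChordCommand(noteNames) {
  // sort by MIDI value so the lowest note becomes the root
  const midiValues = noteNames.map((name) => toMidi(name)).sort((a, b) => a - b)
  const [root, ...rest] = midiValues
  // intervals from the root, capped at 15 semitones (F) per digit
  const intervals = rest.map((note) => Math.min(note - root, 15))
  return 'C' + intervals.map((i) => i.toString(16).toUpperCase()).join('')
}

getChordCommand(['C4', 'E4', 'G4']) // 'C47' - a major chord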
Pitch bend support
There are two ways to bend a note in LSDJ: pitch or sweep. For v1.5.0 I’ve implemented them using sweep as I find that a little more universal, but in later releases this may become configurable.
Sweeps in LSDJ are achieved using the S command, which takes two 0-F hex values, the first representing the time that the sweep has to complete (I call this the speed) and the second representing the depth of the sweep.
As per the LSDJ docs
4.15 S: Sweep/Shape
This command has different effects for different instrument types.
4.15.1 Pulse Instruments
Frequency sweep, useful for bass drums and percussion. The first digit sets
time, the second sets pitch increase/decrease. Only works on the first pulse
channel.
4.15.2 Kit Instruments
S changes the loop points. The first digit modulates the offset value; the second
digit modulates the loop length. (1-7=increase, 9-F=decrease.) Used creatively,
this command can be very useful for creating a wide range of percussive and
timbral effects.
4.15.3 Noise Instruments
Alters noise shape (see section 2.6.5). The command is relative, meaning that
the digits are independently added to the active noise shape.
To implement this in midi-to-lsdj I first had to find which pitch bend events were linked with a note, as in MIDI these events aren’t linked to a specific noteOn event. To do this I take the pitch bend event’s absolute tick, find the nearest note being played, and associate the bend with that note.
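A rough sketch of that association, assuming a track parsed with @tonejs/midi (findNearestNote is a hypothetical helper):
// track is one of the tracks from the parsed Midi object, e.g. midi.tracks[0]
function findNearestNote(notes, pitchBend) {
  // pick the note whose start tick is closest to the pitch bend's absolute tick
  return notes.reduce((prev, curr) =>
    Math.abs(curr.ticks - pitchBend.ticks) < Math.abs(prev.ticks - pitchBend.ticks) ? curr : prev
  )
}

track.pitchBends.forEach((bend) => {
  const note = findNearestNote(track.notes, bend)
  // associate the bend with that note for the sweep calculation
})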
Calculating sweep speed
In order to calculate the speed of the sweep I take the duration of the note that the sweep is associated with, map that to a value between 0 & 6 (the indexes of an array of note durations), add 1 and multiply that by 2 before converting to hex.
For example:
// stand-in for the library's convertToHex helper, which returns a zero-padded two digit hex string
const convertToHex = (value) => value.toString(16).toUpperCase().padStart(2, '0')

const ppq = 480 // pulses per quarter note
const noteDuration = 960 // a half note
// tick lengths for whole, half, quarter ... 1/64 notes
const noteTickLengths = [4, 2, 1, 0.5, 0.25, 0.125, 0.0625].map((duration) => ppq * duration)
// find the defined note length closest to the actual note duration
const mappedNoteDuration = noteTickLengths.reduce((prev, curr) => Math.abs(curr - noteDuration) < Math.abs(prev - noteDuration) ? curr : prev)
const noteTickIndex = noteTickLengths.indexOf(mappedNoteDuration)
const sweepSpeed = convertToHex((noteTickIndex + 1) * 2).charAt(1)
> 4 // (1 + 1) * 2
There are some trade-offs to this, such as the speed value being capped at E, but given that it’s unlikely that there’s going to be a 1/128 note in a MIDI file I think this is ok.
Calculating sweep depth
Working through this gave me a lot of insight into how MIDI defines pitch bends, how midi-file parses them and how @tonejs/midi converts them.
Pitch bends in MIDI are defined by a number between 0 and 16383, with the middle value 8192 acting as no pitch bend, values below that acting as a negative pitch bend and values above it acting as a positive pitch bend. This means that essentially MIDI uses that mid-point as a way of using an unsigned int as a signed int.
Pitch bends in MIDI also have a range of -2 octaves to +2 octaves, something that @tonejs/midi calculates for you, returning the pitch bend value as a float between -2 and 2.
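Put another way, the raw 14-bit value maps onto that -2 to 2 range something like this (a hedged sketch of the conversion, not @tonejs/midi’s actual source):
// 8192 is the 'no bend' mid-point of the 0-16383 range
function rawPitchBendToFloat(raw) {
  return ((raw - 8192) / 8192) * 2
}

rawPitchBendToFloat(8192) // 0 - no bend
rawPitchBendToFloat(16383) // ≈ 2 - maximum upward bend
rawPitchBendToFloat(0) // -2 - maximum downward bend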
In order to calculate the sweep depth I mapped the 0-2 range to 8 steps, took the absolute value of the @tonejs/midi value and found the nearest step to it. I then took that step, turned it negative if the original value was negative, mapped it onto the full range of 16 stepped values ordered according to the LSDJ rules, and returned the index as hex.
For example:
const bendValue = -0.7
const pitchVals = [0, 0.3, 0.6, 0.9, 1.2, 1.5, 1.8, 2]
const fullPitchVals = [...pitchVals, 0, -2, -1.8, -1.5, -1.2, -0.9, -0.6, -0.3] // 0 is in twice but matches on first
// find the closest positive step to the absolute bend value
const mappedPitchVal = pitchVals.reduce((prev, curr) => Math.abs(curr - Math.abs(bendValue)) < Math.abs(prev - Math.abs(bendValue)) ? curr : prev)
// re-apply the sign so the value can be looked up in the full 16-step array
const signedMappedPitchVal = bendValue < 0 ? mappedPitchVal * -1 : mappedPitchVal
const mappedPitchValIndex = fullPitchVals.indexOf(signedMappedPitchVal)
const sweepDepth = convertToHex(mappedPitchValIndex).charAt(1) // convertToHex as above: zero-padded two digit hex
> E // mappedPitchVal = 0.6, re-signed to -0.6 which is at index 14 in fullPitchVals
The steps may seem a bit off but that’s because I’m having to account for 0 and 2, so it’s actually 0-2 mapped to 7 steps. Also, the reason for the two arrays is that the negative pitch bends are mapped in reverse but with 0 keeping its position, so I thought it was clearer to write them out instead of trying to do some fancy sorting logic.
Drum support
Drums in LSDJ are mostly programmed on the wave channel (although some prefer to use the noise channel), and when used with a drum instrument that channel will show the names of the drums, as well as allow for two drums to be played at the same time.
In order to support drums I first needed to know if the track was a drum track or not. Fortunately @tonejs/midi already calculates this by checking if the MIDI channel for the track is 9 or not, and presents that as a percussion flag on the track’s instrument object.
I think there are technically two MIDI channels that are used for drums though, as when I used the drum mapping from my old Python implementation the notes were an octave higher than they should have been. I think this was the channel 10 mapping, so I had to remap everything down to use the channel 9 one I found here: https://pjb.com.au/muscript/gm.html#perc
I then updated the code I had for formatting notes for LSDJ to use the drum mapping if the track had that percussion flag set, and also disabled the chord command logic as there’s no need to create a chord for drums.
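A sketch of that check (the drum names and note numbers in drumMap are just a hypothetical stand-in for the channel 9 mapping linked above, and formatNote is not the library’s actual function name):
import { midiToNoteName } from '@tonaljs/midi'

// partial stand-in for the General MIDI percussion mapping
const drumMap = { 36: 'BD', 38: 'SD', 42: 'HH' }

function formatNote(note, track) {
  if (track.instrument.percussion) {
    // drum tracks use the mapped drum name and skip the chord command logic
    return drumMap[note.midi] ?? '---'
  }
  return midiToNoteName(note.midi)
}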
Note Command prioritisation in midi-to-lsdj
While adding the tempo, chord and sweep commands I realised that there needs to be a priority order for commands so that, should multiple commands land on the same note, they don’t just overwrite each other and potentially mess up the plotting of the song.
I created an Architectural Decision Record with the following prioritisation of all the commands I intend to implement in midi-to-lsdj:
- Hop note — This must take priority as this changes the structure of the song
- Tempo Change — This must take priority because the tempo changes the speed of the song and it’s unlikely that a Hop will happen on the same note
- Kill note — This has a high priority because this changes the note length, which is important for songs with stops and starts; it should look to move to the next note if it clashes with Hop or Tempo commands
- Table command — This has a high priority because this is used to add tuplet notes; while they can be skipped in favour of any of the above commands they should be prioritised as they make for a more accurate transcription of the track
- Delay command — This has a high priority because this is used to delay notes from being run and this makes for a more accurate transcription
- Retrig command — This is somewhat important because it makes the drums more accurate, but it’s less likely to clash with the other commands as they’d likely be defined on another channel / could break out to another channel to avoid a clash if the song is just a drum track
- Chord command — This is somewhat important because it makes for a more accurate transcription, but chords in LSDJ could be done better by playing two pulse waves together rather than the arpeggio
- Pitch/Sweep — This isn’t that important because the note will just play without the bend
I think this order is good enough to ensure that the commands that lend themselves to a more accurate transcription degrade gracefully so that the song structure is maintained.
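As a minimal sketch of how that ordering could be applied (not necessarily the library’s exact implementation), picking the command for a step becomes a lookup against the ordered list:
// commands in priority order, highest first
const commandPriority = ['hop', 'tempo', 'kill', 'table', 'delay', 'retrig', 'chord', 'sweep']

// given all the commands that fall on a note, keep the highest priority one
function pickCommand(commands) {
  return [...commands].sort(
    (a, b) => commandPriority.indexOf(a.type) - commandPriority.indexOf(b.type)
  )[0]
}

pickCommand([{ type: 'chord' }, { type: 'tempo' }]) // { type: 'tempo' }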
What’s next?
My next priority is to add more de-duplication logic to enable tracks to take up less room in LSDJ. This means that more phrases can fit into a song and it’ll mean less note entry when programming the song into LSDJ. It will also allow me to look at transcribing more than one track at a time, which should make it a lot easier to use the web app.
Once I have the de-duplication optimising the phrases and chains, I want to provide validation of whether it’s actually possible to fit the track into the LSDJ song structure, to prevent someone transcribing a song that won’t fit and then having to figure out how to break it across multiple songs.
I’ve got a backlog going on GitHub (https://github.com/users/colinfwren/projects/1) where I’ll be managing the other improvements I’ll be making to midi-to-lsdj.