Hey HN,
I’m an ex-Google engineer trying to get back into music production.
I needed a way to sequence my hardware synths using AI contexts without constantly switching windows, so I built this.
It runs entirely in the browser using WebMIDI. No login required. It connects to your local MIDI devices (if you're on Chrome/Edge) and lets you generate patterns.
Tech stack: [React / WebMIDI API / etc].
Link: www.simplychris.ai/droplets
Code is a bit messy, but it works. Feedback welcome.
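If you're curious how the browser side works, the core Web MIDI flow is roughly this (a minimal sketch, not code from the app; the function name and note numbers are just for illustration):

    // Request MIDI access (Chrome/Edge only; prompts the user for permission),
    // grab the first available output, and play a half-second middle C.
    async function playTestNote(): Promise<void> {
      const access = await navigator.requestMIDIAccess();
      const output = [...access.outputs.values()][0];
      if (!output) return;                                   // no MIDI device connected
      output.send([0x90, 60, 100]);                          // note on, middle C, velocity 100
      output.send([0x80, 60, 0], performance.now() + 500);   // note off scheduled 500 ms later
    }

The second argument to send() is a timestamp, which lets you schedule outgoing messages slightly ahead of time instead of relying on JavaScript timers firing exactly on the beat.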
This was a surprising assertion to hear. Maybe reliable timing is a problem on some OS, but with modern audio pipelines things feel like they're in an extremely good state.
Just move the tab out of focus and you will see how it handles sending clock. I ended up going to a hardware-based, external clock signal, using SPP (Song Position Pointer) to force syncs between my tools, and rtmidi + C.
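To make that concrete, here's roughly the failure mode and the usual workaround (a sketch under assumptions, not the app's actual code):

    // Naive approach: drive MIDI clock (0xF8, 24 pulses per quarter note) from
    // setInterval. At 120 BPM that's a tick every ~20.8 ms, but background tabs
    // throttle timers to 1 s or more, so the clock collapses once the tab loses focus.
    function naiveClock(output: MIDIOutput, bpm: number): number {
      const tickMs = 60000 / (bpm * 24);
      return window.setInterval(() => output.send([0xf8]), tickMs);
    }

    // Workaround: wake up rarely, but queue a batch of ticks ahead of time via
    // the Web MIDI timestamp argument, so delivery is handled by the browser's
    // MIDI backend instead of the throttled JS timer. This only helps as long as
    // the timer still fires within the look-ahead window.
    function scheduledClock(output: MIDIOutput, bpm: number): void {
      const tickMs = 60000 / (bpm * 24);
      let next = performance.now();
      window.setInterval(() => {
        const horizon = performance.now() + 1500;   // keep ~1.5 s of ticks queued
        while (next < horizon) {
          output.send([0xf8], next);
          next += tickMs;
        }
      }, 250);
    }

Even with look-ahead scheduling you're at the mercy of how aggressively the browser throttles hidden tabs, which is why I ended up on an external hardware clock.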
MIDI over MIDI cables is fundamentally not a tight protocol. If you play a four note chord there's a significant time offset between the first and last note, even with running status.
With early MIDI you had a lot of information going down a single cable, so you might have a couple of drum hits, a chord, maybe a bass and lead note all at the same moment.
Cabled MIDI can't handle that. It doesn't have the bandwidth.
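To put numbers on it (back-of-the-envelope; assumes classic 5-pin DIN at 31,250 baud with 10 bits per byte on the wire):

    // Classic DIN MIDI: 31,250 baud, 1 start + 8 data + 1 stop bit per byte.
    const usPerByte = (10 / 31250) * 1e6;            // ≈ 320 µs per byte
    const noteOnMs = (3 * usPerByte) / 1000;         // status + key + velocity ≈ 0.96 ms
    const runningStatusMs = (2 * usPerByte) / 1000;  // ≈ 0.64 ms with the status byte omitted
    // In a 4-note chord the last note queues behind the first three:
    console.log(`${(3 * noteOnMs).toFixed(2)} ms worst case, ` +
                `${(3 * runningStatusMs).toFixed(2)} ms with running status`);
    // → roughly 2.9 ms vs 1.9 ms of spread, before clock, drums, bass and CCs
    //   are all fighting for the same cable.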
Traditional 80s/90s hardware was also slow to respond because the microprocessors were underpowered. So you often had timing slop on both send and receive.
MIDI over USB should be much tighter because the bandwidth is a good few orders of magnitude higher. Receive slop can still be a problem, but much less than it used to be.
MIDI in a DAW sent directly to VSTs should be sample-accurate, but not everyone manages that. You'll often get a pause at the loop point in Ableton, for example.
The faster the CPU, the less of a problem this is.
If you're rendering to disk instead of playing live it shouldn't be a problem at all.
Edit: Actually, MIDI note-on events that are being sent to devices do _not_ have a timestamp! Only events that are persisted in a file may have timestamps.
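Concretely (a sketch; the byte values here are just for illustration):

    // What goes down the wire for a live note-on: three bytes, no time field.
    // "When it arrives" is the only timing information the receiver gets.
    const wireNoteOn = new Uint8Array([0x90, 60, 100]);      // status, key, velocity

    // In a Standard MIDI File the same event is preceded by a variable-length
    // delta time (ticks since the previous event), which is where timestamps
    // live once MIDI is persisted.
    const smfNoteOn = new Uint8Array([0x60, 0x90, 60, 100]); // delta = 96 ticks, then note-on

    // APIs like Web MIDI's output.send(data, timestamp) take a host-side
    // scheduling time, but that value never travels over the cable.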
Edit: my use case is more about integrating different tools and devices: Bitwig, Electribe, Mixxx, my mod/ProTracker remix tool, etc. I guess your use case is more about generating music, which is less my thing, but possible. I just have a particular sequencer/tracker use. Generation happens in Bitwig.
And the source: https://github.com/robrohan/r2_seq