
Hey HN,

I’m an ex-Google engineer trying to get back into music production.

I needed a way to sequence my hardware synths using AI contexts without constantly switching windows, so I built this.

It runs entirely in the browser using WebMIDI. No login required. It connects to your local MIDI devices (if you're on Chrome/Edge) and lets you generate patterns.
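
For anyone who hasn't touched WebMIDI: the browser side boils down to requesting access and sending raw bytes to an output port. A minimal TypeScript sketch (note numbers and names are illustrative, not the app's actual code; Chrome/Edge only):

    async function playTestNote(): Promise<void> {
      // Ask the browser for MIDI access (prompts the user on first use).
      const access = await navigator.requestMIDIAccess();

      // Take the first available output port, e.g. a hardware synth.
      const output = [...access.outputs.values()][0];
      if (!output) {
        console.log("No MIDI outputs found");
        return;
      }

      // Note-on: channel 1 (0x90), middle C (60), velocity 100.
      output.send([0x90, 60, 100]);

      // Note-off 500 ms later, via the optional timestamp argument
      // (milliseconds on the performance.now() clock).
      output.send([0x80, 60, 0], performance.now() + 500);
    }

    playTestNote().catch(console.error);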

Tech stack: [React / WebMIDI API / etc].

Link: www.simplychris.ai/droplets

Code is a bit messy, but it works. Feedback welcome.


This does not solve the underlying problem at all, the one that makes today's MIDI, coming from a normal computer, almost unusable for serious sequencing: timing and jitter! So, may I ask, what is the actual use case for this sequencer? I would like to see/hear some music you made with it. Or is this just for the sake of using AI?
PipeWire with rtkit works incredibly stably with wildly short buffer lengths (low latency). Given the short buffer size, there's not much chance for big timing issues to arise (unless there are underruns with dead air, which doesn't seem to be the case).

This was a surprising assertion to hear. Maybe on some OS, doing reliable timing is a problem. But with modern audio pipelines, things feel like they are in an extremely good state.

Actually I am using PipeWire with rtkit on Debian. But somehow it does not solve my midi problems. "Audio pipeline" is not "midi". Nevertheless I am doing all my _audio_ (not midi) work on Debian and I am very happy with it.
If you have hardware synths you are going to have a decent MIDI and audio interface, so this is not a problem. It wasn't even a problem 25 years ago. There is no reason for consumer-grade audio to be able to do this because most people will never use it.
I have maybe 20 hardware synths and I do a lot of sequencing. And yes, it wasn't a problem 25 years ago, which is exactly why I still use an Atari STe! :-) But today it is a problem. It is just not possible to do complex and tight sequencing today with a normal Win, Mac or Linux computer. Even with my RME PCIe card. Your argument, "it wasn't a problem decades ago, so it can't be one today", is simply not correct.
MIDI from a browser suffers from slowdowns because JavaScript is just too slow and non-threaded. There are ways around it, but those are all workarounds.

Just move the tab out of focus, and you will see how it handles sending clock. I went to a hardware-based, external clock signal, using SPP (Song Position Pointer) to force syncs between my tools, and rtmidi + C.
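
For context, the usual browser-side workaround is to stop firing each tick from a JS timer and instead schedule ticks ahead of time via the timestamp argument of MIDIOutput.send(), with a coarse timer just topping up the queue. A rough TypeScript sketch (illustrative only, not the parent's setup; the constants are placeholders):

    const BPM = 120;
    const TICK_MS = 60000 / (BPM * 24); // MIDI clock runs at 24 ppqn
    const LOOKAHEAD_MS = 2000;          // big enough to survive background-tab
                                        // timer throttling (~1 s minimum interval)

    function startClock(output: MIDIOutput): () => void {
      let nextTick = performance.now() + 50;

      const timer = setInterval(() => {
        const horizon = performance.now() + LOOKAHEAD_MS;
        while (nextTick < horizon) {
          output.send([0xf8], nextTick); // 0xF8 = MIDI timing clock
          nextTick += TICK_MS;
        }
      }, 250); // coarse; precise timing is delegated to the browser's MIDI stack

      return () => clearInterval(timer);
    }

Even then it only papers over the problem; a native process using rtmidi, or an external hardware clock as described above, sidesteps the browser event loop entirely.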

From what I understand, MIDI messages can have timestamps into the future, but that implies buffering on the receiver end. Do most MIDI instruments not support enough buffering to overcome lag? Because in sequencing, the future is pretty well known.
MIDI 1.0 messages do not have timestamps. (Sys Real Time does, but notes and controllers don't.) Timing is managed by the MIDI sender, and any buffering happens in the interface.

MIDI over MIDI cables is fundamentally not a tight protocol. If you play a four note chord there's a significant time offset between the first and last note, even with running status.

With early MIDI you had a lot of information going down a single cable, so you might have a couple of drum hits, a chord, maybe a bass and lead note all at the same moment.

Cabled MIDI can't handle that. It doesn't have the bandwidth.
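
Back-of-the-envelope numbers behind that (assuming standard 5-pin DIN MIDI 1.0 at 31250 baud; not measurements from any particular synth):

    const BAUD = 31250;                    // bits per second on the wire
    const US_PER_BYTE = (10 / BAUD) * 1e6; // 1 start + 8 data + 1 stop bit = 320 µs

    const noteOnMs = (3 * US_PER_BYTE) / 1000;          // ~0.96 ms for one note-on
    const chordMs = ((1 + 4 * 2) * US_PER_BYTE) / 1000; // 4-note chord with running
                                                        // status: 9 bytes ≈ 2.9 ms

    console.log({ noteOnMs, chordMs }); // last note of the chord lands ~3 ms late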

Traditional 80s/90s hardware was also slow to respond because the microprocessors were underpowered. So you often had timing slop on both send and receive.

MIDI over USB should be much tighter because the bandwidth is a good few orders of magnitude higher. Receive slop can still be a problem, but much less than it used to be.

MIDI in a DAW sent directly to VSTs should be sample-accurate, but not everyone manages that. You'll often get a pause at the loop point in Ableton, for example.

The faster the CPU, the less of a problem this is.

If you're rendering to disk instead of playing live it shouldn't be a problem at all.

Yes, they have timestamps. But if you do buffer (or, better said, delay), you introduce latency, which is even worse than jitter. The ideal is 0 latency. And another downside with buffering: you would need the buffer time to be the same on every device you trigger, otherwise you do not stay in sync.

Edit: Actually, MIDI note-on events that are being sent to devices do _not_ have a timestamp! Only events that are persisted in a file may have timestamps.

These opinions are not helpful.
I wrote mine too, integrating an Akai Fire, at https://music.gbraad.nl/meister, as part of a tool for live performances. It controls some of my remix tools, mixxx, and VJ tools too.

Edit: my use case is more about integrating different tools and devices: Bitwig, Electribe, mixxx, my mod/protracker remix tool, etc. I guess your use case is more about generating music, which is less my thing, but possible. I just have a particular sequencer/tracker use. Generation happens in Bitwig.

Thanks, this looks interesting and I am going to try it later. I have an old Axiom 49 and it really doesn't work that well with modern DAWs, as it is assumed to be old and outdated. But I like the form factor and it is solid. I hope I can make it work with this one?
Vibe coded? Asking because it looks very similar to my vibe-coded WebMIDI project, which is a beatmatching practice tool for DJs :) https://beat.maido.io/
It definitely was made with Gemini; you can tell by the fact that Gemini shoehorned in AI features that only work with a Google API key.
Neat - here is another one you might find helpful (Chrome only): https://cdn.robrohan.com/seq/index.html

And the source: https://github.com/robrohan/r2_seq

It's kind of annoying when someone shows what they're working on, and the first comment is always, "Oh yeah, here are some alternatives." It feels less like you're trying to be helpful and more like you're just kind of cheekily crapping on them. 2 cents. Maybe it would be more helpful if you were to ask how it could be different, what it improves upon. Ask if they've seen this. Something more than, "Oh yeah, here's something in addition to what this person is trying to show."
This is pretty cool in concept. Need to go and get stuff to plug into my laptop to test :)
Does WebMIDI work over USB OTG? Then maybe it could run from a phone or tablet!
Yeah, you can connect via USB MIDI using an OTG adapter by enabling "USB MIDI Peripheral mode" in Developer Options. There are plenty of videos on how to set it up from the Android MIDI Arranger App community - just N.B. you may need a powered USB hub depending on your use case.
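
If you want to check support before digging out adapters, here's a quick sketch you can run in the mobile browser's console (illustrative only; the port names are whatever your OTG-attached gear reports):

    async function listMidiPorts(): Promise<void> {
      if (!("requestMIDIAccess" in navigator)) {
        console.log("WebMIDI not supported in this browser");
        return;
      }
      const access = await navigator.requestMIDIAccess();
      for (const input of access.inputs.values()) {
        console.log("input:", input.name);
      }
      for (const output of access.outputs.values()) {
        console.log("output:", output.name);
      }
      // Fires when the OTG device is plugged in or unplugged.
      access.onstatechange = (e) => {
        console.log("state change:", e.port?.name, e.port?.state);
      };
    }

    listMidiPorts().catch(console.error);
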
I use my tools from a Linux machine (reliable) and Android (OK). I got a h4midi wc to improve the setup. WebMIDI and JS are not ideal, as a wakelock is needed and JavaScript is actually slow.
