
jpc0
816 karma
meet.hn/city/za-Boksburg

Socials: x.com/jeanpierrec19

Interests: AI/ML, Books, Climbing, DevOps, Hacking, Hardware, Hiking, IoT, Music, Networking, Open Source, Programming, Running, Web Development

---


  1. > A hard lock which requires a reboot or god forbid power cycling is the worst possible outcome

    Hilariously, this happens on Windows too.

    Actually, everything you said Windows and macOS don't do, they do: put the system under a ton of memory pressure and it becomes unresponsive and locks up...

    I would guess solved, but not easy.

    WebRTC makes it possible, but timing is still limited by NTP, whose accuracy is far above one sample; you couldn't possibly get sample-accurate playback, but you could get it to within a ms or so.

    Depending on how far away your sources are, that might be fine. For instance, with two speakers in two rooms, where you won't get significant phase issues, this is trivial to do (well, trivial is an overstatement, but you can do it purely with web technologies).
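    For reference, a minimal sketch of the NTP-style offset estimate such a scheme leans on (the timestamp names are illustrative, and it assumes a roughly symmetric network path):

    ```cpp
    #include <cstdint>

    // t0: client send, t1: server receive, t2: server send, t3: client
    // receive, all in microseconds.
    std::int64_t clock_offset_us(std::int64_t t0, std::int64_t t1,
                                 std::int64_t t2, std::int64_t t3) {
        return ((t1 - t0) + (t2 - t3)) / 2;   // positive => remote clock is ahead
    }

    std::int64_t round_trip_us(std::int64_t t0, std::int64_t t1,
                               std::int64_t t2, std::int64_t t3) {
        return (t3 - t0) - (t2 - t1);         // time actually spent on the wire
    }
    ```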

  3. That clarifies a lot.

    So effectively it was at least partly guided refactoring, not blind vibe coding.

  4. This.

    I agree that you will probably just end up writing ASM, but that was a trivial example; there are non-trivial examples involving jump tables, unrolling loops, etc., like the sketch below.

    Effectively, these are weird optimisations that rely on the gap between the abstract machine the compiler is building for and the real hardware. There are just more abstractions in Rust than in C++, by virtue of the safety mechanisms; it's just plain not possible to have the one without the other.

    The hardware can do legal things that Rust cannot allow, or can allow only if you write extremely convoluted code; C/C++ is closer to the metal in that regard.

    Don't get me wrong, I am all for the right abstractions; they allow insane optimisations that humans couldn't dream of. But there is a flip side.
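    As a toy version of the unrolling case, assuming a hot summation loop (whether the compiler would already do this on its own depends on flags and on what it can prove):

    ```cpp
    #include <cstddef>

    // Four independent accumulators break the loop-carried dependency so
    // the CPU can keep several additions in flight at once.
    long sum_unrolled(const long* a, std::size_t n) {
        long s0 = 0, s1 = 0, s2 = 0, s3 = 0;
        std::size_t i = 0;
        for (; i + 4 <= n; i += 4) {
            s0 += a[i]; s1 += a[i + 1]; s2 += a[i + 2]; s3 += a[i + 3];
        }
        for (; i < n; ++i) s0 += a[i];        // leftover elements (n % 4)
        return s0 + s1 + s2 + s3;
    }
    ```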

    Reading the code and actually understanding the code are not the same thing.

    "This looks good" vs "Oh, that is what this complex algorithm was" is a big difference.

    Effectively, to review that the code is not just being rewritten into the same code but with C++ syntax and conventions, you need to understand the original C code. That means the hard part was not the code generation (via LLM or fingers) but the understanding, and I'm unsure the AI can do that high-level understanding, since I have never gotten it to produce said understanding without explicitly telling it.

    Effectively, "x.c, y.c, z.c implement a DSL but are convoluted and not well structured; generate the same DSL in C++" works great. "Rewrite x.c, y.c, z.c into C++, building abstractions to make it more ergonomic" generally won't recognise the DSL and formalise it in a way that is very easy to do in C++; it will just make it "C++", while the same convoluted structure remains.

    Rust prevents the footgun, but it also prevents shooting in the direction where your foot would be, even if it isn't there.

    There are absolutely places where that is required and in Rust those situations become voodoo to write.

    C++ by default has more complexity, but it has the same complexity regardless of domain.

    Rust by default has much less complexity, but in obscure situations off the beaten path the complexity ramps up dramatically, far above C++.

    This is not an argument for or against either language; it's a compromise in language design. You can choose to dislike the compromise, but that doesn't mean it was the wrong one, it just means you don't like it.

    A simple-sounding but complex example: I want variable X to be loaded into a register in this function and only written to memory at the end of the function.

    That is complex in C/C++, but you can look at the disassembly and attempt to coerce the compiler into it.

    In Rust everything is so abstracted that I wouldn't know where to begin looking to coerce the compiler into generating that machine code, and I might just decide to implement it in ASM, which defeats the point of using a high-level language.

    Granted, you might go the FFmpeg route and just choose to do that regardless, but Rust makes it much harder.

    You don't always need that level of control but when you do it seems absurdly complex.
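    For the C/C++ side, the usual starting point is something like the sketch below: keep the value in a local so the compiler is free to hold it in a register, then check the disassembly that it did (a sketch, not a guarantee; the compiler may still spill):

    ```cpp
    // The accumulator lives in a local; memory behind `out` is touched
    // exactly twice, once to load and once to store at the end.
    void accumulate(int* out, const int* data, int n) {
        int acc = *out;                    // one load up front
        for (int i = 0; i < n; ++i)
            acc += data[i];                // no store traffic inside the loop
        *out = acc;                        // one store at the end
    }
    ```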

    Are you sure Claude didn't do exactly the same thing, but the harness, Claude Code, just hid it from you?

    I have seen AI agents fall into the exact loop that GP discussed and need manual intervention to get out of it.

    Also blindly having the AI migrate code from "spaghetti C" to "structured C++" sounds more like a recipe for "spaghetti C" to "fettuccine C++".

    Sometimes it's hidden data structures and algorithms you want to formalise when doing a large-scale refactor, and I have found that AIs are definitely able to identify those, but it's definitely not their default behaviour, and they fall out of that behaviour pretty quickly if not constantly reminded to do so.

  8. > Which is a huge risk factor for Rust, especially in today's context of the Linux kernel. If I have an object created/handled by external native code, how do I make sure that it respects Rust's lifetime/aliasing rules?

    Can you expand on this point? Are you worried about whether the external code is going to free the memory out from under you? That is outside what any guarantee can cover: the compiler cannot guarantee what happens at runtime, no matter what the author of a language wants. The CPU will do what it's told; it couldn't care about Rust's guarantees even if you built your code entirely with Rust.

    When you are interacting with the real world and real things you need to work with different assumptions: if you don't trust that the data will remain unmodified, then copy it.

    No matter how many abstractions you put on top of it, there is still lightning in a rock messing with 1s and 0s.
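    A minimal sketch of that "copy it" rule at a hypothetical FFI boundary (names are made up):

    ```cpp
    #include <cstddef>
    #include <vector>

    // `extern_buf` stands in for a buffer owned by foreign code that may
    // free or mutate it at any time; copy it up front so the rest of the
    // program only touches memory whose lifetime we control.
    std::vector<unsigned char> snapshot(const unsigned char* extern_buf,
                                        std::size_t len) {
        return std::vector<unsigned char>(extern_buf, extern_buf + len);
    }
    ```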

    This is more akin to selling a car to an adult who cannot drive, who then proceeds to ram it through their garage door.

    It's perfectly within the capabilities of the car to do so.

    The burden of proof is much lower, though, since the worst that can happen is you lose some money, or in this case some hard-drive contents.

    For the car, the seller would be investigated because there was a possible threat to life; for an AI, buyer beware.

  10. I don't think that's entirely true. Seeking mastery does not imply being a master.

    If you have only ever seen one pattern to solve a problem (a trivial example: inheritance) and therefore apply it to the best of your ability, then you have achieved mastery within your ability. Once you see a different pattern, composition, you can then master that, and master identifying when each is suitable.

    Lack of mastery is using inheritance despite having seen alternative patterns.

    Naturally mastery also includes seeking alternative solutions, but just because a codebase uses inferior patterns does not mean those who came before did not strive towards mastery; it's possible that they didn't know better at the time and now cannot get the time to revise the work.

    There's always a juggling act in the real world.

    Assume incompetence, not malice, and remember that incompetence is not a permanent state of being. A person without experience can be seen as incompetent but quickly becomes competent with training or experience; the code they wrote, though, still stems from incompetence.

    Strive to see your previous self as incompetent (learn something new every day).

  11. Could you write us a nice blog post or article with performance metrics to prove this?

    You might be correct, but at this point your statement is as much a lie as the parent's.

  12. Because number bigger doesn’t translate to higher perceived performance…

    The only compelling reason for me to upgrade my Sandy Bridge chip is AVX2.

    So it is instruction set, not perf. Sure, there will be improved performance, but most of the things that are actually performance issues are already handed off to the GPU.

    On that note, probably Resizable BAR and PCIe 4.0 too, but those aren't dramatic differences; if the CPU is really a problem (renders/compilation) then the work gets offloaded to different hardware.

    Big O notation drops the coefficient, and sometimes that coefficient is massive enough that O(N) only beats O(N^2) at billions of iterations.

    Premature optimisation is a massive issue; spending days finding a better algorithm is often not worth the time, since the worse algorithm was plenty good enough.

    The real world beats algorithmic complexity many, many times: you spend ages building a complex data structure, with a bunch of allocations scattered all over the heap, to get O(N), when it's significantly faster to just do the stupid thing on linear memory.
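    A toy illustration of that coefficient point (where the crossover sits depends entirely on the data and the hardware):

    ```cpp
    #include <algorithm>
    #include <set>
    #include <vector>

    // Same question, two data structures: for small n, the O(n) scan over
    // contiguous memory routinely beats the O(log n) tree, because the
    // tree's pointer-chasing constant factor dwarfs the asymptotic edge.
    bool contains_linear(const std::vector<int>& v, int x) {
        return std::find(v.begin(), v.end(), x) != v.end();
    }

    bool contains_tree(const std::set<int>& s, int x) {
        return s.count(x) != 0;
    }
    ```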

    I'll take the fight on algorithmic complexity any day.

    There are many cases where O(n^2) will beat O(n).

    Utilising the hardware can make a bigger difference than algorithmic complexity in many cases.

    Vectorised code on linear memory vs unvectorised code on data scattered around the heap.
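    A minimal sketch of that contrast, assuming a compiler that can auto-vectorise the contiguous case:

    ```cpp
    #include <list>
    #include <numeric>
    #include <vector>

    // Identical O(n) algorithm, very different hardware behaviour: the
    // vector streams contiguous memory and can be auto-vectorised; the
    // list chases pointers scattered around the heap and cannot.
    long sum_vec(const std::vector<int>& v) {
        return std::accumulate(v.begin(), v.end(), 0L);
    }

    long sum_list(const std::list<int>& l) {
        return std::accumulate(l.begin(), l.end(), 0L);
    }
    ```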

    AES67 is the open standard; Ravenna and Dante are extensions/alterations of it (well, Dante preceded AES67, but it can run in an AES67 compatibility mode), and none of them require FPGAs. They usually use FPGAs to keep latency very low, but they work just fine with any network card that supports PTPv2, and in Dante's case it's not even that strict.

    Go grab the Ravenna docs; they're pretty close to the spec for AES67, with added details on how to communicate metadata. You will find that it, SMPTE 2110 and the like are all built on top of existing standards (RTP, PTP, amongst others); even AVB, which has much stricter latency requirements, is the same. These aren't complex proprietary standards; they are standards which just specify restrictions on, and interactions between, other standards.

    What I'm getting at is that Klark Teknik and Behringer refusing to use these standards as their interconnect makes them the industry outlier; the only other example in this discussion which still has relevance is Allen & Heath, and they now do actually support Dante stage boxes on their models.

    Twinlan and the other examples were never the only options; DiGiCo and Soundcraft support MADI by default, and Yamaha effectively spurred Dante into existence in the live industry. Their proprietary protocols are there to solve problems that cannot be solved with the standard interconnect, usually latency or channel count or both.

  16. I don’t know how they do it but in general the big thing you need to think about is how to handle reactivity.

    There is no reason you cannot implement MVVM/MVC or whatever else in JS for reactivity; it's just that React et al. abstract that away for you.

    You are effectively choosing some method of implementing and abstracting the observer pattern, and handling cleanup, since you can easily get into a situation where you have dangling references to objects that are never cleaned up (memory leaks), or pretty nasty GC hangs which can make the UI feel bad to use.
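    The comment is about JS, but the underlying pattern is language-agnostic; here is a minimal sketch of an observable value with explicit cleanup (C++ for illustration, all names made up):

    ```cpp
    #include <functional>
    #include <map>

    // A value that notifies subscribers when it changes. The unsubscribe
    // step is the cleanup the frameworks hide: forget it and the callback
    // (and whatever it captures) dangles forever, the classic leak.
    class Observable {
        int value_ = 0;
        int next_id_ = 0;
        std::map<int, std::function<void(int)>> subs_;
    public:
        int subscribe(std::function<void(int)> cb) {
            subs_[next_id_] = std::move(cb);
            return next_id_++;
        }
        void unsubscribe(int id) { subs_.erase(id); }
        void set(int v) {
            value_ = v;
            for (auto& kv : subs_) kv.second(value_);   // push the change out
        }
    };
    ```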

    2.4GHz Wi-Fi at 40MHz squats literally half of the usable channels for your speed improvement; very likely you now get 100Mbps. If you just disabled 2.4GHz and forced 5GHz you would get the exact same improvement and wouldn't be polluting half of the available frequencies.

    Add another idiot sitting on channel 8 or 9 and the other half of the band is also polluted; now even your mediocre IoT devices that cannot be on 5GHz are going to struggle for signal, and instead of the theoretical 70/70Mbps you could get off a well-placed 20MHz channel, you are lucky to get 30.

    Add another 4 people and you cannot make a FaceTime call without disabling Wi-Fi or forcing 5GHz.

  18. So what happens when the env value happens to actually have a # in it?

    So you then need to implement escaping, which can go from a very simple implementation to an actual lookahead parser.

    EDIT:

    Actually, I agree; this parser is already very overbuilt, so it should be able to handle comments. Generally an env parser is a few lines at best: you read a line and look for the first instance of the separator; there's generally no reason to build a full parser for that. env is an absurdly simple format if you don't want features like comments, as sketched below.

    Hell, even an INI-format parser is simple to implement in the same style.
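    A sketch of that few-lines parser (no comments, no escaping; the function name and the lack of error handling are illustrative):

    ```cpp
    #include <fstream>
    #include <map>
    #include <string>

    // Read a line, split on the first '=', done.
    std::map<std::string, std::string> parse_env(const std::string& path) {
        std::map<std::string, std::string> env;
        std::ifstream in(path);
        for (std::string line; std::getline(in, line); ) {
            auto eq = line.find('=');               // first separator wins
            if (eq == std::string::npos) continue;  // skip malformed/blank lines
            env[line.substr(0, eq)] = line.substr(eq + 1);
        }
        return env;
    }
    ```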

    I'm in the rest of the world. Trust me, I understand how entrenched WhatsApp is, but realistically, point to the viable alternative.

    And WhatsApp for "the rest of the world" is "free", about as free as Gmail and Facebook, but monetarily free. It's hard to argue a monopoly when there is no money changing hands, and for business you are free to contact your customers via email or SMS or whatever other form you would like. I can tell you there are benefits to using WhatsApp: our stats show much higher engagement, and we can actually get more information about message delivery than on other platforms. You pay a premium for that, and they gatekeep it because it has business benefit to do so.

    If WhatsApp campaigns didn't get higher engagement than email or SMS, which are cheaper, we wouldn't pay the premium for it; everyone who has WhatsApp can also receive SMS.

    Does that help clarify why I'm arguing WhatsApp isn't a monopoly? It's kind of ranty; I apologise for that.

    WhatsApp is far from a monopoly, and I wouldn't call their API developer-hostile; it's actually reasonable to work with. What you can do with it, though, is quite restricted to what a business would want to do with WhatsApp, and it is billed accordingly.
