- stefan_: 30% of code written by AI, but 100% of tools must be enshittified with the terrible, lagging Microsoft Copilot, even if it means you blow up the goodwill for VS Code in a matter of months.
- I don't understand. Robust markets don't have large margins. Why would a regulator even want markets with enormous margins? That's usually a sign of market failure.
- This is a great new take on "you can land on the moon with 10 MHz".
- The peak of irony, because how these people arrived at their 40 Mbit H.264 bitrate in the first place, and at their ineffective tinkering with it, is guaranteed to be some LLM's expert suggestions. As is often the case, because they had no understanding of the genuinely complex subject matter whatsoever, they were unable to guide the LLM and ended up with .. slop. Which then turned into a slop blog post.
God knows what process led them to do video streaming to show their AI agent working in the first place. Some fool must have put "I want to see video of the agent working" in.. and, well, the LLM obliged!
- I don't know how to comment on this, considering you don't seem to know that the majority of research staff salaries in highly successful labs are paid entirely through grant money.
- The ratio has just been going up and up and up, and to suggest it pays for "equipment, supplies and data resources" is a bad joke, considering the people doing the work end up saddled with yet more administrative bloat in the form of hostile, complicated processes just to access the funds to buy the very equipment and supplies that enable the research.
- If universities fund it themselves they might forgo some of the usual 30% administrative grift, and we get some 40 projects out of it!
- It's ARIN; this is essentially their only job.
- Well, it's sure gonna get the filesize down though, great HEVC -> AV1 transcoding success..
- I'm curious how these fellas took something like IP block allocation and turned it into an Excel-based workflow.
- DRM aside, Spotify clearly should have logic that throttles your account based on request volume (there are only so many minutes in a day..), making it entirely impractical to download the entire catalog unless you have millions of accounts.
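The kind of per-account throttle described above could be sketched roughly like this; the daily cap, class name, and bookkeeping are all illustrative assumptions, not anything Spotify actually does:

```python
import time

# Hypothetical per-account daily streaming cap: once an account has requested
# more than `cap` minutes of audio within a rolling 24-hour window, further
# stream requests are refused.
WINDOW_SECONDS = 24 * 60 * 60

class StreamThrottle:
    def __init__(self, cap_minutes=24 * 60):  # nobody can listen to >24h/day
        self.cap = cap_minutes
        self.log = {}  # account_id -> list of (timestamp, minutes_requested)

    def allow(self, account_id, track_minutes, now=None):
        now = time.time() if now is None else now
        # Drop entries older than the rolling window.
        entries = [(t, m) for (t, m) in self.log.get(account_id, [])
                   if now - t < WINDOW_SECONDS]
        used = sum(m for _, m in entries)
        if used + track_minutes > self.cap:
            self.log[account_id] = entries
            return False  # over the daily cap: refuse (or slow-walk) the stream
        entries.append((now, track_minutes))
        self.log[account_id] = entries
        return True
```

With a cap on the order of a day's worth of listening, ripping the whole catalog from a single account becomes a matter of account-years rather than days, which is the point of the comment above.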
- Three levels down and people have entirely forgotten what my post was. My "server" is some anemic ARM core built into real physical hardware with 64M of read-only storage. I don't want it spending its time "hydrating" some DOM, I don't want to bring any of this frontend insanity on there at all. No code hosted on npm shall ever run on that processor or I can't go to sleep in peace.
So how do we still get a fancy SPA website? Build it all down to a simple zip bundle, the ARM can serve those static files just fine. The SPA talks to the ARM via a few JSON APIs. Very nice clean boundary.
- HTML is the last thing I would ever want to generate on my embedded device; it's a terribly verbose, string-based mess, invariably coupled with stylistic choices. Which is why my servers don't generate any of it: they serve static files, and any interactive information in something that looks a lot more like an interface definition.
- It's terrible, why would I want my endpoints to return random HTML fragments? I realize that's how you did it in the jQuery times, but I was never in those - at that time we simply had template engines in the backend, so this HTML slop wouldn't contaminate everything..
Most of the frontend stuff I do is for internal pages on embedded devices, and I'm very happy with a structure where I have the frontend being a full React fancy component lib SPA that is eventually just compiled down to a zip bundle of files that the backend needs to know nothing about and can serve as dumb files. The backend is a JSON API of sorts that I would need to build anyway for other use cases.
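The split described above can be sketched in a few lines; this is not the poster's actual stack, just a minimal illustration (the `dist/` path, the `/api/` prefix, and the `uptime_s` field are all made up) of a backend that serves the compiled SPA bundle as dumb static files next to a small JSON API:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from pathlib import Path

# The React SPA is assumed to be compiled down to a directory of static
# files ("dist/"); the backend knows nothing about its contents.
DIST = Path("dist")

class DeviceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path.startswith("/api/"):
            # The JSON API boundary: the only dynamic part of the backend.
            body = json.dumps({"status": "ok", "uptime_s": 1234}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
            return
        # Everything else is a dumb static file from the compiled bundle.
        rel = "index.html" if self.path == "/" else self.path.lstrip("/")
        f = DIST / rel
        if f.is_file():
            self.send_response(200)
            self.end_headers()
            self.wfile.write(f.read_bytes())
        else:
            self.send_response(404)
            self.end_headers()

# To run: HTTPServer(("", 8080), DeviceHandler).serve_forever()
```

On an actual anemic ARM core you would use the device's existing HTTP server rather than Python, but the boundary is the same: static bundle on one side, JSON on the other.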
- It is made illegal. As the post notes, you need to (1) give notice and (2) data collected needs to be made available in a user access request and (3) deleted irrevocably on request. You must have a legitimate reason to process and store this data (scattershot forwarding to everyone is a prima facie violation). Unless you comply with all of these, you are in violation.
- I'm the first to say they should have been shut down the day the original deadline ran out, and if new leadership comes to the WH they should aggressively prosecute all the platforms that broke the law under promises of the corrupt DOJ (Google, Apple et al). But that's between your joke of a constitution and political leadership, it hardly sways the case one way or another.
- I don't get it, did you miss that this went all the way to the Supreme Court already? It's not "anti free speech", it's "anti Chinese platform".
- I'm sure there's a difference in the binary; for a real comparison you would need to compile the same coreutils version with the same options.
I just think the assertion that "compute-heavy" tools like sha256sum would be especially affected by Fil-C is not true, and if it were true, given the "baseline slowdown" of 4x, surely it would show up even in this sloppy test.
- I tried md5sum and sha256sum and there was essentially zero difference in runtime (the Fil-C version of sha256sum was consistently faster, in fact..)
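For a fair comparison along the lines suggested above (same coreutils version, same build options), a median-of-N wall-clock harness is enough; the binary paths in the comment are placeholders, not real install locations:

```python
import statistics
import subprocess
import time

def bench(cmd, runs=5):
    """Run cmd several times and return the median wall-clock seconds."""
    times = []
    for _ in range(runs):
        t0 = time.perf_counter()
        subprocess.run(cmd, check=True, stdout=subprocess.DEVNULL)
        times.append(time.perf_counter() - t0)
    return statistics.median(times)

# Hypothetical usage: same coreutils build, with and without Fil-C, on the
# same large input file, e.g.:
#   bench(["/opt/filc/bin/sha256sum", "big.bin"])
#   bench(["/usr/bin/sha256sum", "big.bin"])
```

Median rather than mean keeps a single cold-cache run from skewing the result; on a large enough input the hash computation should dominate process startup.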
- There is no "arbitrary code execution and all kinds of nasty" in the Fil-C version, and it benefits from the decades spent fixing all the logic bugs, races, and environment variable messes in coreutils.
Meanwhile, the Rust version, of course, is vulnerable to all of those: https://ubuntu.com/security/notices/USN-7867-1