I have yet to find a way to retract a misclick on the "flag" button, much to my annoyance.
- I think one obvious approach would be to assign "roles" - one person is "operations" and is the only one allowed to read the instructions to you, one person is "research" and has access to the list of ingredients, etc. But that probably bottlenecks things too hard and you have to figure out a fast way to assign roles. You could just increase the difficulty by requiring more precise instructions? Ah, you split the list of instructions into four parts and put one list in each corner of the classroom, then randomly sort people into the corners - one corner has ingredients, one corner has operations, one corner has conditionals, one corner has goals, and the class has to communicate to build valid instructions. Maybe give the ingredients tongue-twister names and make them devise ways to communicate without getting things confused. And obviously the end of the demo is "so why didn't any of you just take a list of ingredients and walk over to the list of operations so you didn't have to shout?".
- Raspberry Pi definitely works! I have a project you can take a look at; you'll have to modify it slightly since you want a keyboard rather than a joystick, but they're both HID so the majority of it should work pretty much out of the box: https://github.com/saulrh/composite-joystick.
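For a sense of what the keyboard side looks like: once a Pi is configured as a USB keyboard gadget, you send standard 8-byte boot-keyboard reports to the gadget device node. A minimal sketch, assuming the gadget is already set up and exposes `/dev/hidg0` (the device path and the configfs setup are not shown here):

```python
# Sketch of building boot-keyboard HID reports for a Linux USB gadget.
# Assumes the Pi is already configured as a keyboard gadget at /dev/hidg0.

KEY_A = 0x04  # HID usage ID for the 'a' key

def key_report(keycode: int, modifiers: int = 0) -> bytes:
    """Standard 8-byte boot-keyboard input report:
    [modifiers, reserved, key1..key6]."""
    return bytes([modifiers, 0x00, keycode, 0, 0, 0, 0, 0])

press = key_report(KEY_A)
release = key_report(0)  # all-zero keycodes = all keys up

# On the actual device you'd write these to the gadget node:
# with open("/dev/hidg0", "wb") as hid:
#     hid.write(press)
#     hid.write(release)
```

The joystick project linked above uses a different report descriptor, but the write-reports-to-`/dev/hidgN` plumbing is the part that carries over.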
- I know someone who uses a Twiddler full-time, and I used mine for about a month when I broke my dominant hand about a decade ago. Works very well if your hand is the right size for it.
I have a tap strap, but I use it mostly as a remote control for my TV, not as a primary input device. It probably works, but I'm not good enough with it to have the kind of error rate I'd really like.
Android has a Morse input method which would be entirely suitable for one-handed text input, and there are certainly solutions for using an Android phone as a keyboard, but I don't know how it'd handle things like arrow keys.
- I had those speakers for a few years before someone else noticed it, lol. The other tweeter worked just fine, and the speakers as a whole were so good that even without the dedicated hardware for higher frequencies it was still better in those ranges than what I'd been using.
I don't know how likely it'd be for something like that to turn out unsalvageable. I think that essentially everything at that level uses wooden enclosures, so it'd come down to whether the driver is set into the wooden enclosure with screws or adhesive, and I don't know enough about the industry to know what the ratio is on that. Probably mostly screws. Then getting a compatible driver is all but guaranteed; at worst you have to replace both sides to keep them balanced.
- Works even if you're not in a college town. I once pulled a $4000 set of speakers out of my building's trash room - Boston Acoustics floor speakers, Polk Audio subwoofer - and I was just in a random apartment building in the Bay Area. Turned out the tweeter on one of the speakers needed replacing, but that was like a $40 part on eBay and ten minutes of work with a screwdriver, didn't even need a soldering iron. You can get some crazy stuff if you're in the right building. Really sucks seeing it go to waste when it isn't something you can take, I always have to fight myself to leave some things behind.
- I'll point out some higher order impacts of this, since the article doesn't: Losing these forecasts will be catastrophic for American farmers. Crops literally live and die on weather forecasts; forecasts tell farmers what to plant, when to plant it, how to plant it, how to water it, and when to harvest it. Without these forecasts we will see negative effects on the entire American agricultural industry and all of the people it feeds, US citizens or not. I'm not a farmer myself, so I can't tell you exactly how severe the impact will be, but there will be an impact, whether it's an immediate crop failure and outright famine this year or "only" shockwaves bouncing through our entire economy as farmers plant the wrong crops and go out of business over the next few years. This is one of the scenarios I've been most worried about when it comes to the stability of human civilization, up there alongside the looming specter of nuclear war and the randomisation of US foreign policy.
- Tramp is perfectly able to write, it's just that it does it by writing a temp file locally and then using ssh to transfer the file to the remote, rather than installing a copy of itself on the remote and acting through that. It only uses executables that it finds on the remote. So if make and gcc and sed and such are available it's basically transparent, indistinguishable from local editing except for network round trips, and the only changes it leaves behind are the files you edit.
- The alternative is suggested by tramp, which from what I know treats the remote as a network filesystem instead of an execution host. I don't believe that tramp deploys any binaries: it reads and writes bytes over pipes and all meaningful execution happens locally. Notably, it does not achieve persistence, because there's a difference between "VSCode plugins have access while you're SSH'd in" and "VSCode plugins have access forever".
- I guess I was thinking in terms of the patches you push up to github. `jj` is a joy to use and it absolutely enables me to implement workflows that I wouldn't even vaguely consider without it helping me; the big one I think of is the one where you work in a merged dir with like six parents and use `jj absorb` to instantly spread your changes out to the different PRs. I've been forced to do that in git. It was a nightmare and took me two days. Not impossible! Just utterly impractical. `jj` takes end results that were theoretically-possible-but-practically-infeasible and makes them usable. Which I suppose counts as a new capability from the UX perspective. :P
- The thing about jj is that it doesn't actually enable any new capabilities. I tell people to use emacs or vim or vscode or whatever instead of notepad because it gives them new capabilities, things that are simply unavailable unless you're talking to an LSP or running a full-powered scripting engine. jj doesn't make anything possible the way going from notepad to a real editor does. What jj does do is it makes everything so much easier that you can now actually use all of those features that git theoretically gave you. Rebases? No more fiddling, no more wedges, no more digging through the reflog because you fat-fingered something, you just go `jj rebase` and... that's it. And if you get it wrong, you just do `jj undo` and... that's it. And if you just had a six hour manic coding marathon without committing anything and now you want to spread its batch of changes back down through your patch stack, you just do `jj absorb` and... that's it. It's not the difference between notepad and emacs where you're going from no LSP to LSP, it's the difference between emacs@2016 where LSP support is a week-long adventure and emacs@2024 where LSP support is five lines copy-pasted out of the info page.
- Laundry is kind of the perfect demo for advanced motion planning systems. Fabric is, for all intents and purposes, completely intractable in classic motion planning paradigms; it's wildly non-rigid, which means that predicting its behavior is the domain of highly specialized and expensive dynamics simulators, it's nearly impossible to invert the problem to ask what motions would be required to produce a given result, and it's highly continuous and resistant to discretization even if you can predict it. You can't make the "folds have zero width" assumption you always see when reasoning about origami, for example. Clothing is extreme even for fabric, given that it's not only highly non-uniform but also fragile; every shirt is a different hideous bit of floppy topology covered in strange textures with complex and unpredictable local properties and it'll start popping stitches if you look at it funny. Ruffles, zippers, pockets, drawstrings, the list goes on. On top of that, laundry is something that everyone does so it's relatable and easy to set up in a lab, and humans can intuitively evaluate performance with a glance. Despite all the attention, nobody's been able to demonstrate convincing performance on it in like seventy years of work, which makes it a more difficult task than backflips or shooting hoops or loading a truck. All of that together means that, when you have a fancy new algorithm that can handle more than some blocks on a tabletop, you pretty much always point it at the laundry.
- Still not "tens of millions", don't motte-and-bailey me.
Also, I thought competition was good and that we needed more of it. That's the usual fiscal-conservative line, right?
I'll further note that there are more job postings open right now than there have been at any time since 2000, that unemployment right now is incredibly low considering the pandemic and 2008, that the unemployment that still exists can be fairly easily traced to the previous Trump presidency rather than any other cause, and that multiple detailed studies (refer to the previous Wikipedia link) fail to find that illegal immigrants have any effect at all on the jobs or pay of American workers. Having more workers in total increases spending, which opens up more jobs - standard Jevons-paradox-style stuff. Your conclusions are not supported by any kind of evidence, your models do not describe or provide accurate predictions of reality, and your proposals will not work the way you think or claim they will.
- Okay. Sure. Mexico only has 120m people. You think that a third of their population walked into Texas and bought a house in Dallas? A quarter? Hell, ten percent?
Fine. I'll bring some of my own statistics. There might be ten million undocumented immigrants living in the United States total. There are fewer than half a million illegal border crossings a year; if the expected lifespan following an illegal border crossing is, I don't know, forty years, then it's obvious that the overwhelming majority of illegal border crossings don't convert to undocumented immigrants. These numbers are easily available on the relevant Wikipedia page: https://en.wikipedia.org/wiki/Illegal_immigration_to_the_Uni..., which itself has extensive citations from a wide variety of sources. Saying that there are "tens of millions crossing the border" is clearly and blatantly incorrect.
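The back-of-the-envelope math here can be made explicit. All inputs are the rough figures from the comment above (the 40-year residence figure is an assumption, and crossings include repeat attempts by the same people, so the true conversion rate is lower still):

```python
# Back-of-the-envelope check of the figures in the comment above.
# Inputs are the commenter's rough numbers, not authoritative data.

crossings_per_year = 500_000        # "fewer than half a million" crossings/year
residence_years = 40                # assumed stay following a crossing
observed_population = 10_000_000    # "might be ten million" undocumented total

# If every crossing added one long-term resident, the steady-state
# population would be crossings/year * years of residence:
steady_state = crossings_per_year * residence_years  # 20,000,000

# The observed population caps the fraction of crossings that can be
# converting into long-term undocumented residents:
conversion_upper_bound = observed_population / steady_state  # at most 0.5
```

So even under these generous assumptions, at most about half of crossings can correspond to a long-term resident, which is nowhere near "tens of millions crossing the border" and staying.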
And, of course, that's not even getting into the real meat of the issue, that's just sarcastically calling out the surface-level lies. No, what I really want to say about illegal immigration is that undocumented immigrants commit fewer crimes than either documented immigrants or outright citizens, that they pay more taxes than they cost in government spending, that they do not affect job access or pay of legal residents, that they prevent offshoring, and that they contribute to GDP via spending and labor. Undocumented immigrants are, as far as I can tell, purely positive contributors to America at every level I look at, for the people working alongside them and going to school with them all the way up to the grandest statistics. If we truly wanted a healthy economy - if we wanted more citizens to have better jobs, if we wanted more money for education and healthcare, if we wanted less crime and less exploitation of labor - we would legalize all of them and invite more in after them.
- It did host a successful and substantially-satisfying human civilization, at least until it let a couple of presumptuous self-important anarchoprimitivists kill it and genocide its subjects. Even if it was only a temporary and unstable illusion of alignment, that's one more values-satisfying civilization than the overwhelming majority of paperclippers manage. So yeah. Good? No. Least evil? Maybe.
- > Prime Intellect
Ah, yes, Prime Intellect, the AGI that went foom and genocided the universe because it was commanded to preserve human civilization without regard for human values. A strong contender for the least evil hostile superintelligence in fiction. What a wonderful thing to name your AI startup after. What's next, creating the Torment Nexus?
(my position on the book as a whole is more complex, but... really? Really?)
- I mean, if there's no mechanism for deploying firmware updates - they took off the JTAG headers, burned some fuses, or blew a one-time-programmable ROM - there may be no way to deploy updates short of dismantling the car and swapping out a couple of major boards. That might well count as "can't".
- It's probably way more complicated than that. Ever seen the cold-start disaster recovery procedure for a big system with identity, encryption-at-rest, and message buses involved? You might be lucky if the bring-up doesn't have any individual stages that take a week to quiesce all by themselves. I know that this system probably isn't all that big, but if I assume their server-side software is as low-quality as their embedded software, I can easily imagine it being that complex, interdependent, and poorly documented.
- If you use the disc lock the storage facility sells, you'll likely pay an additional markup on it, but it's also guaranteed to be acceptable to their partner insurance company. I'm surprised - I'd have expected the facility's locks to be guaranteed to be unacceptable, so as to minimize the insurance company's payouts. Insurance agencies already do worse on a daily basis; this level of consumer-hostile bullshit would barely even register.
- I totally forgot one! A really important one!
* Pound the Table (https://forums.sufficientvelocity.com/threads/pound-the-tabl...): X-men courtroom drama, written by a practicing attorney who brings the reader along with their own clear love for their work, using it as a lens through which to examine the real-life social issues that the X-Men franchise has always concerned itself with. It's really good. Seriously. I can't believe I forgot this one.
- Every time I hear about the review processes for browser extensions I'm shocked that it involves humans having to read your README and manually plumb together the build process. Sometimes I hear that reviewers are even reusing VMs when doing reviews, or even not using VMs at all. I'd have expected the review form to have a textbox where you paste your git link and a well-documented automated pipeline that stands up a specified VM with a specified amount of RAM and disk, clones the repo, descends into it, and executes `docker build -f docker/review/Dockerfile .`. I'm surprised that the reviewers themselves haven't outright demanded such tooling from their larger organization, just as a matter of job satisfaction - I can't imagine all the abuse they get from angry app owners.
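A sketch of the kind of pipeline imagined here, reduced to its essence: given a submitted git URL, produce the exact commands a fresh review VM would run. The repo layout (`docker/review/Dockerfile`) and image tag are hypothetical, and this just builds the command list rather than executing anything:

```python
# Hypothetical sketch of an automated extension-review pipeline: given a
# submitted git URL, emit the commands a fresh review VM would execute.
# The Dockerfile path and tag name are illustrative, not any store's API.

def review_commands(git_url: str, workdir: str = "review") -> list[list[str]]:
    """Build the command sequence for one review run."""
    return [
        # shallow clone is enough; reviewers only need the submitted tree
        ["git", "clone", "--depth", "1", git_url, workdir],
        # -f names the Dockerfile; the build context is the repo root
        ["docker", "build",
         "-f", f"{workdir}/docker/review/Dockerfile",
         "-t", "extension-review", workdir],
    ]

cmds = review_commands("https://example.com/extension.git")
```

In a real pipeline each command list would go to something like `subprocess.run(cmd, check=True)` inside a throwaway VM, which is exactly the reproducibility the comment is asking for.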