- bcoates: I thought that, aside from being among the least visually appealing mass-produced cars in history, the Aztek was pretty well received -- basically an early version of the "the American lusts for some combination of a Gremlin and a Wagoneer" idea
- Is reverse DNS even a thing outside of IRC and forgetting to give command-line tools the "don't be slow" flag?
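(For the curious: the library-level version of that flag. A minimal Python sketch, with an example address from the TEST-NET range, showing how the numeric-only flag skips the reverse-DNS PTR lookup that makes tools feel slow.)

```python
import socket

# With NI_NUMERICHOST, getnameinfo() performs no PTR lookup at all --
# the library equivalent of passing a CLI tool its "-n" flag.
host, port = socket.getnameinfo(
    ("192.0.2.1", 443),
    socket.NI_NUMERICHOST | socket.NI_NUMERICSERV,
)
# host stays "192.0.2.1" instead of blocking on a (possibly slow) DNS query
```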
- The taste/texture of jello is just collagen (roughly, "meat stew flavor"), fruit juice, and (tons of) sugar. It’s just an extremely heightened version of natural flavors. There is nothing new under the sun.
- I think it's an unavoidable consequence of the space constraints they're working with.
On the plus side, when I dayroomed there it was dead silent and the room had blackout curtains.
- Sony released an Angry Birds movie more than 5 years after peak interest; they're a trailing indicator
- Memory pressure (and a lot of other overload conditions) usually makes latency worse; does that show up in your system? Latency backpressure is a pretty conventional thing to do. You're going to want some way to close the loop back to your load balancer; if you're doing open-loop control (sending a "fair share" of traffic to each node and assuming it can handle it), issues like the ones you describe will keep coming up.
This is a Hard Problem and you might be trying to get away with an unrealistically small amount of overprovisioning.
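A minimal sketch of what closing that loop might look like (the target, weights, and AIMD-style policy are all illustrative assumptions, not anything from the thread): the balancer shrinks a node's traffic share multiplicatively when observed latency exceeds a target, and restores it additively while the node looks healthy.

```python
TARGET_MS = 50.0  # hypothetical per-node latency target


def update_weight(weight: float, observed_latency_ms: float) -> float:
    """AIMD-style weight update: back off hard, recover slowly."""
    if observed_latency_ms > TARGET_MS:
        return max(0.1, weight * 0.7)   # multiplicative decrease under pressure
    return min(1.0, weight + 0.05)      # additive increase when healthy


weight = 1.0
for latency_ms in [40, 45, 120, 200, 80, 42, 41]:  # made-up samples
    weight = update_weight(weight, latency_ms)
# weight ends well below 1.0: the node sheds load until latency recovers
```

The point is only that the controller reacts to measured latency rather than assuming each node can handle its share.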
- It doesn’t have to be and almost certainly isn't some billionaire. Formulaic spicy political nonsense is reliable engagement bait and it's easy to churn eyeballs into (small amounts of) money. It's not even unique, there are similar grinds about sports, religion, cute animals, subculture jokes, etc.
The "control the narrative" stuff is mostly a PR campaign by social media intelligence companies trying to make their services seem more valuable than they are.
- "Unlike New York, Chicago or L.A., which each have one, maybe two, San Francisco and the greater Bay Area have over two dozen"
Whaaa...? Los Angeles has a whole rat's nest of overlapping agencies (mostly different cities, plus like four kinds of train for some reason)
- A thing I've been noticing across the board is that current generative AI systems are horrible at composition. It’s most obvious in image generation models where the composition and blocking tend to be jarringly simple and on point (hyper-symmetry, all-middleground, or one of like three canned "artistic" compositions) no matter how you prompt them, but you see it in things like text output as well once you notice it.
I suspect this is either a training data issue or an issue with the people building these things not recognizing the problem, but it's weird how persistent and cross-model the issue is, even in model releases that specifically call out better/more steerable composition behavior.
- That phrase template isn’t just overdone—it's something some text models are obsessed with. The em-dashes, the contrastive language—these are signs of LLMs being asked to summarize or expand a compelling blog post.
- Pumped-out water has to go somewhere. With the air gap, it will either back up out of your garbage disposal or pour out of the air gap into the sink basin, depending on the location of the blockage.
The air gap makes it physically impossible for the pump to backfeed the drinking water supply with dishwasher waste.
- I think he said sharing a circuit with a fridge, which are generally 110 V in the US -- I think this is how my apartment is wired (split-phase 30 A dedicated to the oven, one 20 A circuit for the whole rest of the kitchen)
Trying to run a resistive heater on the same circuit as a fridge compressor without tripping the breaker pushes you toward a very conservative wattage
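Back-of-envelope version of that constraint (the surge figure is a guess; the 80% factor is the NEC continuous-load derating):

```python
# Assumed shared 20 A / 120 V kitchen circuit, 80% continuous-load rule
circuit_watts = 120 * 20 * 0.8       # 1920 W usable on a continuous basis
fridge_surge_watts = 1000            # hypothetical compressor start-up surge
heater_headroom = circuit_watts - fridge_surge_watts  # ~920 W for the heater
```

Which is why "conservative wattage" here means well under half the nameplate capacity of the circuit.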
- The only complaint I have about sharpies is the crap ergonomics on the barrel, causing people to wrap tape around it just to get a nice grip, which inevitably gets super gross:
https://www.science.org/do/10.1126/science.aan7026/full/lanl...
- Yes, I was hoping to see some actual insight on this Domestic Manufacturing Miracle but it seems to just be "if you build it they will come"
This flies in the face of the experience of more than one person I know personally who tried to take stranded US-based manufacturing assets and turn them into something with a future. So far, no luck.
I still believe there is upside in this space over the next decade or so but I haven't met anyone who's won in a repeatable way yet.
- I don't get it -- AWS Deep Archive is $12/TB/yr and provides actual durability and connectivity, not just drive-in-a-shoebox. That seems pretty hard to beat by buying raw storage at retail.
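The arithmetic behind that comparison, with an assumed retail drive price (the $0.00099/GB/mo is the commonly quoted Deep Archive list price; the drive cost is a guess):

```python
deep_archive_per_tb_yr = 0.00099 * 1000 * 12  # ~= $11.88 per TB per year
retail_drive_per_tb = 15.0                    # assumed retail HDD price, $/TB
breakeven_years = retail_drive_per_tb / deep_archive_per_tb_yr
# The raw drive only pulls ahead after ~1.3 years -- before counting
# power, redundancy, and replacement.
```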
- It's been closer to 100 years since we figured out information theory and discredited this idea (that continuous/analog processes have more, or different, information in them than discrete/digital ones)
- Also the persuasion paper he links isn't at all about what he's talking about.
That paper is about using persuasion prompts to overcome trained-in "safety" refusals, not to improve prompt conformance.
- I think they're understating the thread-safety risks here. The import is going to wind up happening at some nondeterministic time, in who-knows-what thread holding who-knows-what locks (aside from the importer lock).
Previously, if you had some thread-hazardous code at module import time, it was highly likely to run only during the single-threaded process-startup phase, so it was likely harmless. Lazy loading is going to unearth these errors in the most inconvenient way (as Heisenbugs)
(Function-level import can trigger this as well, but the top of a function is at least a slightly more deterministic place for imports to happen, and there's an explicit line of syntax triggering the import, and thus the bug)
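A self-contained demonstration of the hazard (the module name and contents are made up): a toy module records which thread ran its top-level code, and with a function-level or lazy import that turns out to be a worker thread, not the main thread during startup.

```python
import pathlib
import sys
import tempfile
import textwrap
import threading

# A toy module whose top-level code has a side effect: it records
# which thread imported it.
src = textwrap.dedent("""
    import threading
    IMPORTED_BY = threading.current_thread().name
""")
d = tempfile.mkdtemp()
pathlib.Path(d, "sideeffecty.py").write_text(src)
sys.path.insert(0, d)

seen = []


def handler():
    # Function-level (or lazy) import: the module's top-level code runs
    # here, on whatever thread first touches it, holding whatever locks
    # that thread holds -- not during single-threaded startup.
    import sideeffecty
    seen.append(sideeffecty.IMPORTED_BY)


t = threading.Thread(target=handler, name="worker-1")
t.start()
t.join()
# seen == ["worker-1"]: the import side effect ran off the main thread
```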
- I don't think there is any solution for that but "fix your broken linter".