Who are you, and how did you get in here? I'm a locksmith, and I'm a locksmith.
- It helps when you never question whether, as in his own essay describing other ‘bad writers’ weaving falsehoods, you’re the one lying to yourself.
- I’d argue that ‘a bunch of additional code’ to solve for memory safety is exactly what you’re doing in the ‘defining memory safety away’ example with Rust or Swift.
It’s just code you didn’t write and thus likely don’t understand as well.
This can potentially lead to performance and/or control flow issues that get incredibly difficult to debug.
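To make that concrete, here's a tiny Rust sketch (my own illustration, not something from the parent): the bounds check and the panic/unwind path behind an ordinary index expression are emitted by the compiler, i.e. code you rely on but never wrote and may not have read.

    // Illustrative only: the safety code here is generated, not hand-written.
    fn sum_first_n(values: &[u64], n: usize) -> u64 {
        let mut total = 0;
        for i in 0..n {
            // Each `values[i]` expands to roughly:
            //   if i >= values.len() { panic!("index out of bounds") }
            // That branch and the panic machinery are "code you didn't write".
            total += values[i];
        }
        total
    }

    fn main() {
        let data = vec![1u64, 2, 3, 4];
        println!("{}", sum_first_n(&data, 4)); // prints 10
        // sum_first_n(&data, 5) would panic at runtime rather than read out of
        // bounds -- safety enforced by generated code, with its own control flow.
    }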
- It could be a time-based clause: Nebula-exclusive for two weeks prior to any other upload, etc.
- IIRC Nebula has an exclusivity clause in their contract not allowing content creators to upload to other platforms, though I could be thinking of CuriosityStream.
- Didn’t know the people behind the MP3 format were into tooling for metalworking. I guess it makes sense; it involves a practical application of sound, and they are a research institution.
I wonder if the metal can hear the difference if it’s not the full 192 kHz.
- They shine through my windows at night and are truly horrific.
They’re down the entire alleyway behind my place, and a walk to the grocery store at 7pm during the winter makes your body and mind think it’s sunrise.
- Listen more than you speak.
- Please retain counsel.
- Adding together all the different standards/feature sets a chip supports and then aggregating the bandwidth into a single number is actually a very reasonable way to arrive at an approximation for total chip computational throughput.
Ultimately, unless the chip architecture is oversubscribed or overloaded (unsure what the right term is), the features are all meant to be used simultaneously and thus the bits being read/written have to come from somewhere.
That somewhere is a % of the total throughput of the chip.
Stated another way — people forget that there’s almost always a single piece of silicon backing the total bandwidth throughput of modern computing devices regardless of what ‘standard’ is being used.
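As a back-of-the-envelope sketch of that aggregation (every interface name and GB/s figure below is a made-up placeholder, not a number for any real chip):

    // Toy aggregation: sum per-interface peak bandwidths into one chip-level figure.
    fn main() {
        let interfaces: &[(&str, f64)] = &[
            ("memory controllers", 400.0), // hypothetical GB/s
            ("PCIe lanes", 64.0),
            ("chip-to-chip link", 100.0),
            ("media/display engines", 20.0),
        ];

        let total: f64 = interfaces.iter().map(|(_, gbps)| *gbps).sum();

        for (name, gbps) in interfaces {
            println!("{name:>24}: {gbps:>6.1} GB/s");
        }
        println!("{:>24}: {:>6.1} GB/s (aggregate peak, assuming every block is busy at once)", "total", total);
    }

The 'assuming every block is busy at once' caveat is the oversubscription point above: the aggregate is an upper bound on what the silicon can move, not something any single workload is guaranteed to see.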
- Yeah this is odd.
I've taken multiple 10-year-old T-shirts with holes through 10% of them into the Patagonia store, and they've let me walk out with new product off the rack.
- Wouldn’t this also imply a lack of Turing completeness, and thus not be good for general purpose computing?
- > Without CPUs, we can be freed from the tyranny of the halting problem.
Can someone please explain to me what this even means in this context?
Serious question.
- I think the author raises a good point about how much human time/energy/effort goes into creating content for systems that, at best, are closed loops when it comes to searchability/discovery and, at worst, are prone to disappear completely if one company loses funding or changes a policy.
Both of those seem antithetical to the free exchange of information that was the ethos of the early internet.
I also recognize the profound economic incentives that work against that free exchange of information, and I'm not offering a solution, so grain of salt and all that.
- I wonder if cosmo libc supports targeting versions.
- Made possible using Cosmopolitan Libc.
Justine writes some pretty cool software.
- For those unaware of the significance, Every Frame a Painting was a 29-episode video essay series released between 2014 and 2016 on YouTube.
The channel was widely regarded as one of the best technical analyses of the cinematic/film creative process, and had been dormant since 2016, though it still boasts over 2 million subscribers and growing.
They recently started releasing new content to the YouTube channel ahead of an impending short film release from the channel's creators.
- Interesting post to wake up to when I'm 7 days into a fast.
Glad to be getting back into extended fasting; sad that I, again, have the body type that makes it somewhat of a requirement health-wise.
- Yeah, all the research I've read says that fasting is actually preventative when it comes to cancer -- and possibly assists with starving out cancer cells during treatment of the disease.
- I haven't had that experience, but I usually do longer fasts.
My go-to for this would be electrolyte levels, which can vary heavily depending on water intake, diet, exercise, supplementation, etc.
Here are some journal articles I found based on a quick search.
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1364615/ https://pubmed.ncbi.nlm.nih.gov/8777833/ https://link.springer.com/article/10.1007/s002280050682
I'd be interested to hear if something in one of those pops out at you.
- Generally the answer is 'no' to that question.
Medical ethics tends to prohibit it, especially in a clinical setting. It's seen as doing harm to the patients, which physicians tend to frown on.
I've even heard of some papers on fasting -- retrospectives, which I think aren't technically studies but rather an aggregation and analysis of patient data collected from events that had already happened and that patients voluntarily undertook -- that were banned from journals because the results might have 'encouraged harmful behavior.' I don't have any sources on that, and it's a memory from roughly 6-8 years ago.
- It’s been my lived experience that Dr. Attia’s concerns about loss of muscle mass during extended fasting are extremely overblown, if not outright incorrect.
- I was unaware there exists a fully homomorphic encryption scheme with the right trade-offs between security and computational effort to make this economically viable for even small-to-moderate workloads.
I’ve always thought it was either far too time or far too space intensive to be practical.
Do you have sources on this, either from Apple or academic papers of the scheme they’re planning on using?
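For anyone unfamiliar with what 'homomorphic' means here, a toy Rust sketch of the core idea, using textbook RSA's multiplicative homomorphism with deliberately tiny, insecure parameters; this only shows computing on ciphertexts, and is emphatically not the scheme Apple (or anyone) would actually deploy:

    // Toy demo: E(a) * E(b) mod n decrypts to a * b mod n (textbook RSA property).
    fn mod_pow(mut base: u64, mut exp: u64, modulus: u64) -> u64 {
        let mut result = 1;
        base %= modulus;
        while exp > 0 {
            if exp & 1 == 1 {
                result = result * base % modulus;
            }
            base = base * base % modulus;
            exp >>= 1;
        }
        result
    }

    fn main() {
        // Classic textbook toy parameters: p = 61, q = 53, n = 3233, e = 17, d = 2753.
        let (n, e, d) = (3233u64, 17u64, 2753u64);

        let (a, b) = (7u64, 9u64);
        let enc_a = mod_pow(a, e, n);
        let enc_b = mod_pow(b, e, n);

        // Multiply the ciphertexts, then decrypt: we recover a * b without ever
        // decrypting the individual inputs.
        let product = mod_pow(enc_a * enc_b % n, d, n);
        println!("decrypted product = {product} (expected {})", a * b % n); // 63
    }

Fully homomorphic schemes extend this to arbitrary combinations of additions and multiplications on ciphertexts, which is exactly where the space/time overhead I'm asking about tends to show up.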
- > 1. How much commentary should I write? I try to write not too many notes because I write code, and some say code should be self-documenting. So it's the same old question of how many comments should there be along the code.
I think the best I've heard this put was by a friend, who said something akin to this:
Programming is the art of solving problems by codifying complexity. Generally, the self-documenting part is the 'defining the problem' portion of the code. But in every problem there's a certain amount of irreducible complexity, or it wouldn't be a problem.
There's going to be some part of the code where you wish you could make it simpler, or you wish you understood it better, or you wish you could break it down into smaller components, but there's no 'good way' to do it in the system you're working in. Or the way you have it working is 'good enough' and it's not worth the investment (from a business-needs angle) to make it any better.
This is the portion of the code you should comment, and document, and do so liberally and in detail.
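A small made-up Rust example of that split (the function and the bit trick are mine, purely for illustration): the signature and overall shape are the self-documenting 'defining the problem' part, and the one genuinely non-obvious step gets the detailed comment.

    /// Rounds `n` up to the next power of two (std has this built in; hand-rolled
    /// here only to illustrate where comments earn their keep).
    fn next_power_of_two(n: u32) -> u32 {
        if n <= 1 {
            return 1;
        }
        // The irreducible bit: smear the highest set bit of (n - 1) into every lower
        // position, then add 1. e.g. n = 37 -> 36 = 0b100100 -> 0b111111 -> 64.
        // Starting from n - 1 keeps exact powers of two (e.g. 64) mapping to themselves.
        // (A production version would also guard against overflow for n > 2^31.)
        let mut x = n - 1;
        x |= x >> 1;
        x |= x >> 2;
        x |= x >> 4;
        x |= x >> 8;
        x |= x >> 16;
        x + 1
    }

    fn main() {
        assert_eq!(next_power_of_two(37), 64);
        assert_eq!(next_power_of_two(64), 64);
        assert_eq!(next_power_of_two(1), 1);
        println!("ok");
    }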
- This is the caliber of 'news' you get when you re-work and publish a company's press release nearly verbatim without concern for pesky things like asking questions, cross-referencing, or checking sources.
From the (air quotes) article...
> The chip's architecture integrates a high-speed mesh network fabric that provides substantial bandwidth and minimal latency communication among cores, important for applications that rely on synchronized operations across multiple threads. This efficient network integration manages interactions within the chip's core array and memory systems, ensuring optimal performance without the common bottlenecks.
If anyone can tell me what that means, as it relates to the reality we live in, I'm all ears.
Also from the 'article' as it were...
>InspireSemi also stresses Thunderbird I’s energy efficiency, a carryover from its initial design for energy-sensitive blockchain computing applications.
It's an AI Pump and Dump. It all makes sense now.
- > The Cascadia subduction zone (CSZ) has hosted giant earthquakes of moment magnitude >8.5 in the past and poses a major geohazard to populations of the Pacific Northwest.
I read that part just as the train announced ‘Now entering, Beacon Hill’ and it was a touch scary thinking about what would happen in that situation.