- > For years, despite functional evidence and scientific hints accumulating, certain AI researchers continued to claim LLMs were stochastic parrots: probabilistic machines that would: 1. NOT have any representation about the meaning of the prompt. 2. NOT have any representation about what they were going to say. In 2025 finally almost everybody stopped saying so.
It's interesting that Terence Tao just released his own blog post stating that they're best viewed as stochastic generators. True, he's not an AI researcher, but it does sound like he's using AI frequently and with some success.
"viewing the current generation of such tools primarily as a stochastic generator of sometimes clever - and often useful - thoughts and outputs may be a more productive perspective when trying to use them to solve difficult problems" [0].
- This is what seemingly every app does. They add 15 different categories for notifications / emails / whatever, and then make you turn off each one individually. Then they periodically remove categories and add new ones, enabled by default. Completely abusive behavior.
- I have to say, it was fun while it lasted! Couldn't really have asked for a more rewarding hobby and career.
Prompting an AI just doesn't have the same feeling, unfortunately.
- So the brain is a mathematical artifact that operates independently of time? It just happens to be implemented using physics? Somehow I doubt it.
- > And they can't even write a single proper function with modern c++ templating stuff for example.
That's just not true. ChatGPT 4 could explain template concepts lucidly but would always bungle the implementation. Recent models are generally very strong at generating templated code, even if it's fairly complex.
If you really get out into the weeds with things like ADL edge cases or static initialization issues, they'll still go off the rails and start suggesting nonsense though.
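For anyone wondering what "modern c++ templating stuff" means concretely, here is a minimal illustrative sketch (my own example, not from the comments above) of a C++20 concept-constrained function template, roughly the kind of code being discussed:

```cpp
#include <concepts>
#include <iostream>

// Illustrative concept: types that can be added and yield something convertible back to T.
template <typename T>
concept Addable = requires(T a, T b) {
    { a + b } -> std::convertible_to<T>;
};

// A constrained function template; only Addable types are accepted.
template <Addable T>
T sum(T a, T b) {
    return a + b;
}

int main() {
    std::cout << sum(2, 3) << '\n';      // works for int
    std::cout << sum(1.5, 2.5) << '\n';  // works for double
    // sum("a", "b");                    // would be rejected by the constraint
    return 0;
}
```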
- That's standard in the games industry as well. Plus many other conventions: no RTTI, no huge dependencies like Boost, no smart pointers, generally avoid ctors / dtors, etc.
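As a rough illustration of that style (a sketch of my own, with hypothetical names, not anything from the comment above): a plain struct whose lifetime is managed through explicit init/shutdown functions rather than constructors and destructors, using raw pointers instead of smart pointers:

```cpp
#include <cstdlib>
#include <cstring>

// Plain-old-data style: no ctor/dtor, lifetime is managed explicitly.
struct ParticleSystem {
    float* positions;   // raw pointer rather than std::unique_ptr
    int    capacity;
    int    count;
};

// Explicit init instead of a constructor; reports failure via return value, no exceptions.
bool particle_system_init(ParticleSystem* ps, int capacity) {
    ps->positions = static_cast<float*>(std::malloc(sizeof(float) * 3 * capacity));
    if (!ps->positions) return false;
    ps->capacity = capacity;
    ps->count = 0;
    return true;
}

// Explicit shutdown instead of a destructor.
void particle_system_shutdown(ParticleSystem* ps) {
    std::free(ps->positions);
    std::memset(ps, 0, sizeof(*ps));
}
```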
- I also enjoy my morning ritual of preparing the grinds and brewing a fresh pot. But I'll be honest, at the end of the day it doesn't really matter where I get it -- brunch at a nice restaurant, Starbucks, McDonalds, a cheap hotel buffet, lukewarm from a flight attendant ... as long as I get it. Sounds healthy, right?! ;)
- At the very least, modern cars are much heavier, and ultimately mass wins. For example, a 2005 Honda CRV weighs 3400 lbs while a 2025 weighs 3900 lbs.
Plus they have tons more auxiliary safety features like lane departure warning, forward collision warning, blind spot detection, better visibility, etc. And they are roomier, have more power, get better gas mileage, and have backup cameras and Apple CarPlay!
- A dozen eggs is up over 350%, but a 6-pack of Budweiser is only up 60ish% since 2000. So you know, it all balances out. Maybe drink a few extra cans of Bud with your next meal.
- I was really excited about the idea of a modern statically typed language with green threads à la Erlang / BEAM. I lost interest when Rust moved away from that direction and became focused on zero-cost abstractions instead.
- Maybe the messaging should be "eat healthier"? How many obese people cook for themselves and eat exclusively from the outer aisles of the grocery (fruits, vegetables, seafood, meat, eggs, dairy)?
I could be wrong, but I have to imagine the average obese person has a terrible diet. Portion control won't work at that point; you're already doomed to fail.
To be fair, most people have a terrible diet; it's just that some lucky individuals have the metabolism to overcome it. It seems like those people are increasingly the exception and a bad benchmark for how humans should eat.