stcredzero_at_sign_g_mail_dot_com
- Someone made a pi version of the OQO!
- How was he caught?
- If their art dies out, maybe nobody will know how bad all the pianos are. And then we'll all have slightly worse pianos than we would otherwise have. And I mean if that's the way things are going to go, then let's just steer the Earth into the Sun, because what's the point of any of this.
I think a similar thing happened to journalism ethics over the course of the 20th century and into the first quarter of the 21st.
The XKCD counterpoint: https://xkcd.com/915/
(I think this shows how arrogant Randall Munroe can be sometimes. He does a lot of great stuff, but when he's wrong, he's egregiously so!)
- A sphere containing all of Earth's liquid water would be close in size to the asteroid Ceres.
https://lightsinthedark.com/wp-content/uploads/2013/06/ceres...
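A quick back-of-the-envelope check. This sketch assumes the commonly cited USGS estimate of roughly 1.386 billion cubic kilometers for all of Earth's water; the exact figure is an assumption here, not something from the linked image:

```python
import math

# Assumed figure: USGS-style estimate of all of Earth's water
# (oceans, ice caps, groundwater, etc.), in cubic kilometers.
WATER_VOLUME_KM3 = 1.386e9

# Radius of a sphere holding that volume: V = (4/3) * pi * r^3
r_km = (3 * WATER_VOLUME_KM3 / (4 * math.pi)) ** (1 / 3)
print(f"diameter of the water sphere: {2 * r_km:.0f} km")  # ~1383 km

CERES_MEAN_DIAMETER_KM = 940  # Ceres is about 940 km across
```

So the water sphere comes out around 1,400 km across, versus roughly 940 km for Ceres: the same order of magnitude, which is presumably why the comparison gets made.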
- Instead of just downvoting me, how about doing some actual research:
Proof that Tesla demand is up in July 2024: https://www.youtube.com/watch?v=dHiAIZsXT1Q
- > but are not executable.
Fixed it for you.
Dude, if you're saying, as a blanket statement, that the flow in the diagram is not executable, then are you saying that all the programming projects you've been on were either monolithic systems or failures?
- I'm very disappointed that (Number four will shock you) wasn't some kind of break statement or event handling.
- Bret Victor might argue visualizing a program is still "drawing dead fish".
The power of visual programming is diminished if the programmer aims to produce source code as the final medium and only uses visualization as a layer on top of the language.
I disagree. We frequently break up large systems into chunks like modules, or micro-services, or subsystems. Often, these chunks' relationships are described using diagrams, like flowcharts or state transition diagrams, etc.
Furthermore, quite often there are zero direct code references between these chunks. Effectively, we are already organizing large systems in exactly the fashion the op is proposing. Inside each chunk, we just have code. But at a higher level viewpoint, we often have the abstraction described by a diagram. (Which is often maintained manually, separate from the repo.)
What exactly are the disadvantages here?
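The arrangement described above — chunks with zero direct code references between them, related only at the diagram level — can be sketched minimally. All the names here (the bus, the topic, the two chunks) are made up for illustration:

```python
# Hypothetical sketch: two "chunks" with no direct references to each other,
# wired together by a small event bus that plays the role the diagram describes.
class EventBus:
    def __init__(self):
        self.handlers = {}

    def subscribe(self, topic, fn):
        self.handlers.setdefault(topic, []).append(fn)

    def publish(self, topic, payload):
        for fn in self.handlers.get(topic, []):
            fn(payload)

# Chunk A: knows only the bus and a topic name, never Chunk B.
def billing(bus, log):
    bus.subscribe("order.placed", lambda order: log.append(f"billed {order['id']}"))

# Chunk B: likewise knows only the bus, never Chunk A.
def checkout(bus):
    bus.publish("order.placed", {"id": 42})

log = []
bus = EventBus()
billing(bus, log)
checkout(bus)
print(log)  # ['billed 42']
```

The "diagram" in this sketch is just the wiring in the last few lines; everything inside each chunk is plain code, exactly as in the comment above.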
- Sounds like the consumer tech version of "$70k EVs aren't selling anymore!"
Misinformation. The good EVs are selling quite well. It's just that their price has effectively dropped quite a lot. My wife's Model Y which cost us nearly $80k (we bought at peak price: the prior Corolla got totalled) now has the equivalent 2024 model selling for $42k!
The crappy EVs (i.e., most everyone else's) aren't selling, because they're inferior in efficiency and software implementation. Rivian and Lucid vehicles are pretty good, but those companies are still at risk of never showing a profit. I've been in a Hyundai Ioniq 5, and that seemed decent too.
- Or, maybe it isn't Apple-specific; maybe there simply aren't enough users, or VR/AR as a software paradigm is currently too far removed from how companies design their applications, and it is just a matter of time until they adapt. Like I said, I am not a developer, so maybe I'm missing something obvious here.
Long term, here's what I suspect may happen. Robotics+AI is going to eat the lunch of VR. VR is only used when it's not economical to have the actual stuff, but the realm of nifty real world stuff is going to expand tremendously.
This may well result in a societal bifurcation, where the rich have a bunch of robots, and the poor have to settle for VR.
- An enormous percentage (like 90%+) of requests for time on Hubble, JWST, etc. get denied, so the market is seemingly there.
What is the TAM? Would it be worth it?
However, considering that the Hubble was developed from spy satellite tech, could this also happen in reverse? What uses would a 9m mirror telescope have in terms of ground observation? Would the US government allow such a thing to go up and be available for hire?
- What would be the effect on the "Grabby Aliens" model?
- Beyond some specific use cases, they don't seem to scale cognitively. The tangle of connections is a problem.
Tons of things like this were built in Smalltalk. (Including a UI->Domain model connection layer in the IBM VisualAge Smalltalk IDE.) They all had scaling problems, especially, "they don't seem to scale cognitively."
It's not as if the tangle doesn't exist in most codebases. It's more that the problem is invisible without such tools. By making the tangle visible, the tools make themselves seem unusable.
The fundamental problem is that we don't have ways of introspecting these horrendous relationship graphs for specific contexts. Imagine if IDEs and other programming tools could create custom browsers/IDE windows based on queries like:
"All of the methods that contain references to ClassA.Member1 and ClassB.Member2 which also call function Y."
...where the query could be modified or further refined at runtime. There could also be specific built-in queries covering everything a canned refactoring touches, and these could then be intersected or unioned.
EDIT: Forgot to complete my thought. If the graphical diagram could show contextually relevant slices of the system, it would greatly cut down on the confusing-web aspect of the diagrams.
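A toy sketch of that kind of query, run over a hypothetical in-memory model of a codebase. Every name here (the methods, `ClassA.Member1`, the function `Y`) is invented for illustration; a real tool would build this model from the AST or a language server:

```python
# Hypothetical code model: each method records the members it references
# and the functions it calls.
methods = {
    "OrderService.process": {"refs": {"ClassA.Member1", "ClassB.Member2"}, "calls": {"Y", "log"}},
    "OrderService.cancel":  {"refs": {"ClassA.Member1"}, "calls": {"Y"}},
    "Report.render":        {"refs": {"ClassB.Member2"}, "calls": {"format"}},
}

def query(methods, refs=(), calls=()):
    """Methods whose reference/call sets contain all the given refs and calls."""
    return {
        name for name, m in methods.items()
        if set(refs) <= m["refs"] and set(calls) <= m["calls"]
    }

# "All methods that reference ClassA.Member1 and ClassB.Member2 and also call Y":
hits = query(methods, refs=["ClassA.Member1", "ClassB.Member2"], calls=["Y"])
print(sorted(hits))  # ['OrderService.process']
```

Because each result is just a set of method names, two query results compose directly with `&` and `|`, which is the "intersected or unioned" step above.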
- What I've found is that many times, people like the perceived confidence that obstinacy can bring.
The problem with that method of evaluation is that it's not First Principles. Basically, pg's essay in this case just reduces to: "Is that person steered by First Principles thinking?"
- > All the toxins in the food chain end up in the apex predators
Is there an analogy for media and information?
- > > local
> For now.
Like everyone's music and movie collections.
- Nobody will be able to compete against Apple CPUs while they're 1 process node ahead of competitors though.
If this artificial advantage allows Apple to slack off in its other areas of competitive advantage, then this is bad for the consumer overall. It creates an environment of cynicism, where there's even less incentive to try to beat Apple with a better product.
- TSMC being the only game in town is what's bad for the market.
Beyond some threshold, any meta-gaming of the market is bad for the market. It's like patents. When they were first instituted, they did some good by incentivizing innovation. Then, people started meta-gaming them, and used them as legal weapons to suppress competition well beyond the original intention.
I don't think there's a good argument that monopolizing a feature is good for the market, even if the means that allows it is technically legal.
What could be argued is whether legal remedies to this kind of meta-gaming might themselves be meta-gamed in return and become worse than the thing they were trying to remedy. That always seems to happen, in one form or another.
EDIT: Finding more evidence for convergence between scientific fields is also worthy. (Though the delta is very small at this point.)