No: computation is algorithmic, but real machines are not necessarily. (Of course, AGI still can't be ruled out even if algorithmic intelligence is; what would be ruled out is only AGI that does not incorporate some component with noncomputable behavior.)
The author seems to assume the latter is settled, i.e. that real machines are not algorithmic, and then derives extrapolations from that unproven assumption.
As the adjacent comment touches on, are the laws of physics (as understood to date) not possible to simulate? Can't all possible machines be simulated, at least in theory? I'm guessing my grasp of the term "algorithmic" is lacking here.
Quantum mechanics is even linear!
Fun fact: quantum mechanics is also deterministic, if you stay away from bonkers interpretations like Copenhagen and stick to the theory itself or saner interpretations.
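(To make that concrete, as a standard textbook aside rather than anything from the article: the Schrödinger equation is linear in the state, and the evolution it generates is unitary and hence deterministic; probabilities only show up once an interpretation bolts measurement collapse on top.)

    % Time-dependent Schrödinger equation: linear in \psi,
    % so if \psi_1 and \psi_2 are solutions, so is a\psi_1 + b\psi_2.
    i\hbar \frac{\partial \psi}{\partial t} = \hat{H}\,\psi
    % Its solution is deterministic unitary evolution of the state:
    \psi(t) = e^{-i \hat{H} t / \hbar}\, \psi(0)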
Also, one might argue that the universe/laws of physics are computational.
Maybe we need to define "computational" before moving on. To me this echoes the clockwork universe of the Enlightenment. The insights of quantum physics have shattered that idea.
Not at all. Quantum mechanics is fully deterministic, if you stay away from bonkers interpretations like Copenhagen.
And, of course, you can simulate random processes just fine even on a deterministic system: use a pseudo-random number generator, or just connect a physical hardware random number generator to your otherwise deterministic system. Compared to all the hardware used in our LLMs so far, random number cards are cheap kit.
Though I doubt a hardware random number generator will make the difference between dumb and intelligent systems: pseudo-random number generators are just too good, and, generalising a bit, you'd need P=NP to be true for your system to behave differently with a good PRNG versus real random numbers.
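To sketch how cheap that swap is (a toy example of my own, not from the article): any code written against a generic random-number interface can't even tell which source it's being fed.

    import random

    # Deterministic source: a seeded Mersenne Twister PRNG.
    prng = random.Random(42)

    # Nondeterministic source: the OS entropy pool, often fed by hardware noise.
    hwrng = random.SystemRandom()

    def noisy_decision(rng, threshold=0.5):
        # Toy 'stochastic component': works identically with either source.
        return rng.random() < threshold

    print([noisy_decision(prng) for _ in range(5)])   # reproducible run to run
    print([noisy_decision(hwrng) for _ in range(5)])  # differs run to run

The consumer only notices the difference if it can statistically distinguish the PRNG's output from true randomness, which is the complexity-theoretic point above.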
Which is certainly an opinion.
> whatever it is: it cannot possibly be something algorithmic
https://www.hackerneue.com/item?id=44349299
Maybe OP should have looked in a dictionary for what certain words actually mean before defining them to be something nonsensical.
Making non-standard definitions of words isn't necessarily bad, and can be useful in certain texts. But if you do so, you need to make these definitions front-and-centre instead of just casually assuming your readers will share your non-standard meaning.
And where possible, I would still use the standard meanings, and use newly made-up terms to carry new concepts.
> humans can (somehow) do this
Is this not contradictory?
Alternatively, in order not to be contradictory, doesn't it require the assumption that humans are not "algorithmic"? But does that not then presuppose (as the above commenter brought up) that we are not a biochemical machine? Is a machine not inherently algorithmic in nature?
Or at minimum it presupposes that humans are more than just a biochemical machine. But then the question comes up again: where is the scientific evidence for this? In my view it's perfectly acceptable if the answer is something to the effect of "we don't currently have evidence for that, but this hints that we ought to look for it".
All that said, does "algorithmically" here perhaps exclude heuristics? Often something can be shown to be unsolvable in the absolute sense yet readily solvable in practice, with an extremely high success rate, using some heuristic.
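For instance (my own toy sketch, not anything from the article): whether an arbitrary program halts is undecidable in general, yet a dumb "run it with a step budget" heuristic gives the right answer for most programs you actually meet.

    def halts_within(program, max_steps=100_000):
        # Heuristic halting check. A 'program' here is a generator function
        # that yields once per step of work. Undecidable in general; we just
        # run up to max_steps and guess 'loops forever' if the budget runs out.
        gen = program()
        for _ in range(max_steps):
            try:
                next(gen)
            except StopIteration:
                return True      # the program really did halt
        return False             # a guess: probably never halts

    def counts_to_ten():         # halts quickly
        n = 0
        while n < 10:
            n += 1
            yield

    def loops_forever():         # never halts
        while True:
            yield

    print(halts_within(counts_to_ten))   # True
    print(halts_within(loops_forever))   # False (a guess, correct here)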