
> I’m skeptical that biological systems will ever serve as a basis for ML nets in practice

There is no fundamental difference between information processing systems implemented in silico vs in vivo, except architecture. Architecture is what constrains the manifold of internal representations: this is called "inductive bias" in the field of machine learning. The math (technically, the non-equilibrium statistical physics crossed with information theory) is fundamentally the same.
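
To make "architecture constrains the manifold of internal representations" concrete, here is a minimal numpy sketch (my own toy, not from the references below): a 1-D convolutional layer is just a dense layer whose weight matrix has been forced into a banded, weight-shared form. That constraint, locality plus translation equivariance, is the inductive bias.

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(size=8)   # a toy 1-D input signal
    k = rng.normal(size=3)   # a 3-tap convolutional kernel: 3 free parameters

    # A fully-connected 8 -> 6 layer has 48 free parameters. The "same shape"
    # convolutional layer is that dense layer with its weight matrix constrained
    # to a banded, weight-shared form: the constraint is the inductive bias.
    W = np.zeros((6, 8))
    for i in range(6):
        W[i, i:i + 3] = k

    conv_out = np.array([np.dot(k, x[i:i + 3]) for i in range(6)])  # NN-style "conv" (cross-correlation)
    dense_out = W @ x

    print(np.allclose(conv_out, dense_out))  # True: same map, far fewer reachable functions

Swap the constraint and you change which functions the system can cheaply reach; the optimization machinery underneath stays the same.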

Everything at the functionalist level follows from architecture; what enables these functions are the universal principles of information processing per se. "It worked because it worked", in the sense that there is no other way for it to work given the initial conditions of our neighborhood in the universe. I'm not saying "everything ends up looking like a brain". Rather, I am saying "the brain, its attendant nervous and sensory systems, etc. and neural networks implemented as nonlinear functions are running the same underlying instructions on different hardware, and thus end up as different algorithms."

The way I like to put it is: trust Nature's engineers; they've been at it much longer than any of us have.


skibidibipiti
> There is no fundamental difference between information processing in silicon and in vivo

A neuron has dozens of neurotransmitters, while an artificial neuron produces a single output. I don't know much about neurology, but how is the information processing similar? And what do you mean by "running the same instructions"?

> there is no other way for it to work

Plants exhibit learned behaviors

mr_toad
> A neuron has dozens of neurotransmitters, while an artificial neuron produces a single output. I don't know much about neurology, but how is the information processing similar? And what do you mean by "running the same instructions"?

ANNs are general function approximators. You can get the same behaviour from a network of many simple neurons as you get from a single, more complex neuron.
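
A hedged sketch of that claim in plain numpy (the target function and layer size here are arbitrary choices for illustration): train a small network of simple tanh units, by ordinary gradient descent, to reproduce the non-monotone response curve of a made-up "complex" neuron.

    import numpy as np

    rng = np.random.default_rng(0)

    # A made-up "complex" neuron: a non-monotone, dendrite-ish response curve.
    def complex_neuron(x):
        return np.tanh(2 * x) * np.exp(-x ** 2)

    X = np.linspace(-3, 3, 256).reshape(-1, 1)
    y = complex_neuron(X)

    # One hidden layer of 16 simple tanh units, trained by plain gradient descent.
    H = 16
    W1 = rng.normal(scale=0.5, size=(1, H)); b1 = np.zeros(H)
    W2 = rng.normal(scale=0.5, size=(H, 1)); b2 = np.zeros(1)
    lr = 0.03

    for step in range(10000):
        h = np.tanh(X @ W1 + b1)            # simple units
        y_hat = h @ W2 + b2
        d_yhat = 2 * (y_hat - y) / len(X)   # d(MSE)/d(y_hat)
        dW2 = h.T @ d_yhat; db2 = d_yhat.sum(0)
        dh = d_yhat @ W2.T
        dz = dh * (1 - h ** 2)              # derivative of tanh
        dW1 = X.T @ dz; db1 = dz.sum(0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2

    print(float(np.mean((y_hat - y) ** 2)))  # small: many simple units mimic one complex unit

This is the universal approximation property in miniature; the practical question is not whether the simple-unit network can match a more complex unit, but how many units and how much training it takes.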

uoaei OP
> how is the information processing similar?

The representational capacities are of course not the same -- the same "thoughts" cannot be expressed in both systems. But the concept of "processing over abstract representations enacted in physical dynamics within cognitive systems" is shared between all systems of this kind.

I am referring to "information processing" at the physical level, i.e., "'useful' work per energy quantum as communicated through noisy channels".
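
One concrete anchor for that physical-level framing, as a rough sketch (standard thermodynamics, not specific to either kind of hardware): Landauer's bound puts a floor of k_B * T * ln 2 on the energy dissipated per bit of information erased.

    import math

    k_B = 1.380649e-23            # Boltzmann constant, J/K
    T = 300.0                     # roughly room/body temperature, K
    landauer_limit = k_B * T * math.log(2)
    print(landauer_limit)         # ~2.9e-21 J per erased bit, whatever the substrate

Real neurons and real transistors both dissipate orders of magnitude more than that floor, but the floor itself does not care which one you build.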

> What do you mean by "running the same instructions"?

The underlying physical principles of such information processing are equivalent regardless of physical implementation.

> plants exhibit learned behaviors

A good example of what I mean. The architecture is different, but the underlying dynamics is the same.

There is a convincing (to me) theory of the origins of life[1][2] that states that thermodynamics -- and, by extension, information theory -- is the appropriate level of abstraction for understanding what distinguishes living processes from inanimate ones. The theory posits that a system, well-defined by some (possibly arbitrary) boundaries, "learns" (develops channels through which "patterns" can be "recognized" and possibly interacted with) as an inevitable result of physics. Put another way, a learning system is one that represents its experiences through the cumulative wearing-in over time of channels of energy flows.

What concepts the system can possibly represent depends on the ways in which the system can wear while maintaining its essential functions. What specifically the system learns is the set of concepts which, taken together, best communicate (physically, i.e., from the "inputs", through the "processing" functions, to the "outputs") the accumulated history of its experiences of its environment and of itself.
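
Here is a toy cartoon of "wearing-in", with heavy caveats: it is my own illustration, not the model in [1]. Two parallel channels carry a driven flow; the flow splits in proportion to conductance, use reinforces a channel (superlinearly, which is what makes the channels compete), and disuse lets it decay.

    import numpy as np

    g = np.array([1.0, 1.05])        # channel conductances, slightly asymmetric
    drive, gain, decay = 1.0, 0.3, 0.1

    for step in range(2000):
        flow = drive * g / g.sum()              # the drive partitions by conductance
        g = g + gain * flow ** 2 - decay * g    # superlinear reinforcement vs. decay
        g = np.clip(g, 1e-9, None)

    print(g)  # the initially favoured channel has worn in; the other has withered away

The only point is the shape of the dynamics: the system's memory of how it was driven is stored in which channels stayed open, which is all I mean by learning as wearing-in.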

I want to note that this discussion has nothing to say about perception, only about sensation and reaction: in other words, it is an exclusively materialist analysis.

Optimization theory describes its notion of learning in roughly these terms (treating "loss" as an energy potential), but with the same language we could also describe a human brain, a black hole's accretion disk, or an ant colony dug deep into clay.
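
To spell the correspondence out in the plainest possible terms (a toy of my own, on an arbitrary quadratic potential): gradient descent on a loss is the same update rule as overdamped relaxation in an energy landscape.

    def U(theta):                    # read this as "loss" or as "potential energy"
        return 0.5 * (theta - 2.0) ** 2

    def grad_U(theta):
        return theta - 2.0

    theta, step_size = -5.0, 0.1     # step_size plays the role of a mobility
    for _ in range(200):
        theta -= step_size * grad_U(theta)   # descend the potential

    print(theta)                     # ~2.0: the system has relaxed to the minimum of U

Add thermal noise to the same update and you get (discretized) Langevin dynamics; the optimization picture and the statistical-physics picture are the same equation with different names on the symbols.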

References:

[1] https://www.englandlab.com/uploads/7/8/0/3/7803054/2013jcpsr...

[2] https://www.quantamagazine.org/a-new-thermodynamics-theory-o...

Parallel directions of research:

https://en.wikipedia.org/wiki/Entropy_and_life

https://en.wikipedia.org/wiki/Free_energy_principle
