
Single neuron computational complexity

Migrated topic.
Thanks for sharing Loveall :)

To find out, David Beniaguev, Idan Segev and Michael London, all at the Hebrew University of Jerusalem, trained an artificial deep neural network to mimic the computations of a simulated biological neuron. They showed that a deep neural network requires between five and eight layers of interconnected “neurons” to represent the complexity of one single biological neuron.
Timothy Lillicrap, who designs decision-making algorithms at the Google-owned AI company DeepMind, said the new result suggests that it might be necessary to rethink the old tradition of loosely comparing a neuron in the brain to a neuron in the context of machine learning. “This paper really helps force the issue of thinking about that more carefully and grappling with to what extent you can make those analogies,” he said.
They continued increasing the number of layers until they achieved 99% accuracy at the millisecond level between the input and output of the simulated neuron. The deep neural network successfully predicted the behavior of the neuron’s input-output function with at least five — but no more than eight — artificial layers. In most of the networks, that equated to about 1,000 artificial neurons for just one biological neuron.
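A quick back-of-the-envelope check of the "about 1,000 artificial neurons" figure. The uniform width of 128 units per hidden layer below is a hypothetical assumption for illustration, not a number quoted in the article:

```python
# Hypothetical width per hidden layer (an assumption, not from the paper)
units_per_layer = 128

# Total artificial neurons for the 5-to-8-layer surrogates the article describes
totals = {depth: depth * units_per_layer for depth in range(5, 9)}
print(totals)  # {5: 640, 6: 768, 7: 896, 8: 1024}
```

At 7 or 8 layers of this assumed width, the total lands near the "about 1,000" figure quoted above.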
The authors also hope that their result will change the present state-of-the-art deep network architecture in AI. “We call for the replacement of the deep network technology to make it closer to how the brain works by replacing each simple unit in the deep network today with a unit that represents a neuron, which is already — on its own — deep,” said Segev. In this replacement scenario, AI researchers and engineers could plug in a five-layer deep network as a “mini network” to replace every artificial neuron.
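Segev's replacement scenario can be sketched in a few lines of numpy. This is a minimal illustration, not the authors' code: each "unit" in an outer layer is itself a small 5-layer MLP with randomly initialized weights, and the layer widths and unit count are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def mini_network(x, widths=(32, 32, 32, 32, 1)):
    """A 5-layer MLP standing in for one 'deep' neuron-like unit.

    Widths are hypothetical; the point is only that the unit is
    itself a multi-layer network rather than a single weighted sum.
    """
    h = x
    for w in widths[:-1]:
        W = rng.standard_normal((h.shape[-1], w)) / np.sqrt(h.shape[-1])
        h = np.maximum(0.0, h @ W)  # ReLU hidden layer
    W = rng.standard_normal((h.shape[-1], widths[-1])) / np.sqrt(h.shape[-1])
    return h @ W  # one scalar output per unit

# One "layer" of the outer network: each of its 8 units is a mini network.
x = rng.standard_normal((4, 16))  # batch of 4 inputs with 16 features
layer_out = np.concatenate([mini_network(x) for _ in range(8)], axis=1)
print(layer_out.shape)  # (4, 8)
```

Each outer unit here contributes one output column, so swapping a plain unit for a mini network leaves the outer layer's interface unchanged while multiplying its internal depth.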

Glad people aren't entirely entrenched in old paradigms and are able to continually push the bounds of understanding in this/these area/s.

Good luck
 
Thanks for sharing that awesome article Loveall!

I couldn't stop thinking about Hofstadter's description of neural structures and "symbols" in Gödel, Escher, Bach. Reminds me that I still need to read Fluid Concepts and Creative Analogies, also by Hofstadter.

One love
 
Hmm, yeah. I always thought these so-called "AIs" were a bit crap in terms of efficiency. When they go head to head against a human, their power input should also be limited to that of a human brain. It's similar to how steroids and the like are banned in sport.

That's not to detract from the remarkable results that neural networks can provide; rather, it points to what I consider a relevant and perhaps crucial goal in this field, where ever-increasing dependence on computing needs to be balanced against its energy requirements from the perspective of climate change/chaos. Even if powered by renewables, these vast devices are dumping tons of waste heat into the atmosphere.
 