
Machine Learning Confronts the Elephant in the Room

I suppose you can expect more of these drawbacks. One of the truest signs of technological progress is that you run into obstacles. They will never be able to truly mimic a human brain. The closer they get, the bigger the obstacles will become.
 
Interesting.

There is a little gem in the article: psychedelic toaster attacks.

I'm not kidding. See attached. 😁
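For anyone curious what a "psychedelic toaster attack" looks like in practice: it's an adversarial patch, a small printed sticker whose pixels are optimized so that a classifier calls almost any image containing it "toaster". The optimization itself is involved, but the application step is just pasting the patch into the scene. A minimal sketch (the patch here is a plain stand-in, not a real trained one):

```python
import numpy as np

def apply_patch(image, patch, top, left):
    """Paste an adversarial patch into an image array.

    image: H x W x 3 uint8 array; patch: h x w x 3 uint8 array.
    In a real attack the patch pixels are trained so the classifier
    reports the target class (e.g. "toaster") wherever the patch
    appears; this sketch only shows the pasting step.
    """
    out = image.copy()
    h, w = patch.shape[:2]
    out[top:top + h, left:left + w] = patch
    return out

scene = np.zeros((224, 224, 3), dtype=np.uint8)          # stand-in photo
patch = np.full((50, 50, 3), 255, dtype=np.uint8)        # stand-in for a trained patch
attacked = apply_patch(scene, patch, 10, 20)
```

The striking part of the real attack is that the patch works regardless of where it lands in the image, which is exactly why the stickers look so "psychedelic": the optimizer is free to pick whatever swirl of colors triggers the network hardest.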

The psychedelic experience is part of human consciousness. I think it is always in our brain, in the background to some degree or another. It helps us hallucinate a version of reality that we can understand. Maybe a good AI would need to have access to a super connected psychedelic state to be as effective as the human brain.
 

Attachments

  • 171209665.pdf (1.6 MB)
  • IMG_20180923_120943.jpg (51.6 KB)
dragonrider said:
I suppose you can expect more of these drawbacks. One of the truest signs of technological progress is that you run into obstacles. They will never be able to truly mimic a human brain. The closer they get, the bigger the obstacles will become.

Mhm, completely agree. Massive hurdles tbh, and I think that the closer they draw to the extreme intricacies of the brain, the crazier the hurdles will become. Great stuff to shoot for, though.

Mostly posted this because I thought it was comical when the AI started going bananas, then went back to the objects it had originally recognized and started misidentifying them. 😁

Confusing the poor thing :twisted:
 
Then the researchers introduced something incongruous into the scene: an image of an elephant in semiprofile. The neural network started getting its pixels crossed. In some trials, the elephant led the neural network to misidentify the chair as a couch. In others, the system overlooked objects, like a row of books, that it had correctly detected in earlier trials. These errors occurred even when the elephant was far from the mistaken objects.

And as for the elephant itself, the neural network was all over the place: Sometimes the system identified it correctly, sometimes it called the elephant a sheep, and sometimes it overlooked the elephant completely.

😁
 
It goes to show how different machine 'thinking' is from the human brain.

I'm impressed with Facebook's facial recognition in auto tagging, but then other tasks, which may seem much simpler to us, pose a big challenge to machines.

The only time Facebook didn't recognize a face correctly for me was when I posted a photo of a monkey. It recognized it as one of my friends :lol:
 