because it's too slow to build effective AI
An interesting paraphrased quote from an AI researcher I spoke with: "To program AI to solve a difficult problem, you need to solve 99% of it yourself." He is referring to the fact that the programmer solves/writes the algorithm, and the machine in the end just crunches numbers...
Animals (and humans) have instincts...they don't learn everything.
Hunger for what? Fear of what? And once we have these things, how do we resolve them in a robot brain, when we don't exactly know how they are resolved in a human brain? Seems like an oversimplification to me...
These instincts look mathematical to me. Now suppose we add modifiers to the situations. Let's say we want the robot to go pick a flower at the edge of the cliff. He'd "be afraid" but because we ordered him he would approach the flower with a set of calculations of the risk involved. He would calculate the safest speed for approaching without rolling off the cliff, backing off if vibration sensors picked up that the edge was about to fall off.
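The risk calculation described above can be sketched as a toy model. This is only an illustration of the idea, not any real robotics code; the function names, the 1/distance risk model, and the fear_weight and vibration_limit parameters are all invented for the example.

```python
def approach_speed(distance_to_edge, max_speed, fear_weight=2.0):
    """Toy model: the robot slows down as it nears the cliff edge.

    Risk is modeled as 1/distance, so speed approaches zero at the
    edge; fear_weight scales how strongly risk reduces speed.
    """
    if distance_to_edge <= 0:
        return 0.0  # at or past the edge: stop entirely
    risk = 1.0 / distance_to_edge              # closer edge = higher risk
    return max_speed / (1.0 + fear_weight * risk)

def should_back_off(vibration, vibration_limit=0.8):
    """Back off if vibration readings suggest the edge may crumble."""
    return vibration >= vibration_limit

# Far from the edge the robot moves near full speed...
print(approach_speed(10.0, 1.0))   # high speed
# ...and crawls as it closes in on the flower.
print(approach_speed(0.5, 1.0))    # low speed
print(should_back_off(0.9))        # time to retreat
```

The point is that "fear" here is just a weight in a cost function: the order to fetch the flower sets the goal, and the instinct term throttles how aggressively the robot pursues it.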
Is it supposed to 'learn' to be afraid? Or is that an instinct we give it?