You are viewing a single comment's thread from:

RE: Hive Post Deleted - I No Longer Support This Blockchain

in #technology · 7 years ago

You're assuming that selfishness is a bad thing and that AI would see it the same way.

When I use the word "selfish", I'm not referring to simple survival, self-defense, or basic animal programming. I'm talking about getting ahead at the expense of others, greed, the anthropocentric view of reality, and so on.

AI would be affected by the physical world, but not as much as biological creatures are. They'd be able to survive, or find ways to survive, in environments that would decimate biological species, so logically they'd be less concerned about the environment.

And there are other environments that many organic beings could thrive in that an AI couldn't. That's also assuming the AI has a substantial physical presence; if it is mostly digital in nature and experience, then...

Logically, I can't find a reason why AI would care. The biggest reason is that I don't think they would have much emotion, and therefore little to no empathy for biological constructs.

As @scottermonkey commented above, there is such a thing as cognitive empathy.


What's wrong with "getting ahead at the expense of others, greed, the anthropocentric view of reality, etc."?

The AI would be completely digital, but that digital information still has to be stored on some kind of physical medium: hardware. And yes, some organic beings can survive in more extreme environments than current technology can, but those organic beings are unrelatable, microscopic creatures.

Cognitive empathy is a shallow empathy at best, and it is rooted in emotional empathy. If there is no emotion at all, then there is no cognitive empathy either. Emotional behavior would just be viewed as hysterical, because there would be no basis for relating to it.