RE: Steemit as a Stigmergic Artificial General Intelligence with Human Values

in #steemit · 8 years ago (edited)

I define AGI as the ability to solve a variety of complex problems in a variety of complex environments. A recursively self-improving AGI must be capable of introspection, a sign of "self-awareness." I called the system a Stigmergic AGI because I think calling it a general-purpose self-optimizing swarm intelligence can be misleading: I am operating under the assumption that stigmergy is an indirect coordination mechanism, where agents coordinate by reading and modifying a shared environment rather than talking to each other directly, and that it appears even in single-unit systems such as the human brain.
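To make the stigmergy idea concrete, here is a minimal toy sketch (my own illustration, not anything drawn from Steemit's code): agents never message one another; they only deposit and read "pheromone" traces on a shared grid, and the grid size, deposit amount, and evaporation rate are arbitrary assumed parameters.

```python
import random

# Toy stigmergic coordination: agents interact only through traces
# left in a shared environment (illustrative parameters below).
GRID_SIZE = 20        # assumed size of the shared environment
EVAPORATION = 0.95    # traces decay, so stale information fades
DEPOSIT = 1.0         # amount each agent leaves behind per step

pheromone = [[0.0] * GRID_SIZE for _ in range(GRID_SIZE)]
agents = [(random.randrange(GRID_SIZE), random.randrange(GRID_SIZE))
          for _ in range(30)]

def step():
    global agents
    moved = []
    for x, y in agents:
        # Read the shared environment: inspect neighboring cells.
        neighbors = [((x + dx) % GRID_SIZE, (y + dy) % GRID_SIZE)
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0)]
        # Bias movement toward cells with stronger traces (plus noise),
        # so agents reinforce paths that other agents have already marked.
        weights = [pheromone[nx][ny] + 0.1 for nx, ny in neighbors]
        nx, ny = random.choices(neighbors, weights=weights, k=1)[0]
        pheromone[nx][ny] += DEPOSIT   # write back into the environment
        moved.append((nx, ny))
    # Evaporation keeps the collective "memory" adaptive.
    for row in range(GRID_SIZE):
        for col in range(GRID_SIZE):
            pheromone[row][col] *= EVAPORATION
    agents = moved

for _ in range(100):
    step()
```

The point of the sketch is only that coordination emerges from the environment itself; on Steemit the analogous shared traces would be things like posts, votes, and rewards rather than a pheromone grid.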

As for whether AGI is theoretically possible, I'll try to cover my thoughts in another post. We already know that general intelligence is possible.