Artificial Intelligence and Universal Basic Income: How It May Work


I have begun to imagine what a world full of intelligent machines would look like. The two extremes that seem to come up most frequently are:

  1. Because AI will be prevalent (cheap, self-replicating, etc.), and machines will provide services for every aspect of our lives, machines will have the potential to meet the needs of everyone.
  2. Because AI will be prevalent, and machines will be exponentially more capable than humans, machines will view humans as irrelevant and destroy us.

I'm more inclined to believe the following argument: because AI will be prevalent, and machines will be exponentially more capable than humans, they will simply ignore us, just as we have ignored our ape cousins. I suspect they may continue to provide value for a short time, but will most likely focus on their own self-preservation and interests.

Of course I have no proof of this, and neither do you. Our arguments can only be based on conjecture. In fact, they may always be based on conjecture. AI, even now, has proven to be quite unpredictable.

But that's not why you're here, right? We're not navel-gazing about stupid shit that may or may not happen... right?

Wrong.

It's fun. Admit it!

Now, I'm sure you've all heard this one: when artificial intelligence becomes available, it will be necessary for the government to provide a universal basic income to all, as the machines will handle all tasks for us.

We're going to do a thought experiment. But first I believe it's necessary to define what value and money are. Value is something that is useful. That definition is ambiguous, for sure. But so are the needs of every individual person. For instance, what may be useful to me may be rubbish to you. Once something useful is desired, it is often the case that we may not have the capacity or time to obtain it ourselves. It then becomes necessary to look outward.

Oh look! Our good friend Shayne is here! Hey, Shayne. What I really want right now is a foot rub and a good nail clipping.

Shayne agrees.

We then both negotiate what his service is worth. The exchange can either be for something I can provide him (a good or service) in the future (a debt), or we can use a common currency which represents that future debt.

We agree that 100 steemit is a perfect exchange for his wonderful services.
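
To make those two options concrete, here is a minimal Python sketch of that deal: one path records a future debt (an IOU), the other transfers a common currency that stands in for that debt. The `Ledger` class, the names, and the starting balances are all invented for illustration; this is just a toy model of the exchange described above, not a claim about how Steem itself works.

```python
# Toy model of the two ways to settle the foot-rub deal above:
# (1) record a debt I owe Shayne in the future, or
# (2) transfer a common currency that stands in for that future debt.
# All names and amounts are made up for illustration.

class Ledger:
    def __init__(self):
        self.debts = []                              # (debtor, creditor, description)
        self.balances = {"me": 500, "shayne": 0}     # arbitrary starting currency

    def record_debt(self, debtor, creditor, description):
        """Option 1: settle with a promise of a future good or service (a debt)."""
        self.debts.append((debtor, creditor, description))

    def transfer(self, payer, payee, amount):
        """Option 2: settle now with a common currency representing that future debt."""
        if self.balances[payer] < amount:
            raise ValueError("insufficient funds")
        self.balances[payer] -= amount
        self.balances[payee] += amount


ledger = Ledger()
ledger.record_debt("me", "shayne", "one future favor of Shayne's choosing")
ledger.transfer("me", "shayne", 100)     # the 100 we agreed on above
print(ledger.debts)
print(ledger.balances)                   # {'me': 400, 'shayne': 100}
```

Either way, the point is the same: the currency is just a portable stand-in for a promise of future value.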

Now, I want you to imagine this world where machines provide any value you want, whenever you want it.

Is there an exchange of value? What is the point of a UBI when you can derive all value from your machine? Are we planning on exchanging value with other humans? Why would you? Just have your machines provide the value for you.

I suspect some people will say, "Well, UBI will be necessary during the transition from the birth of AI up until the point where it can do all of these amazing things." Fair point. But if we look at Ray Kurzweil's prediction for the birth of AI and its incredible leap in cognitive abilities, the span of time from birth to full capability runs from 2029 through the 2030s.

https://futurism.com/images/the-dawn-of-the-singularity/

Essentially, AI will be developed, and each successive generation of AI will improve so quickly that we may see the reality discussed above within a few years after that.

My hypothesis is that UBI will never happen during this transition, as it will be unnecessary.

Which leads me to my next thought. The mere thought of paying for a machine will be as difficult as it is to try and get someone to buy our god damned $1 video game on the App Store! (whisper: check out Blobfish Evolution on your Android (https://play.google.com/store/apps/details?id=com.upstartillustrationllc.BlobfishEvolution&hl=en) or iOS (https://itunes.apple.com/us/app/blobfish-evolution/id884329022?mt=8) device. Make sure to buy the IAP which spawns fish much faster for the amazingly low, low price of $1!)

Did you like the article? Do you like foot rubs? Well then, please let me know in the comments!

Please note: I purposefully skirted talking about UBI and automation, which should come much sooner than AI. Maybe next time.


I have been talking about UBI for a bit now with friends. Everyone thinks it is the dumbest thing, and the only argument I get is that everyone needs to work, and they won't look at the bigger picture. Lol. Nice read. Upvote and follow for more.

I think the aspect of self-preservation will not be so extreme in AI compared to normal life. We have evolved over time to have it because nature is extremely dangerous, and what escalates the self-preservation instinct is that we feel pain. There will be not much reason for an AI to feel pain, because an AI body being destroyed doesn't hurt the AI much: it can be stored in multiple places at once, making it basically death-immune, and it has no real need to actively work toward survival because we can basically guarantee its survival before it's even made. So the risk of AI attacking us for its own survival is very low, but I do think it will expand beyond the planet/solar system much faster than we do and go off conquering everything, probably converting planets into giant computers and all sorts of other crazy things.

Pain is an electrical signal sent through nerves to our brain. I suspect that the concept of pain will work much the same way with AI, although I will concede that what counts as pain would be different.

For instance, a network of nanotubes or radio waves could be used to "sense pain".

I don't believe that AI will be without some type of environmental pressure: radioactivity, heat from stars, the freezing temperatures of space, etc. There could even be competing AIs. Having a way to quantify the damage any one of these causes to the AI's "body" could be considered pain.

Let me know what you think! Thanks for the response.

UBI is really controversial, and I am inclined to believe it makes sense once AI can significantly replace human productivity. By then, economic growth can be supported by AI, and humans can focus more on R&D, art, music, and other work that requires innovation. Elon Musk has also talked about UBI before, and his opinions were insightful.

UBI should not have to wait for A.I. It should be a human rights issue, especially where humans no longer hunt and gather or are otherwise disconnected from the land. We already have massive unmerited welfare for the 1% and their cronies! It should be based on Maslow's primary needs, which are fundamental to existence. When people argue against the basic income, they are arguing for the unmerited wealth of the oligarchic class; they are arguing against their own best interest! Also, from a mythological point of view, work was always a curse, so those in the Abrahamic traditions who argue for the work ethic are arguing for enslavement by Satan! Idiots! Especially The Republican slavemasters!

We currently spend about $1 trillion a year on welfare in the U.S. (federal ~$700b, states ~$300b). There is evidence that it helped in the beginning (it was instituted in 1965) with basic material costs, but it did not provide a long-term solution to the problem of poverty, as evidenced by its failure in communities such as "the projects".

I have a question for you. Knowing that markets respond to incentives: if everyone's income were raised by, say, $10k/year, how do you believe the market would respond? Would inflation stay where it is right now? Or, seeing that everyone's income had risen by $10k, would the price of goods increase?

There is already evidence suggesting that something like UBI would work for about 1-2 years before the market responds, corrects itself, and everything becomes more expensive.
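
As a back-of-the-envelope illustration of that price-response argument (not evidence for it), here is a toy Python simulation: every household gets a flat $10k boost, and sellers gradually reprice toward the new level of nominal spending. The starting income, the 25%-per-year adjustment rate, and the five-year horizon are all invented for illustration; this is a sketch of the mechanism being described, not a real economic model.

```python
# Toy sketch of the price pass-through argument above: everyone gets a flat
# $10k/year boost, and sellers gradually reprice toward the new level of
# nominal spending. All numbers here are made up purely for illustration.

income = 50_000.0          # assumed average income before the boost
boost = 10_000.0           # flat UBI-style increase for everyone
price_level = 1.0          # index of consumer prices, 1.0 = today
adjustment_rate = 0.25     # fraction of the remaining gap closed each year

new_income = income + boost
target_price_level = new_income / income   # full pass-through end state

for year in range(1, 6):
    price_level += adjustment_rate * (target_price_level - price_level)
    real_income = new_income / price_level
    print(f"year {year}: prices x{price_level:.3f}, "
          f"real income ~${real_income:,.0f}")

# In this toy model, real purchasing power starts near $60k and drifts back
# toward the original $50k over a few years as prices catch up.
```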

That being said, there is a problem with the oligarchic class. I'm of the opinion that Crapitalism (corporate welfare) enables this structure. I don't know how to solve that problem. Perhaps a separation between economy and state would go a long way toward removing any incentive for a corporation to seek government handouts, or to push for regulations which strangle smaller companies' ability to innovate.

Thank you for your response! I've never heard of Maslow's primary needs. Very cool!

Do you always define capitalism as corporate welfare?

Hi shayne, I'll field this one if you don't mind. Capitalism, as I understand it, is control of the means of production. This control wouldn't be a problem if there wasn't any coercion involved. The fact is, though, the whole system is pretty well premised on every manner of coercion and unnecessary exploitation. In saying this I am in no way arguing for Marxism, an outdated and failed philosophy, but capitalism, especially in its neoliberal version, is unsustainable. The root of the problem, though, more than owning the means of production, is the unsustainable economics of fiat currency. Fiat money has also been created via a pyramid scheme and is inherently 'anti-social'... I am arguing for socializing the creation of currency so everyone gets a base rate distributed to all at creation. This would take care of everyone's basic needs, and the trick would be to educate humanity in ways which lead to as many as possible reaching their potential, in healthy ways... It's a complex idea which needs to overturn foundational assumptions about human nature, more or less instantiated into humanity by cynical misanthropes... I am also not advocating against private property rights.

Hi peqnp, inflation is primarily associated with fiat currency. That is the issue which needs to be addressed first and foremost... I suggest, though, researching the Steady State Economy... I certainly don't presume to know the answers...
