Deep Fakes: Major Concern or Inconsequential Anecdote?

in #writing • 7 years ago


A recent phenomenon on the internet is the “Deep Fake.” Deep Fake is the name given to videos that use machine learning to replace the face and/or voice of the person in a given video, making it seem either that they are someone else or that they said something they never said. At first the technique was used in the porn community, with the seemingly harmless gag of placing someone else’s face (a celebrity’s, say) onto a porn actress, to make it look like they (say, the star of a major movie franchise) were the one having the sex. And while that use is troubling enough on its own, there is an even more sinister way this tech can be used.

Fake news.

It’s already hard enough to tell whether that clever meme or email spouting “facts” about someone is accurate, and we already have to fact-check the simplest of viral articles. But up until now, if there was “video” evidence of someone saying something, we could be pretty sure they said it. That time has unfortunately passed.

And while, for now, the process is still a bit beyond what “the average Joe” can manage, the tech and the apps built around it are streamlining the process and making it dramatically easier every day.
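For the technically curious, here’s a rough idea of how most of these face-swap tools work: one shared encoder learns the general structure of faces, and a separate decoder is trained for each person; encode a frame of person A, decode it with person B’s decoder, and out comes the swap. The sketch below is only an illustration of that idea; the network sizes, shapes, and training step are assumptions, not any particular app’s code.

```python
# A minimal sketch of the shared-encoder / two-decoder idea behind early
# face-swap "deep fakes". Shapes and sizes here are illustrative assumptions.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compress a 64x64 RGB face crop into a small latent vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstruct a 64x64 RGB face from the latent vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )
    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

# One shared encoder, one decoder per identity.
encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()

# Training (sketch): reconstruct person A with decoder_a and person B with
# decoder_b, so the shared encoder learns identity-agnostic face structure.
faces_a = torch.rand(8, 3, 64, 64)  # stand-in for real face crops of person A
loss = nn.functional.mse_loss(decoder_a(encoder(faces_a)), faces_a)

# The "swap" at inference time: encode person A, decode with B's decoder.
swapped = decoder_b(encoder(faces_a))
```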

So what do we do? Is there anything we can do?

And is this something we need to worry about, or just another writer flying off the handle about some little anecdote of no consequence?



@writesbackwards is a group of friends who love to write about life, sports, comedy, tech and other fun stuff!

Consider leaving a comment; we love rewarding engaging Steemians!


I really like your post; I did not know about deep fake videos. One thing is for sure: everything seems able to be faked in one way or another.

One remedy is learning critical thinking, which helps you tell fact from propaganda.

The other is simple logic. Consider the recent news that "Russia cleaned up the chemical attack." Really? You can't just sweep layers and layers of dirt up from a town and haul them away. Such attacks generally can't be cleaned up at all and are still evident in trace samples ten years later.

Short but good. It's nice to know more about porn sometimes :)

Who wants to read 8 paragraphs only to understand a point that can be made in 1-2? Not I.

I think we are getting desensitized to it. Whether someone actually said something or not seems to be meaningless. While there are more instances of fake news, we also have the ability, with a little research, to find out whether something is real or not. Eventually the truth comes out. But by then, who cares?

"Deep fake!!"
This is a serious matter, given what you are sharing here. The world and its technology are turning into a crazy place, with immoral videos used across the media to strip away people's dignity.

Deep Fake is the name given to videos that use machine learning to replace the face and/or voice of the person in a given video, making it seem either that they are someone else or that they said something they never said.

Something needs to be done; otherwise the world will turn into a menace.

I take this as a lesson; things like this can happen in this age and have real effects, but then again, I have no words.

It's been years since one of my teachers told the classroom, "Don't believe everything you read, everything you see, or what you can find on the internet." Every single day we have the responsibility to check the truth of what we share and post. The internet has a great power of communication, and like every great power, it can be used for good or for misinformation.

This is a topic with a lot to unpack. In the world of celebrity there are countless cases where this technological phenomenon is used to discredit a famous person or damage their reputation. In this technological era anything can be done, even a program that could undo that kind of smear, if one doesn't already exist. I'm not very well versed in this technology, but I think something can be done. Thanks for your post, @escribe espalda. See you at the top.

This reminds me a lot of Adobe's scary audio editing application that can take a small sample of audio and essentially let you make a person say anything in their own voice: Photoshop for audio (there's an article about it on Ars Technica). They debuted it in 2016 and have been quiet ever since, but with that Adobe app combined with clever video editing and machine learning, we are entering an era where we legitimately can't, or shouldn't, trust everything we see or hear.

I believe techniques are being developed to detect this kind of thing, but detecting the content isn't the real problem. The problem is being able to stop it: as we have seen with the scarily targeted, machine-generated content on YouTube aimed at kids, the challenge is stopping it as quickly as it is being created.
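For context, one common detection idea is to run a real-vs-fake classifier over individual face crops from a video and average the scores across frames. The snippet below is a toy illustration of that approach; the tiny network, the random "frames", and the scoring are placeholder assumptions rather than any real detection tool.

```python
# Minimal sketch of per-frame deepfake detection: score each face crop with a
# binary real-vs-fake classifier, then average the scores across the clip.
# The backbone, data, and threshold are illustrative assumptions only.
import torch
import torch.nn as nn

detector = nn.Sequential(                 # tiny CNN stand-in for a real backbone
    nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, 1),                     # logit: higher = more likely fake
)

frames = torch.rand(30, 3, 64, 64)        # stand-in for 30 aligned face crops
with torch.no_grad():
    fake_prob = torch.sigmoid(detector(frames)).mean().item()

print(f"estimated probability the clip is fake: {fake_prob:.2f}")
```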

You are right.

Very informative and interesting post, @writesbackwards ... I only just learned about this Deep Fake phenomenon, because in Indonesia it has never been talked about...
People often say that technology makes our lives better... but I don't think this Deep Fake technology is like that...
Anyway, thanks for sharing... ❤😍