You are watching the news one day and all of a sudden, hyperbole from a conversation you had with your friends the night before is being broadcast on your TV. You think you might have had too many drinks and are still hung over, but then you receive texts from your friends who are seeing the same thing. Moments later, a notification appears on your phone: “We have detected audio and video analysis of something you were just exposed to that may have been false.” You immediately search online and learn that a hacker group accessed a few headends in the area and was running deep fakes as a prank.

Facebook has been working on AI software to detect deep fakes, and so far it has reached only about 68% accuracy. Deep fakes are more than just a nuisance; they change our perception of truth. The more we question what is real, the less it may matter. The level of control marketing firms and social media companies already exert through psychological manipulation is frightening enough, but what may be even more dangerous is when the manipulative content or tactic is real. In marketing, the key distinction in manipulation is whether it is built from an intent to persuade or an intent to coerce. If you are nudging a person toward a choice they were already going to make, then maybe it is not so bad. But if you manipulate through reverse psychology, or intentionally or unintentionally mark something real as fake, what kind of repercussions does that have, not only on people but on the technology itself?

In a 2018 Pew Research Center study, 51% of people said they did not trust tech companies and thought they should be more highly regulated. Other studies I have seen put that figure as high as 71%.

People trust in two ways: through emotion and through belief. Belief can exist without emotion and still produce trust; conversely, emotion can reinforce belief or even create it. When a person trusts because of belief first, that trust is much harder to break. When a person trusts because of emotion first, one only needs to alter that emotion for the trust to be questioned.

Think about the last time you saw something at a friend’s house that you liked. You might have asked about the product, or maybe you saw it in action. Your friend loved it, and you were sold. Later, as you shopped for the product, you began to see bad reviews, and you had second thoughts. But why? You saw it work, you knew a person who owned it, and you trusted their judgment; why did a review from someone you did not know affect your decision to buy? The answer: because your trust in the product began with an emotion, not a belief.

But why is this so dangerous? When presented with a decision, over 80% of our actions go toward validating a decision we have already made, not toward making the initial decision. If that decision was based on false information, validating it can start to build a belief around the whole process. Additional information, whether it confirms a truth or a falsity, ultimately reinforces our belief in the initial decision, not necessarily in every decision that follows.

When someone begins to believe something, emotion shifts from accepting a decision to defending a position.

This affects not only our perception of products and news networks but also of people, relationships, technology, and much more. Yet this is not where it starts. AI used for deep fakes is merely a tool; manipulation, destruction, force, and control are nothing new.

Accuracy is not a percentage of truth; it is a percentage of the perception of truth. If you don’t know someone and a third party says something about them, you may or may not believe it based on what you know about that person. You may also believe or doubt it based on what you know about the person giving you the information. Deep fakes work like every other deceptive tool: the closer something is to the truth, the harder its accuracy is to challenge.

Needless to say, our tendency to see what we want to see may be the biggest challenge of all. In the future there will be something even more deceptive than deep fakes, and perhaps there already is: maybe our own minds are the most deceptive thing of all.