Deep Fakes: Technology with Hazardous Potential

What’s a Deepfake in the First Place?

If you’re coming across this term for the first time, here’s the short version: deepfakes are videos or other pieces of content that are fake but made to look unquestionably real. They are built with a form of artificial intelligence called ‘deep learning’, used much like Photoshop, to fabricate images of events that never happened. That’s where the name ‘deepfake’ comes from.

Curious about the exact process? Here’s a simple explanation: most deepfakes are created by feeding many pictures of an individual through a computer algorithm, which learns to produce a moving dummy face. For example, feed it images of former US President Obama from different angles, and the algorithm will give you a dummy Obama face that can lip-sync anything. To complete the illusion, his voice can be synthesized at the same time. The result looks and sounds like him and can be made to say anything (like ‘Donald Trump is a dipshit’, remember?) you would like.
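
To make that concrete, here is a minimal sketch of the shared-encoder, two-decoder autoencoder idea behind the classic open-source deepfake tools. It assumes PyTorch and 64x64 RGB face crops, and uses random tensors as stand-ins for real face data; the layer sizes and training loop are illustrative, not any specific tool’s implementation.

```python
# A minimal sketch of the classic deepfake autoencoder, assuming PyTorch
# and 64x64 RGB face crops. All shapes and data here are illustrative.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    # One SHARED encoder learns identity-agnostic face features
    # (pose, expression, lighting).
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
        )
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    # One decoder PER IDENTITY learns to paint that person's face back.
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )
    def forward(self, z):
        return self.net(z)

encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()  # one decoder per person

opt_a = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder_a.parameters()), lr=1e-4)

def train_step(batch, decoder, optimizer):
    # Each identity is simply trained to reconstruct its own faces.
    optimizer.zero_grad()
    recon = decoder(encoder(batch))
    loss = nn.functional.mse_loss(recon, batch)
    loss.backward()
    optimizer.step()
    return loss.item()

# Stand-in batch of person A's face crops (replace with real data).
loss = train_step(torch.rand(8, 3, 64, 64), decoder_a, opt_a)

# The "swap": encode a face of person A, decode with person B's decoder.
with torch.no_grad():
    face_a = torch.rand(1, 3, 64, 64)    # stand-in for a real face crop
    fake_b = decoder_b(encoder(face_a))  # person A's pose, person B's face
```

The trick is the shared encoder: because both decoders read the same latent features, person B’s decoder repaints person A’s pose and expression with person B’s face.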

Honestly, for entertainment, deepfakes can be really funny and engaging. However, they can be highly destructive when used maliciously.

Where is it Mostly Used?

Currently, it’s the adult film industry. It’s the same technology used on pornographic sites to make videos look like leaked sex tapes of mainstream actors. (Oh, you didn’t know those were fake? Poor you.)

In September 2019, the AI firm Deeptrace found nearly 15,000 deepfake videos online, almost double the number from nine months earlier. 96% of them were pornographic, with the faces of female celebrities mapped onto porn performers. The worst part is that you don’t need to be a core techie to make them. Modern tools allow even the most unqualified user to produce deepfake videos capable of fuelling revenge porn.

So, is it Just Videos that Can be Deepfaked?

No. This technology can generate compelling yet fully fictional images from scratch, and you might be surprised to learn that audio can be deepfaked too. Fraudsters and scammers have used ‘voice clones’ of public figures to pull off convincing scams. Fake voice notes of famous celebrities and politicians, passed off as ‘call-tapping recordings’, have even been circulated over WhatsApp to incite hate crimes.

How Can we Identify Them?

As deepfake technology improves, spotting fakes is getting harder. In 2018, US researchers discovered that deepfake faces don’t blink normally. No surprise there: most training photos show people with their eyes open, so the algorithms never really learned to blink. At first, this became the chief tell. But as soon as the research went public, deepfakes with blinking appeared. That is the essence of the game: a weakness is fixed as soon as it is revealed. Still, if you’re mindful, you can judge fairly accurately.
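
One simple way to operationalise a blink check (a rough sketch, not necessarily the exact method of the 2018 study) is the eye aspect ratio (EAR) computed over the six standard eye landmarks. The sketch below assumes you already have per-frame landmarks from any face-landmark detector (dlib, MediaPipe, etc.); the 0.2 threshold and two-frame minimum are common rules of thumb, not calibrated values.

```python
# Blink counting via the eye aspect ratio (EAR). Landmark coordinates
# are assumed to come from an external face-landmark detector.
import numpy as np

def eye_aspect_ratio(eye):
    # eye: (6, 2) array of landmark points p1..p6 around one eye.
    # EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|); it collapses toward 0
    # when the eyelids close.
    v1 = np.linalg.norm(eye[1] - eye[5])
    v2 = np.linalg.norm(eye[2] - eye[4])
    h = np.linalg.norm(eye[0] - eye[3])
    return (v1 + v2) / (2.0 * h)

def count_blinks(ear_per_frame, threshold=0.2, min_frames=2):
    # A blink is a short run of frames where the EAR dips below threshold.
    blinks, run = 0, 0
    for ear in ear_per_frame:
        if ear < threshold:
            run += 1
        else:
            if run >= min_frames:
                blinks += 1
            run = 0
    if run >= min_frames:  # run that ends at the last frame
        blinks += 1
    return blinks

# Toy per-frame EAR trace: open, open, closed, closed, open -> one blink.
print(count_blinks([0.31, 0.30, 0.12, 0.10, 0.29]))
```

A real clip of a talking person usually shows several blinks per minute, so a suspiciously blink-free face is one (defeasible) red flag.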

Poorly edited deepfakes are easier to spot. Check the following first:

  1. Erratic lip-syncing
  2. Patchy skin tone
  3. Flickering around the edges of transposed faces
  4. Hair: deepfakes struggle to render hairstyles well, especially where individual strands are visible across the fringe
  5. Badly rendered jewellery
  6. Teeth: often blurry or unnaturally uniform
  7. Eyes: inconsistent illumination and reflections on the iris

For the well-made ones, you will need an eagle eye.

In many deepfake videos, you will find that the person is not looking directly at the camera. Authentic videos capture a face moving in three dimensions, but deepfake algorithms are not yet equipped to build faces in 3D. Instead, they generate a regular two-dimensional image of the face and then rotate, resize, and distort it to fit the target footage. As a result, the face is often slightly misaligned. Compare it with an actual photograph and you will instantly sense that something is ‘just not right’.
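
One way to probe for this 2D-warping artifact (a hedged sketch, not a production detector) is to estimate head pose from facial landmarks with OpenCV’s solvePnP against a generic 3D face model; if poses estimated from different landmark subsets or across frames disagree wildly, the face may have been pasted in as a flat image. The model points and the focal-length guess below are standard approximations, assumed here for illustration.

```python
# Head-pose estimation from 2D facial landmarks, assuming OpenCV and
# landmarks already extracted from the frame by an external detector.
import cv2
import numpy as np

# Generic 3D reference points (nose tip, chin, eye corners, mouth corners).
model_points = np.array([
    (0.0, 0.0, 0.0),           # nose tip
    (0.0, -330.0, -65.0),      # chin
    (-225.0, 170.0, -135.0),   # left eye outer corner
    (225.0, 170.0, -135.0),    # right eye outer corner
    (-150.0, -150.0, -125.0),  # left mouth corner
    (150.0, -150.0, -125.0),   # right mouth corner
], dtype=np.float64)

def estimate_pose(image_points, frame_w, frame_h):
    # image_points: (6, 2) array of the matching 2D landmarks in the frame.
    focal = frame_w  # crude focal-length approximation
    camera_matrix = np.array([
        [focal, 0, frame_w / 2],
        [0, focal, frame_h / 2],
        [0, 0, 1],
    ], dtype=np.float64)
    ok, rotation_vec, translation_vec = cv2.solvePnP(
        model_points, image_points.astype(np.float64),
        camera_matrix, None)  # None = assume no lens distortion
    # cv2.Rodrigues(rotation_vec) converts to a rotation matrix
    # if you want Euler angles for comparison across frames.
    return rotation_vec, translation_vec
```

Because a genuine 3D head and a warped 2D cut-out project differently as the head turns, inconsistent pose estimates are exactly the kind of misalignment described above.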

In the end, it is important to verify anything with a trusted source before sharing or using any video. Remember that one careless mass broadcast can instantly ruin a person’s life or disturb the peace of society. Be vigilant, be smart.
