The swelling scare of deepfakes
Deepfakes of public figures are all over the internet these days, deceiving people and causing more problems than the technology was meant to solve. Val9ja will discuss this situation in this article and give you a few essential tips on how to deal with the frightening trend.
Superstars like Lionel Messi and a few Hollywood names have allowed their likenesses and voices to be used in making deepfakes after signing deals with companies. Messi, for example, sold his image to PepsiCo for use in its advertisements, with no option to buy it back.
Let us look at the uses of deepfakes before we go into how to stay safe from fake videos and audio manufactured by fraudsters and other impostors hoping to mislead people or rake in illegal money.
Deepfakes, which are highly realistic and often deceptive video or audio content created using artificial intelligence, have both positive and negative applications. Here are some potential good uses of deepfakes:
They can be used in the entertainment industry to create realistic scenes and special effects. They can also be used for dubbing and voice acting, enabling the creation of content in multiple languages without the need for multiple actors.
Deepfakes can be used to create realistic simulations for training purposes, such as medical simulations for doctors and nurses, or simulations for law enforcement and military training.
Artists and creators can use deepfakes as a medium for expression, pushing the boundaries of digital art and storytelling.
An artificial intelligence expert and professor emeritus at the University of Washington told CBS News recently that he feared the damaging role deepfakes are likely to play in the coming United States presidential election.
‘I expect a tsunami of misinformation,’ he said. ‘I can’t prove that. I hope to be proven wrong. But the ingredients are there, and I am completely terrified.’
‘You could see a political candidate like President Biden being rushed to a hospital,’ he said. ‘You could see a candidate saying things that he or she never actually said.
‘You could see a run on the banks. You could see bombings and violence that never occurred.’
Recently, the image of Ratan Tata, former chairman of Tata, was used in a highly misleading video that showed the entrepreneur giving business advice.
The caption on the video read, ‘A recommendation from Ratan Tata for everyone in India. This is your chance to exaggerate your investment right today risk-free with 100% guarantee. Go to the channel right now.’
There were also WhatsApp videos alleging that he had promised a sum of money to the Afghan cricketer Rashid Khan.
Mr. Tata later posted the following on X:
‘I have made no suggestions to the ICC or any cricket faculty about any cricket member regarding a fine or reward to any players.
‘I have no connection to cricket whatsoever. Please do not believe WhatsApp forwards and videos of such nature unless they come from my official platforms.’
How can you be safe from deepfakes?
To be safe from deepfakes, you can follow these measures:
Post only a little of your personal information online: reduce the amount of personal information you provide on social media and other online platforms. The less material available, the harder it is for a person, or an automated tool, to create a convincing deepfake of you.
Verify the authenticity of media: when you see a video or image that seems strange or too good to be true, verify the content first. Look for signs of manipulation or inconsistencies that may indicate a deepfake.
Stay informed about deepfake technology: teach yourself how deepfakes are made. Understanding the methods used to create them leaves you better prepared to identify them and protect yourself.
Use reliable sources: rely on trusted outlets for news and information, and avoid sharing or spreading content that may have been manipulated or altered.
Enable two-factor authentication: doing this on your online accounts adds an extra layer of security. It helps block unauthorised access, reducing the risk of someone using your identity for deepfake purposes.
Be skeptical of unsolicited media: if you receive a video or image from an unknown source, treat it with suspicion and avoid opening or sharing it. Deepfakes are often used for malicious purposes, such as spreading misinformation or conducting phishing attacks.
Use deepfake detection tools: various deepfake detection tools can help identify manipulated media. Consider using them to verify the authenticity of videos or images; a small sketch of how such a check might look in code is shown below.
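For readers comfortable with a little code, here is a minimal sketch of how such a check could be run programmatically. It is only an illustration: it assumes Python with the Hugging Face transformers library installed, and the model name and file name in it are placeholders, not recommendations of any particular detector.

# Minimal sketch: run a suspicious image through an image-classification
# model trained to separate real photos from AI-generated ones.
# Assumption: "your-chosen/deepfake-detector" is a placeholder model name.
from transformers import pipeline

detector = pipeline("image-classification", model="your-chosen/deepfake-detector")

# "suspicious_photo.jpg" is a placeholder for the file you want to check.
results = detector("suspicious_photo.jpg")

# Each result pairs a label (for example "real" or "fake") with a confidence score.
for result in results:
    print(result["label"], round(result["score"], 3))

Treat the output as one signal among several rather than proof; no automated detector is foolproof, so combine it with the other precautions listed above.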
AI is a breakthrough our generation should be happy about. Deepfakes will not wipe the smile off our faces if we keep learning how to avoid being fooled by them.