2023 has been nothing short of a nightmare for security researchers. While advancements in artificial intelligence (AI) have boosted productivity and helped users in their day-to-day lives, the technology has also been misused by threat actors to defraud people and carry out other illicit activities. There have been many instances of scammers impersonating prominent figures in videos. In a recent case, a deepfake video surfaced on YouTube showing a fake version of Ripple's CEO convincing people they could double their crypto investments. Know all about it.
Ripple CEO deepfake controversy
The crypto community has witnessed the rise of a new deepfake featuring Brad Garlinghouse, the CEO of US-based crypto solutions provider Ripple. In the deceptive video, which was available on YouTube, the fake Ripple CEO urged people to send their XRP tokens to a specified deal, promising to double them. The video also included a QR code directing unsuspecting victims to a fraudulent website, exposing them to financial risk. This is just the latest example of the recent rise in XRP scams.
Astonishingly, the unlisted video has reportedly still not been taken down by Google. Concerned Redditors contacted the Mountain View-based tech giant, but its Trust and Safety team reportedly denied the request, saying the advertisement did not violate its policies, and even asked for more information to be provided within six months.
What is a deepfake?
According to a National Cybersecurity Alliance report, deepfakes are AI-generated videos, images, and audio that are edited or manipulated to make people appear to say or do things they never did in real life. Deepfakes can be used to defraud, manipulate, and defame anyone, be it a celebrity, a politician, or an ordinary person. As the NCA put it, “if your vocal identity and sensitive information got into the wrong hands, a cybercriminal could use deepfaked audio to contact your bank.”