On 25 May, Tesla CEO Elon Musk responded to a deepfake video that supposedly showed him endorsing a fake crypto project. In a tweet posted later, the centibillionaire said it was “without a doubt” not him.
The deepfake video, which recently went viral on Twitter, aims to lure investors into a trading platform that promises 30% returns on crypto deposits.
Many Dogecoin supporters complained about the video’s poor quality, noting that Musk’s speech sounded robotic and incoherent. “This is a bad deep fake,” said one Twitter user.
Billy Markus, co-founder of Dogecoin, believes that anyone foolish enough to fall for such a hoax deserves to lose their money.
A deepfake is a piece of synthetic media that has been convincingly manipulated to make someone appear to say or do things that never actually happened.
Producing convincing fakes requires expensive computational resources, and they are typically created with the help of a generative adversarial network (GAN), in which a generator learns to produce fake images while a discriminator tries to tell them apart from real ones.
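For readers curious about the mechanics, here is a minimal GAN sketch in PyTorch. It is a toy illustration of the generator-versus-discriminator training loop, not any real deepfake pipeline; the layer sizes, 64x64 image resolution, and learning rates are assumptions chosen only for demonstration.

```python
import torch
import torch.nn as nn

# Toy GAN: a generator that maps random noise to a flattened 64x64 "image",
# and a discriminator that scores whether an image looks real.
latent_dim = 100

generator = nn.Sequential(
    nn.Linear(latent_dim, 256),
    nn.ReLU(),
    nn.Linear(256, 64 * 64),
    nn.Tanh(),
)

discriminator = nn.Sequential(
    nn.Linear(64 * 64, 256),
    nn.LeakyReLU(0.2),
    nn.Linear(256, 1),
    nn.Sigmoid(),  # probability that the input is real
)

loss = nn.BCELoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

real_images = torch.rand(16, 64 * 64)  # stand-in for a batch of real training images
noise = torch.randn(16, latent_dim)

# Discriminator step: learn to label real images 1 and generated images 0.
fake_images = generator(noise).detach()
d_loss = loss(discriminator(real_images), torch.ones(16, 1)) + \
         loss(discriminator(fake_images), torch.zeros(16, 1))
d_opt.zero_grad()
d_loss.backward()
d_opt.step()

# Generator step: try to make the discriminator label fresh fakes as real.
fake_images = generator(torch.randn(16, latent_dim))
g_loss = loss(discriminator(fake_images), torch.ones(16, 1))
g_opt.zero_grad()
g_loss.backward()
g_opt.step()
```

Repeated over millions of real face images, this adversarial back-and-forth is what lets production systems generate footage convincing enough to fool casual viewers.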
As deepfakes become increasingly sophisticated, there is an urgent need for reliable detection tools that can flag manipulated media.
Scams are common in the cryptocurrency sector, especially Twitter schemes that promise big returns. In March, a phishing campaign targeting verified accounts stole over $1 million through a phoney airdrop of ApeCoin, the native token of the Bored Ape Yacht Club NFT collection.
Read more:
Dogecoin co-founder slams Elon Musk for pretending to admire Dogecoin