Nonconsensual deepfakes featuring celebrity likenesses, whether visual or voice, are likely to proliferate on major social platforms as generative AI tools become more publicly available. Because celebrities' images and voices are abundant and easily accessible online, they are likely to be among the first public victims of nonconsensual deepfakes. Over the past year, more instances have emerged of celebrity visual and voice likenesses being used without permission to target consumers.
Such scams have tended to manipulate the likenesses of trusted celebrities to falsely promote products. Entertainment workers are increasingly concerned about generative AI being misused to create misleading deepfakes of celebrity voices and images, according to our VIP+ survey conducted by YouGov. As of September, more than 7 in 10 industry professionals were either very or somewhat concerned that generative AI would be used to create misleading voice clones or digital doubles of celebrities, up several percentage points since June.