Finally, A Way To Expose AI-Generated Photos For The Phonies They Are | PC Consulting Asia
Don’t fall in love too quickly—these are all AI-generated portraits. Photo 194752647 © Max Mahey

Remember the good old days when catfishing was the only thing we had to worry about? Enter artificial intelligence, which has not only fooled humans with non-existent faces generated from scratch but also led them to perceive machine-created faces as more “trustworthy”-looking than those of their own kind.

As fears of deepfakes and other forms of content manipulation intensify, software firm V7 Labs is hoping to help people prepare for impending deceptions—namely from bot or scam accounts using AI-generated photos such as the ones shown on This Person Does Not Exist. It has launched Fake Profile Detector, a free Google Chrome extension that promises to pinpoint synthetic faces with a startling 99.28% accuracy.

Misinformation is rampant on the internet, and V7 Labs aims to curb it with the plugin, which raises awareness of fake profiles and lets users know if they’re looking at a non-human in disguise, founder Alberto Rizzoli explains.

Detection is as straightforward as right-clicking an image and selecting ‘Check Fake Profile Picture’, eliminating the guesswork and preventing people from getting duped. At a time when the dangers of cyberwarfare and the weaponization of AI are very real, users need a quick and direct way to distinguish authentic identities from fake ones.

If you’re looking to spot some of these giveaways on your own, though, Rizzoli says to keep an eye out for odd details in the subject’s pupils. At times, the figure will also have an unusual hairstyle or oddly aligned earrings.

For now, the extension is only compatible with—or repellent of—StyleGAN “photos.”

You can install the Fake Profile Detector here.