A tweet from a well-known crypto influencer has sparked a heated debate online about AI, privacy, and the data we unknowingly hand over. He suggested that AI platforms behind viral trends like Ghibli-style art might be collecting sensitive data, including facial images, without users even realizing it. His post went viral, raising fresh concerns about privacy in today's digital world.
In the tweet, he wrote: “Ghibli Images is a great way for AI to capture your facial data with consent. 1st -> They took your email & IP address. 2nd -> They took your retina and fingerprint (WorldCoin). 3rd -> Facial image. Congratulations Humans…You did it!”
Naturally, this sparked a huge response, with some agreeing with the influencer’s alarmist tone while others questioned the accuracy of his claims. One user even pointed out, “So basically, if someone takes your profile picture and puts it on ChatGPT, then your facial image is recorded?? People openly post their images on social media, and that’s not dangerous??”
The influencer replied, “It’s about consent…” But is that really what’s happening? To get to the bottom of the controversy, we decided to fact-check his claims and investigate what OpenAI, the creator of ChatGPT, actually says in its privacy policy.
To understand how OpenAI handles privacy, we read through the privacy policy for its ChatGPT service. It spells out what data the company collects and what it does with it, which helps us judge whether claims like the influencer's hold up.
Here’s a straightforward summary of what we discovered:
While OpenAI collects data to improve its services, nowhere in the policy does it mention anything about capturing highly sensitive data like retina scans or fingerprints. There’s simply no mention of any AI platform secretly collecting this kind of personal information without user consent.
Let’s set the record straight about the claim that AI can grab your retina scan or fingerprints—it’s just not accurate.
The influencer also brought fingerprints and retina scans into the picture by citing Worldcoin, but what many people miss is that Worldcoin and ChatGPT are two entirely separate projects. Worldcoin performs its scans with users' permission, using dedicated hardware designed to verify that each person is a unique human, precisely because such biometric data cannot be duplicated or generated by AI.
The influencer’s claims about retina scans and fingerprints don’t really add up, but that doesn’t mean we should ignore the topic of data privacy.
The real problem here is consent. People usually agree to share their data when signing up for something, but they need to understand exactly what information is being collected and how it’s going to be used.
As for the influencer's claim that AI platforms are secretly capturing and storing sensitive data, that's not quite true. Sure, AI platforms do collect things like your IP address, how you use the service, and what device you're on. But the notion that they're quietly grabbing retina scans or fingerprints? That's not the case at all.
Still, the broader issue shouldn't be dismissed, and caution around privacy is warranted. It's worth taking a moment to read privacy policies and understand what data we're sharing. OpenAI is upfront about its data practices, and there's no indication that it collects things like retina scans or fingerprints without users' permission.