Exposed: OpenAI’s KYC Partner Persona Accused of Sending ChatGPT Users’ Crypto Addresses to US Federal Agencies
Imagine uploading your passport photo and a selfie just to chat with an AI. Now picture that same data, plus your crypto wallet addresses, landing straight in the hands of government agencies. This is the shocking claim against Persona, the company behind OpenAI’s know-your-customer (KYC) checks for advanced ChatGPT features.
A recent deep dive by security researchers uncovered code that allegedly pipes user info to FinCEN, a US Treasury bureau that tracks financial crimes. This raises big red flags for privacy, especially in the crypto world where staying anonymous is key.
What Is KYC and Why Does OpenAI Need It?
KYC means “know your customer.” It’s a process where users prove who they are with ID documents. Banks and exchanges use it to fight money laundering and terrorism funding. OpenAI started requiring KYC for heavy users of its top AI models to stop abuse.
When you verify, you snap a photo of your ID, take a selfie, and maybe record a quick video. Persona handles this for OpenAI. They check your face against the ID, scan for sanctions, and flag risks. Sounds routine, right? But the real issue is what happens next.
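The verification steps described above can be pictured as a simple pipeline. This is a purely illustrative sketch: every function name and threshold here is a made-up placeholder, not Persona's or OpenAI's actual system.

```python
from dataclasses import dataclass

@dataclass
class KycSubmission:
    id_photo: bytes  # scan of the government ID
    selfie: bytes    # live selfie for face matching
    name: str        # name as printed on the ID

# Hypothetical stand-ins -- real KYC vendors use proprietary models
# and third-party screening databases for these steps.
def face_match_score(id_photo: bytes, selfie: bytes) -> float:
    """Stand-in for a biometric face-comparison model (0.0 to 1.0)."""
    return 0.97  # dummy value for illustration

def on_sanctions_list(name: str) -> bool:
    """Stand-in for a sanctions-screening lookup."""
    return False

def verify(sub: KycSubmission) -> str:
    """Check the selfie against the ID, then screen the name."""
    if face_match_score(sub.id_photo, sub.selfie) < 0.90:
        return "rejected: face mismatch"
    if on_sanctions_list(sub.name):
        return "flagged: sanctions hit"
    return "approved"
```

The point of the sketch is the shape of the flow, not the details: the user hands over biometrics and an ID, and pass/fail plus any risk flags come out the other side.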
The Bombshell Investigation: Code That Talks to the Government
Security experts dug into Persona’s systems and found public code doing more than basic checks. It reportedly sends data to FinCEN for Suspicious Activity Reports (SARs). These reports help agencies spot dirty money.
Key findings include:
- Direct links to US government domains for filing reports.
- Tools to tag data with secret intelligence program codes.
- Integration with Chainalysis, a blockchain tracker that scans crypto addresses.
- A “watchlist” for crypto wallets that monitors them forever.
- Over 250 extra checks, including links to Canada’s financial intel unit.
Chainalysis digs into your wallet: who it talks to, how much money moves, and even tries to name the owner. Once on the watchlist, it’s game over—no escape from constant scanning.
The code has been live since November 2023. Users might not know their selfies and crypto links are fed into this machine.
Persona’s CEO Responds—But Does He?
Rick Song, Persona’s CEO, hit back on social media. He said researchers didn’t contact him first and that his company doesn’t work with federal agencies today. Emails show frustration, but no denial of the code itself.
One deleted post called the research disappointing. Experts say it holds up: the government domains referenced in the code are real, and the integration looks genuine. Still, questions linger: Why does the code exist? Who flips the switch? What triggers a data share?
Crypto Privacy Under Siege: Why This Hits Hard
Crypto was born from cypherpunks—folks fighting surveillance with tech like Bitcoin. They dreamed of money without Big Brother watching. Now, KYC creeps into AI chats, linking your face to your funds.
Risks are huge:
- Data Breaches: KYC firms hold goldmines of info. Hacks leak millions of records yearly.
- Secret Watchlists: No clear rules on who gets flagged. One wrong link, and you’re tracked.
- Biometric Lock-In: How long is your face stored? OpenAI says up to a year; the researchers say the code hints at three-year or even permanent retention for ID documents.
- Slippery Slope: Today ChatGPT, tomorrow your email or social apps?
Platforms push KYC in the name of safety, but critics argue the stockpiled data creates more risk than it prevents. Governments welcome the data flow, building vast surveillance nets without oversight.
Chainalysis in the Mix: Blockchain’s Double-Edged Sword
Chainalysis helps cops chase criminals on blockchains. It’s busted scams and traced ransomware. But when tied to everyday KYC, it feels like overreach. Your casual wallet check could flag innocent trades.
The investigation notes a “persistent monitor.” Add a wallet once, and it’s scanned endlessly against Chainalysis graphs. No opt-out, no warning.
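A toy model makes the "persistent monitor" claim concrete: once an address is added, there is no removal path, and every scan cycle re-checks the whole list. This is illustrative only; the risk-scoring function is a made-up placeholder, not the real Chainalysis API.

```python
# One-way watchlist: addresses go in and never come out.
watchlist: set[str] = set()

def add_to_watchlist(address: str) -> None:
    """Register a wallet for monitoring. Note: no removal function exists."""
    watchlist.add(address)

def fake_risk_score(address: str) -> float:
    """Placeholder for a blockchain-analytics risk lookup (0.0 to 1.0)."""
    return 0.1  # dummy value for illustration

def scan_cycle() -> dict[str, float]:
    """One pass of the endless loop: every listed wallet is re-scored."""
    return {addr: fake_risk_score(addr) for addr in watchlist}

add_to_watchlist("bc1qexampleaddress0000000000000000000000")
report = scan_cycle()  # the address is re-checked on every cycle, forever
```

The design choice that matters is the missing delete path: a data structure that only grows is a watchlist by construction, whatever the surrounding code calls it.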
Bigger Picture: The Death of Digital Privacy?
This isn’t isolated. More apps demand ID: social media, gaming, payments. Each adds to the data pile agencies crave.
Crypto users feel it most. Wallets were pseudonymous havens. Now, one AI signup could dox you. Think politicians, activists, or just privacy fans—everyone’s exposed.
Solutions? Use privacy coins, mixers, or decentralized ID. But mass adoption lags. Regs like Europe’s MiCA tighten KYC screws further.
What Should Users Do Now?
- Think Twice: Skip KYC if possible. Use basic ChatGPT tiers.
- Go Private: Fresh wallets for KYC-linked services. Avoid main holdings.
- Stay Informed: Watch for OpenAI or Persona updates.
- Push Back: Demand transparency on data use.
Neither OpenAI nor Chainalysis has commented yet. Until they do, caution rules.
Final Thoughts: Wake-Up Call for Web3
This scandal spotlights the clash: innovation vs. control. AI and crypto promise freedom, but KYC strings pull toward surveillance. As code leaks show, trust no one blindly. Verify, protect, and fight for privacy.
The crypto community buzzes. Will this spark change? Or normalize data grabs? Stay tuned—this story’s just heating up.
Disclaimer: Blockmanity is a news portal and does not provide any financial advice. Blockmanity's role is to inform the cryptocurrency and blockchain community about what's going on in this space. Please do your own due diligence before making any investment. Blockmanity won't be responsible for any loss of funds.