Latest from GetReal

Deepfakes and synthetic identities are already slipping past interviews, background checks, and onboarding. Our Deepfake Readiness Benchmark Report reveals how exposed enterprises really are – and how prepared they are to detect and stop AI-powered identity attacks.

News

December 2025

Hyper-realistic AI deepfakes of deceased figures are flooding social platforms, accelerating identity abuse and eroding trust in what people see online.

News

December 2025

We're constantly seeing examples of real-world social engineering attacks using GenAI-powered voice impersonation. On Friday, the FBI issued a new warning that cybercriminals are successfully impersonating senior U.S. government officials with faked voice messages and texts, often exploiting familiarity and urgency to gain trust.

Video

December 2025

As recruiting moved online, a new class of adversary followed – using AI to fabricate identities, pass interviews, and infiltrate companies at scale. In some cases, these aren’t just scammers. They’re nation-state operatives. In this investigation, we look at how deepfake candidates are already moving through enterprise hiring pipelines – and why traditional checks no longer work.

News

December 2025

Deepfakes and chatbots are now convincing enough that many web users no longer trust that they are interacting with real people.

Report

December 2025

Identity Manipulation, Synthetic Content, and the State of Enterprise Preparedness

Press Release

December 2025

New GetReal Security Research: 41% of Enterprises Surveyed Report Having Hired and Onboarded Fraudulent Candidates

Blog

November 2025

Guest Blog: Deepfakes and the Expanding Enterprise Attack Surface

Video

November 2025

The Physics Don’t Add Up: Why Sora’s “Perfect” Videos Are Still Faking Reality

News

November 2025

The BBC reports on the rise of AI-generated video “slop” and the new tricks that hide deepfakes in plain sight. GetReal’s Co-Founder Dr. Hany Farid shares how low-resolution footage and compression can mask synthetic distortions — and why detecting truth is getting harder.

Video

October 2025

Deepfake technology is evolving rapidly, making detection increasingly difficult. We expose how AI-generated face swaps, lip-sync deepfakes, and voice clones have become nearly impossible to spot, even for trained eyes.

00:23:00

Blog

October 2025

A Guide for CISOs and CIOs: How to Plan Your Budget for Deepfake AI Detection in 2026

News

October 2025

OpenAI’s Sora Makes Disinformation Extremely Easy and Extremely Real

Blog

September 2025

Real-Time Deepfake Protection: Paving the Way to Continuous Identity Protection

Press Release

September 2025

GetReal Security Advances Continuous Identity Protection to Combat Deepfakes

News

August 2025

Hany Farid joins NBC News to unpack the controversy around Will Smith’s concert footage, where fans accused him of using AI to fake cheering crowds.

Press Release

August 2025

GetReal Security has been named the Real-Time Threat Mitigation Winner in SiliconANGLE’s 2025 TechForward Awards.

Video

August 2025

In a thought-provoking episode of Particles of Thought — a new video podcast from the producers of NOVA hosted by astrophysicist Hakeem Oluseyi — Hany Farid explores how we can separate truth from deception and what the future of AI might look like.

1:25:01

Podcast

August 2025

Jim Brennan, our Chief Product and Technical Officer, joins The AI Forecast to break down one of the biggest challenges in today's world: securing digital trust. He and host Paul Muller explore why trust is the backbone of modern business and how advancements like deepfakes, impersonation attacks, and AI-powered deception are shaking that foundation.

Whitepaper

August 2025

Exposing Synthetic Content, Candidate Fraud & Imposters in the Talent Pipeline

Alert

August 2025

A video shared on social media, claimed to show a large demonstration "for Palestine" in Japan, was in fact created using AI.
