People interact through smartphones during a video call, reflecting how online relationships can appear personal and authentic, even as authorities warn that scammers now use artificial intelligence and deepfake technology to simulate intimacy and exploit trust.
As online courtship rises, Philippine authorities warn that artificial intelligence is being used to create convincing romance scams
MANILA – The Philippine National Police (PNP) has warned the public about a rise in romance scams that use artificial intelligence to create convincing online personas, including manipulated images, voice cloning, and deepfake video calls. Authorities urged greater caution as online courtship activity increases around Valentine’s Day.
PNP officials said these schemes often begin with sustained communication on social media or messaging platforms before shifting to requests for money or sensitive personal information. They noted that AI tools have made it easier for scammers to appear credible over long periods, reducing the reliability of traditional identity checks.
PNP chief Gen. Jose Melencio Nartatez Jr. said the Anti-Cybercrime Group is strengthening investigative readiness as digital fraud techniques evolve. He said personnel are undergoing training in AI detection, deepfake analysis, and digital forensics, alongside planned upgrades to cybercrime equipment. The police are also coordinating more closely with other government offices and civil society groups to improve information sharing and disrupt scam operations.
The warning aligns with advisories from the Cybercrime Investigation and Coordinating Center (CICC) and Scam Watch Pilipinas. The CICC said its national anti-scam hotline, 1326, recorded 123 formal complaints related to love scams in 2025, with reports often increasing during periods linked to gift-giving and higher online engagement.
The CICC also reported that victims recovered ₱20.1 million in 2025 and ₱1.2 million in January 2026 through prompt reporting and coordination with financial institutions. Officials cautioned that recovery is not guaranteed and depends largely on how quickly suspicious transactions are reported.
Authorities warned that video calls, once considered a basic authenticity check, are no longer sufficient on their own. Deepfake technology can convincingly simulate faces and voices in real time, allowing scammers to appear legitimate even during live conversations.
The PNP advised the public to be wary of online relationships that escalate quickly, to verify identities through multiple independent channels, and to avoid sending money or personal information to individuals they have not met in person. Immediate reporting of suspected scams, officials said, remains critical to preventing further losses.