How Artificial Intelligence Is Making Romance Scams Even More Dangerous

Most of us believe we’ll never fall for a scam. We think we know the signs, like a poorly worded message that sounds urgent. Unfortunately, social engineering, a tactic that plays on human emotions and instincts to make us act against our own interests, can work on anyone.
Romance scams are a classic example of emotional manipulation: the perpetrator exploits the victim’s loneliness, affection, or desire for connection to win their trust over the long term. Beyond the emotional toll, the financial consequences are significant: the FBI’s Internet Crime Complaint Center (IC3) reported $672 million in losses from romance scams in 2024, a figure that is almost certainly only a fraction of the true amount.
Fraudsters are increasingly using artificial intelligence tools in romance scams, making such campaigns even harder to detect and therefore even more dangerous for victims. Experian predicts that AI-powered romance scams will be among the top fraud threats in 2026.
How Romance Scams Work
As McAfee describes in its recent report on the state of romance scams, this type of fraud is a long game. Romance scams typically begin with a “hook”: a private message, a follow request, a “wrong number” text, or a match on a dating app. Once the scammer gets a response, they move on to “love bombing,” working to quickly establish intimacy and trust while convincing you to keep the relationship secret. They take time to build out a persona, usually one with a job or lifestyle that conveniently prevents them from ever meeting you in person.
Next comes a small request for financial help, which can escalate into opening an account, “investing” in a business venture, or guaranteeing a loan. Increasingly, these schemes involve fraudulent cryptocurrency investments (a practice also known as “pig butchering”). Once the scammers get what they want, they disappear, leaving their victims to deal with the consequences.
Romance scams work because they don’t start with obvious exploitation. Scammers build trust over weeks or months, so victims feel they’re in a genuine relationship rather than being defrauded, often realizing the truth only once they’re already in serious trouble.
How AI Is Making Romance Scams Worse
Artificial intelligence makes romance scams far easier to run. In a review of recent research, Bitdefender notes that scammers traditionally had to devote significant time and attention to each individual victim in order to gain their trust. While this long-term strategy often pays off handsomely, it limits the number of victims a single scammer can target.
Artificial intelligence removes these barriers. Large language models (LLMs) can sustain natural dialogue without the telltale signs of fraud, such as poor grammar and spelling errors. AI can mimic personality, emotion, and tone, and is less likely than a human to come across as pushy or rushed. Chatbots can also store personal details from previous conversations and weave them back in, all with very little effort from the operator, even across large numbers of victims.
Automated chatbots are particularly well suited to the early stages of a romance scam, requiring human intervention only at critical moments, such as reassuring a wavering victim or making the financial ask. Because scammers can run many conversations simultaneously, they can also test different tactics and quickly refine whatever works best at keeping victims engaged. As the Global Cyber Security Alliance notes, AI adds “speed, scale, and stability” to traditional romance scams.
Research suggests that victims may actually find AI more trustworthy than humans. McAfee found that a third of American adults believe it’s possible to develop romantic feelings for an AI-powered bot. Audio and video deepfakes make these AI-powered scams even more believable: victims can no longer treat a scammer’s refusal to get on a call as a reliable red flag.
How to Spot a Romance Scam
Even a well-trained chatbot has its limitations. According to McAfee, the most common signs that you’re talking to a bot or fake profile include scripted or repetitive responses, instant (and suspiciously polished) replies, and photos that are clearly AI-generated. Other warning signs include avoiding voice and video calls, as well as unusual requests early in the conversation.
To avoid falling for AI-assisted romance scammers, take your time. Be wary of flawlessly crafted responses, which may indicate automation. Try asking unexpected questions or introducing contradictions that could trip up a chatbot. Remember that a real relationship shouldn’t depend on secrecy or financial support. And since social media and dating sites are full of fake profiles, never take a profile photo at face value.