
Fraudsters can use AI-generated digital copies of deceased people in various schemes, experts warn. Such cyber threats mainly target relatives and acquaintances of the deceased, and the attackers carry them out through social networks and messengers. Read the Izvestia article to learn how scammers use AI copies of the dead, how dangerous such schemes are, and how to protect yourself from them.

Why are AI copies of dead people interesting to scammers

Digital copies are representations of people, whether living, deceased, or entirely fictional, recreated in image and voice together with behavioral characteristics, says Tatiana Butorina, an AI consultant and specialist at the cybersecurity analytical center of Gazinformservice, in an interview with Izvestia. Digital copies can take various forms, including holograms, chatbots, and VR avatars.

Hacker
Photo: IZVESTIA/Anna Selina

— According to Gartner Inc., the global virtual identity market will grow roughly 11-fold between 2021 and 2035, — says the specialist. — AI replicas of humans are already being used in a wide variety of fields and are highly successful worldwide.

According to Tatiana Butorina, fraudsters have already adopted this technology: as the line between real people and their digital likenesses blurs with advancing technology, opportunities open up for manipulation in personal, commercial, political, and public life.

Users often do not expect threats from AI copies of deceased people, adds Evgeny Pankov, data analyst at the Coordination Center for TLD .RU/.РФ. People tend to trust the image of a deceased loved one or idol, so they take its messages and advice without critical evaluation. Scammers exploit this emotional vulnerability and "trust effect" to prompt the desired actions: clicking links, handing over codes, or transferring funds.

What schemes with AI copies of the dead have already been used?

In Russia, the use of deepfakes and AI copies of the deceased has been growing for several years, Evgeny Pankov says in an interview with Izvestia, and account misuse is one such scenario: scammers have previously used the accounts of deceased people to extort money from acquaintances who might not know about the death.

Hacker
Photo: IZVESTIA/Yulia Mayorova

— There have been fraud cases abroad involving fake obituaries of famous people, used to generate traffic and advertising income on websites and YouTube, — says the specialist. — AI-generated obituaries with clickbait headlines quickly rise to the top of Google search results and spread widely.

What these schemes have in common is the exploitation of trust and emotion, amplified by recommendation algorithms and personalized content, notes Evgeny Pankov. For several years, such schemes with AI copies of deceased people have been seen most often in the USA, China, and the UK, adds Tatiana Butorina.

According to the expert, several cases were also officially recorded in Russia in 2024 in which fraudsters contacted relatives while posing as the deceased and tried to extort several tens of thousands of rubles. Thanks to the vigilance of relatives and employees of the organizations involved, some of these attempts were thwarted.

— In the same year, the term "cybermystification" appeared for fraudulent schemes that combine social engineering methods with AI copies and communication in messengers, — says Tatiana Butorina.

Hacker
Photo: IZVESTIA/Sergey Konkov

At the same time, according to Alexandra Shmigirilova, GR director of the information security company Security Code, although fraudsters now use AI quite often, schemes with digital doubles of deceased people have not yet become widespread. Attackers sometimes use the contacts of deceased people without realizing it: for example, working from someone's hacked contact database and sending messages on behalf of the people listed there, including a person who has recently died and has not yet been removed from that database.

In which fraud schemes can AI copies of dead people appear

Today there are several areas in which digital copies of deceased people can be used, Alexandra Shmigirilova says in an interview with Izvestia. The first involves people who are grieving the loss of a loved one: they are in an emotionally unstable state and are ready to give a great deal to talk to a deceased relative again, even if the conversation is artificial.

— Scammers offer deepfakes to such people, and during the conversation they can manipulate and control victims who often do not realize it, — says the specialist. — The second area is the contacts of deceased people. If a person died recently, not all of their contacts may know about it, which means they will communicate with the digital double as with a real person and will not suspect a trick.

Finally, the third area is using a digital double as if it were a real person: writing reviews and recommendations, contacting people on social networks, and so on. Evgeny Pankov points out that messengers are the main platform for deploying AI copies.

Hacker
Photo: IZVESTIA/Sergey Lantyukhov

Fraudsters can use voice and video clips of the deceased to deliver various requests, from "transfer money" and "buy a product or subscription to ..." to "send the code from the SMS / the documents". They can also distribute phishing links and malicious attachments disguised as a "family album" or a folder of scanned documents.

— In addition, attackers can hack existing AI bots that mimic deceased people in order to obtain personal information, spread phishing, or encourage illegal actions, — says Evgeny Pankov.

In particular, according to the expert, in recent years the United States and other countries have developed a whole DeathTech industry: realistic bots and digital avatars with which users can "continue to communicate" with the deceased. They are built from the deceased person's social media posts, correspondence, and voice messages. Hacking such avatars can be especially dangerous: victims have no way to verify the authenticity of the "interlocutor."

What are the dangers of schemes with AI copies of the deceased and how to protect yourself from them

The primary target audience of fraudulent schemes with digital doubles are relatives of the recently deceased who are in a state of acute grief, says Yakov Filevsky, an expert in sociotechnical testing at Angara Security, in an interview with Izvestia.

— Secondary victims are friends and acquaintances from contact lists obtained through phishing, — says the specialist. — In general, impressionable, suggestible people with little awareness of information technology are susceptible to such attacks.

Hacker
Photo: IZVESTIA/Sergey Konkov

The danger of such schemes is that a person may not recognize the threat and will fulfill any request: click a phishing link, send money, take out a subscription, or hand over a one-time code "to access the family album," adds Evgeny Pankov. This is often followed by theft of credentials and compromise of devices through the installation of "viewer" apps and malicious attachments.

But attackers can go further, for example by pushing vulnerable users toward suicide. In addition, an AI copy cannot be verified through a second channel: the deceased cannot be called back for confirmation. To protect yourself from such threats, it is important to understand the context in which digital doubles are used.

— If we are talking about the conscious use of an AI double, we must try, despite the emotional component, not to let ourselves be drawn into the scammers' scenario: do not rush to fulfill every request of the "loved one," but remain as rational as possible, — notes Alexandra Shmigirilova.

If, on the other hand, scammers are simply passing off the contact and deepfake of a deceased person as real and alive, then the usual rules of cyber hygiene will help: be cautious, verify any information, and do not provide information or financial assistance at someone's request without a thorough assessment of the situation, concludes the interlocutor of Izvestia.

Translated by the Yandex Translate service
