
Rolling with the times: Technological progression drives criminal innovation | Travel Security Special Edition 2024

Written by Shannon Lorimer | Dec 28, 2023

The recent release of generative artificial intelligence tools, together with growing user engagement across a proliferation of digital dating and payment platforms, has enhanced criminals' opportunities and capabilities to carry out virtual scams, including extortion and fake kidnappings, writes Shannon Lorimer.

In June 2023, the Federal Bureau of Investigation (FBI) reported an increase in US extortion cases using “deepfakes” – manipulated versions of original photographs and videos, extracted from victims’ social media, private messages and video chats – to coerce relatives into paying extortion or ransom demands. Artificial intelligence (AI)-powered content has increasingly been used in virtual kidnappings and other scams, with several incidents over the last year highlighting the downside of rapidly advancing digital technologies. Compounding this, continually rising user engagement with online social media platforms and digital applications has given criminals greater opportunities to engage with potential targets. As criminals exploit these tools and platforms for financial gain, opportunities emerge for more polished crimes on a significantly larger scale.

AI-empowered: Criminal tactics evolve

Where online extortion and virtual kidnappings have historically been perpetrated by hackers with sophisticated skills, criminals with limited capabilities and resources can now easily and realistically mimic an individual, enhancing the credibility of a threat and convincing the intended victim that a relative or employee is in immediate danger. Tools designed specifically for criminal purposes are also emerging; in August 2023, English- and Russian-language advertisements appeared on popular messaging platforms and hacking forums offering AI-powered software and training to create deepfake videos for phishing scams, extortion, and fraud. Training cost between USD 20 and USD 200, while ready-made videos were priced at USD 250.

AI’s ability to automate interactions with potential victims further boosts criminal arsenals. Just as businesses increasingly use chatbots to assist online users, criminals can set up automated responses impersonating people and businesses like banks, generating faster engagement with thousands of potential victims. Increased use of such technology in the commercial space also makes it difficult to distinguish legitimate bots from deceitful ones.

AI-powered scams

India

In July 2023, criminals in Kerala used deepfakes to impersonate their victim’s former colleague, requesting money for a medical emergency. After the victim transferred INR 40,000 (USD 480), the perpetrators requested more money, prompting the victim to alert the authorities.

US

In January 2023, a woman in Arizona reported that criminals used AI to clone her daughter’s voice, claiming that she had been kidnapped. During ransom negotiations, relatives located the daughter, who was safe and on holiday at the time.


A wider net

Kidnappers, extortionists, and scammers are also capitalising on increased engagement with online social media platforms and apps to mine for personal information and bait victims.

Online dating remains extremely popular; 366 million people used dating apps in 2022, and the most popular platform, Tinder, reported 70 million users worldwide in 2023. High user volumes have widened the available victim pool for criminals looking to exploit vulnerable individuals. Notable cases of traditional and express kidnapping facilitated through dating apps have been recorded in Brazil, South Africa, and the US, in some cases via LGBT dating platforms. In Johannesburg, South Africa, seven men baited and kidnapped a university student through the LGBT dating app Grindr, demanding USD 30,000 from his family. The ease with which criminals can identify and lure potential targets – especially amid a popular tendency to share itineraries, locations, and other personal information – will continue to provide thousands of potential victims for such crimes.

Meanwhile, the increased availability and use of money transfer and banking apps, like Brazil’s PIX, has prompted concerns that express kidnappings, scams, and even muggings could become easier and more lucrative. Brazilians lost around BRL 2.5 billion (USD 500 million) to financial scams in 2022, 70 percent of which were conducted via PIX. The ability to transfer large amounts of money via phone or laptop also reduces the need to drive victims to multiple ATMs and hold them overnight to circumvent withdrawal limits, a common feature of express kidnappings.

Case studies


Criminal groups specialise in dating app scams in São Paulo

In São Paulo, a surge in online dating scams has coincided with the growth of dating services and mobile payment applications. Police statistics indicate that nine out of ten kidnappings were orchestrated through dating apps, often targeting men between the ages of 30 and 65. Malicious actors use fake profiles to arrange dates, luring victims to isolated areas, where victims report being met by pairs or groups of kidnappers who use force to demand access to their bank accounts.

In June 2023, a doctor was kidnapped in Brasilândia, a district of São Paulo, after arranging a meeting through a dating application. The victim was abducted by three armed men and held hostage for 30 hours until he was rescued. While he was held captive, the kidnappers transferred BRL 180,000 (USD 36,807) from his account.


Banking apps augment kidnappings in South Africa

South African banks have encouraged the adoption of digital payment methods, including linking bank cards to smartphones for contactless payments. This coincides with a recent uptick in reported kidnappings in which perpetrators force victims to transfer all available funds from their digital wallets to accounts outside South Africa, or to make large online purchases. Such cases are also increasingly initiated through vehicle hijackings at gunpoint. The banking sector has reported incidents perpetrated by specialised syndicates as well as opportunistic individuals, with businesspeople and individuals perceived to be wealthy specifically targeted. Professional platforms like LinkedIn, where users display up-to-date information and contact details, have already been targeted by criminals using web scraping tools to compile victim lists.

A losing battle? 

Regulations around AI are still catching up, and mechanisms to limit the use of AI-powered tools for criminal purposes are similarly lacking. Some governments have made progress in developing regulatory frameworks, including obligations for creators to disclose the use of AI-generated and manipulated content, although such regulations are difficult to enforce. Some companies have also developed software that can identify manipulated voices and images, although these solutions are often geared towards the commercial sector rather than private individuals. Meanwhile, applications like Tinder have introduced more rigorous user verification processes, while PIX has sought to protect users by setting transfer limits and restricting nighttime transactions, the hours in which express kidnappings typically take place.

Nevertheless, criminals will continue to leverage technological advancements to improve their ability to circumvent these measures and operate undetected. The wider adoption of AI technologies, alongside a growing online audience, will contribute both to evolving criminal tactics and to opportunities to carry out more convincing, refined scams in the coming years.