Hackers have been found selling stolen Generative AI data and account credentials on the dark web, exploiting the technology’s growing popularity to net big rewards.
New research from eSentire’s Threat Response Unit (TRU) has identified that over 400 GenAI account credentials are sold by cybercriminals every day. These are primarily obtained from corporate end users’ computers infected with infostealer malware, which harvests anything the user has entered into their web browser. This can include sensitive information such as bank details, financial records, customer data, and login credentials.
If an end user is subscribed to a GenAI service or model, those credentials are swept up along with everything else. The resulting ‘stealer log’ of captured data is then sold for around $10. OpenAI credentials are reported to be the most commonly stolen, with an average of 200 listed for sale each day.
LLM Jacking
Elsewhere, findings from security research organisation Sysdig show that threat actors are also gaining control of large numbers of LLMs (Large Language Models) in a process dubbed ‘LLM jacking’. TRU warns that the attackers’ aim is to acquire, resell, and abuse access to these LLMs.
Sysdig has confirmed that LLM jacking operations often use a reverse proxy to resell and monetise the hijacked LLM access, and has warned that an attack of this kind could cost the victim up to $46,000 per day in consumption costs.
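To put that figure in context, the rough sketch below shows how quickly API consumption can add up when a stolen key is used to serve someone else’s traffic. The per-token prices and request volumes are illustrative assumptions for the example, not figures from Sysdig’s research.

# Illustrative estimate of the daily consumption cost run up by a hijacked LLM account.
# All prices and volumes below are assumed for illustration; they are not vendor figures.

PRICE_PER_1K_INPUT_TOKENS = 0.03    # assumed price (USD) per 1,000 input tokens
PRICE_PER_1K_OUTPUT_TOKENS = 0.06   # assumed price (USD) per 1,000 output tokens

def daily_cost(requests_per_day, avg_input_tokens, avg_output_tokens):
    """Rough daily spend if a stolen key is used to serve third-party traffic."""
    input_cost = requests_per_day * avg_input_tokens / 1000 * PRICE_PER_1K_INPUT_TOKENS
    output_cost = requests_per_day * avg_output_tokens / 1000 * PRICE_PER_1K_OUTPUT_TOKENS
    return input_cost + output_cost

# Example: resold access handling 500,000 requests a day, roughly 1,000 tokens in and out per request.
print(f"${daily_cost(500_000, 1_000, 1_000):,.0f} per day")  # -> $45,000 per day

At that kind of volume, a single stolen key left unmonitored for even a day or two can generate a bill on the scale Sysdig describes.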
Underground stores such as LLM Paradise have used this tactic to obtain and sell stolen GenAI credentials, even brazenly advertising on platforms like TikTok. While that site has since been shut down, a thriving market ensures that many others remain in its place.
As the use of AI has grown, so too has the threat of cybercriminals discovering new ways to profit from stolen data. Companies are advised to maintain rigorous security measures, such as establishing robust vulnerability management processes, monitoring for suspicious activity, and enforcing multi-factor authentication.
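As a simple illustration of what monitoring for suspicious activity can look like in practice, the sketch below flags days where API spend jumps far above a trailing baseline. The data source, spike threshold, and figures are assumptions made for the example; real deployments would pull spend data from the provider’s billing or usage API.

# Minimal sketch of a spend-anomaly check, assuming daily API spend can be exported
# (e.g. from a billing dashboard) as a simple list of (date, usd) pairs.

from statistics import mean

SPIKE_FACTOR = 3.0  # assumed threshold: flag any day over 3x the trailing average

def flag_spend_spikes(daily_spend, window=7):
    """Return (date, usd) entries whose spend exceeds SPIKE_FACTOR x the trailing average."""
    flagged = []
    for i in range(window, len(daily_spend)):
        baseline = mean(usd for _, usd in daily_spend[i - window:i])
        date, usd = daily_spend[i]
        if baseline > 0 and usd > SPIKE_FACTOR * baseline:
            flagged.append((date, usd))
    return flagged

# Example: a week of normal usage followed by a sudden jump worth investigating.
history = [("2024-06-%02d" % d, 120.0) for d in range(1, 8)] + [("2024-06-08", 9500.0)]
print(flag_spend_spikes(history))  # -> [('2024-06-08', 9500.0)]

A check like this will not stop credential theft on its own, but it can surface the tell-tale consumption spike of a hijacked account before the bill reaches the scale described above.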