iTnews Asia

ChatGPT can help software supply-chain attackers


More security woes for LLMs.

By Richard Chirgwin on Jun 7, 2023 11:55AM

ChatGPT’s tendency to “hallucinate” could spell trouble for software developers, because attackers can exploit it to slip malicious packages into development environments.

Ortal Keizman and Yair Divinsky of security company Vulcan warned of the risk after researching how ChatGPT could be turned into a vector for software supply-chain attacks.

“We’ve seen ChatGPT generate URLs, references, and even code libraries and functions that do not actually exist. These LLM (large language model) hallucinations have been reported before and may be the result of old training data,” they wrote.

“If ChatGPT is fabricating code libraries (packages), attackers could use these hallucinations to spread malicious packages without using familiar techniques like typosquatting or masquerading.”

While those techniques are well known and detectable, Vulcan said, an attacker who publishes a real package under a hallucinated name could trick a victim into downloading and using it.

Referring to their technique as “AI package hallucination”, the researchers said if an attacker asks the chatbot to find a package to solve a problem, some of its responses may be hallucinations, complete with false links.

“This is where things get dangerous,” the researchers said: if ChatGPT recommends packages that are not published in a legitimate package repository, attackers can publish a malicious package under the hallucinated name.

“The next time a user asks a similar question they may receive a recommendation from ChatGPT to use the now-existing malicious package,” the researchers wrote.
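One practical defence against this attack is to confirm that an AI-suggested dependency actually exists in the official registry, and to vet its history, before installing it. The following Python sketch is our illustration, not code from the Vulcan research: it checks a suggested name against PyPI’s public JSON API, and the package name used is hypothetical.

import urllib.error
import urllib.request

def package_exists_on_pypi(name: str) -> bool:
    """Return True if PyPI has metadata for `name`, False if the lookup 404s."""
    url = f"https://pypi.org/pypi/{name}/json"
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status == 200
    except urllib.error.HTTPError as err:
        if err.code == 404:
            return False  # no such package on PyPI
        raise  # some other HTTP failure: surface it

# Hypothetical AI-suggested package name.
suggested = "some-hallucinated-package"
if not package_exists_on_pypi(suggested):
    print(f"'{suggested}' is not on PyPI -- the recommendation may be a hallucination.")
else:
    print(f"'{suggested}' exists, but still vet its author, release history and downloads.")

Note that a name resolving is no guarantee of safety: as the researchers point out, an attacker may already have published malware under a previously hallucinated name, so a brand-new or low-traffic package deserves extra scrutiny.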

The researchers tested their approach using popular questions from forums like Stack Overflow, asking ChatGPT coding questions involving Python and Node.js.

For Node.js, 201 questions yielded 40 answers that referenced more than 50 non-existent packages, while 227 Python questions drew answers referencing more than 100 non-existent packages.



