iTnews Asia

ChatGPT can help software supply-chain attackers

More security woes for LLMs.

By Richard Chirgwin on Jun 7, 2023 11:55AM

ChatGPT’s tendency to “hallucinate” could spell trouble for software developers, since attackers can exploit it to slip malicious packages into their development environments.

Ortal Keizman and Yair Divinsky of security company Vulcan warned of the risk after researching how ChatGPT might be made a vector for software supply-chain attacks.

“We’ve seen ChatGPT generate URLs, references, and even code libraries and functions that do not actually exist. These LLM (large language model) hallucinations have been reported before and may be the result of old training data,” they wrote.

“If ChatGPT is fabricating code libraries (packages), attackers could use these hallucinations to spread malicious packages without using familiar techniques like typosquatting or masquerading.”

Those familiar techniques are well known and detectable, Vulcan said, but if an attacker instead publishes a real package under a hallucinated name, a victim could be tricked into downloading and using it.

Referring to their technique as “AI package hallucination”, the researchers said if an attacker asks the chatbot to find a package to solve a problem, some of its responses may be hallucinations, complete with false links.

“This is where things get dangerous”: if ChatGPT recommends packages that are not published in a legitimate package repository, an attacker could then post a malicious package under the hallucinated name.

“The next time a user asks a similar question they may receive a recommendation from ChatGPT to use the now-existing malicious package,” the researchers wrote.
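One defensive measure developers can take is to verify that a recommended package actually resolves on the public registry before installing it. The sketch below is an illustration, not Vulcan's tooling: it checks a name against PyPI's JSON API, and the `status_for` parameter is a hypothetical injection point added so the decision logic can be exercised without network access.

```python
import urllib.error
import urllib.request

PYPI_JSON_API = "https://pypi.org/pypi/{name}/json"

def fetch_status(url: str) -> int:
    """Return the HTTP status code for `url`; PyPI answers 404 for unpublished names."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

def package_exists(name: str, status_for=fetch_status) -> bool:
    """True if `name` is published on PyPI.

    A hallucinated package name returns 404 -- until an attacker claims it,
    which is exactly the window the researchers describe.
    """
    return status_for(PYPI_JSON_API.format(name=name)) == 200
```

Note that existence alone is not proof of safety: once an attacker registers the hallucinated name, this check passes, so it would need to be combined with signals such as package age, maintainer history and download counts.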

The researchers tested the approach using popular coding questions gathered from forums such as Stack Overflow, asking ChatGPT questions about languages including Python and Node.js.

For Node.js, 201 questions produced 40 answers referencing more than 50 non-existent packages, while 227 Python questions drew answers referencing more than 100 non-existent packages.
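A survey harness of this kind can be sketched simply: feed the model a question, scan its answer for install commands, and collect the package names for checking against the registry. The snippet below is a minimal illustration (the researchers' actual tooling is not described in the article), and the package names in the sample answer are made up for the example.

```python
import re

# Candidate package names are pulled from `pip install ...` commands in an
# LLM answer; the names below are illustrative, not from the actual research.
INSTALL_RE = re.compile(r"pip install\s+([A-Za-z0-9_.\-]+)")

def extract_packages(answer: str) -> list[str]:
    """Return every package name that follows `pip install` in the text."""
    return INSTALL_RE.findall(answer)

answer = "Try `pip install example-helper` and then `pip install requests`."
print(extract_packages(answer))  # → ['example-helper', 'requests']
```

Each extracted name could then be looked up on PyPI (or npm, for Node.js answers); names that do not resolve are exactly the hallucinations an attacker could later claim.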

Copyright © iTnews.com.au . All rights reserved.
Tags:
chatgpt hallucination nodejs python security
