Using ChatGPT for coding? Beware of sharing sensitive data

Published On Fri May 12 2023

OpenAI’s ChatGPT, a large language model that generates useful text from a prompt, gained over 100 million users in just 60 days. Microsoft invested $10 billion in OpenAI, embedded GPT-4 into Bing, and made dedicated Bing apps with GPT-4 available on both the App Store and Play Store. Many users have found it useful, including programmers, who use it to reduce their coding time.

However, Rob Nicholls of UNSW Business School warns people to be careful about what company information they give away while using ChatGPT. If a job description or prompt contains information that competitors could use to identify a business, the company's security is at risk.

Samsung engineers used ChatGPT to cut development time while writing code but ended up giving away sensitive data that could compromise the company's security. Italian authorities also banned ChatGPT over privacy concerns, arguing that the data ChatGPT collects breaches the European General Data Protection Regulation. In response, an age-verification check (confirming users are over 18) is expected to be introduced by the end of April.

Generative AI draws on billions of data points to produce text predictively, and it improves in response to user feedback. The challenge for businesses that employ curious people is that prompts and feedback may include confidential company information. The solution is simple in theory but much harder in practice: if the material would not normally be disclosed outside the business, it should not be used as a prompt for ChatGPT or Bing.
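
One practical safeguard is to screen prompts for obvious secrets before they leave the company network. The sketch below is a minimal illustration of that idea, not part of any ChatGPT tooling: the patterns, function names, and example prompt are all hypothetical, and a real deployment would use a dedicated secret scanner together with company-specific rules.

```python
import re

# Hypothetical patterns for common credential formats. A real deployment
# would use a maintained secret scanner and rules tuned to the company.
SECRET_PATTERNS = [
    re.compile(r"AKIA[0-9A-Z]{16}"),                        # AWS access key ID format
    re.compile(r"-----BEGIN (RSA |EC )?PRIVATE KEY-----"),  # PEM private key header
    re.compile(r"(?i)\b(password|passwd|secret|api[_-]?key)\s*[:=]\s*\S+"),
]


def find_secrets(prompt: str) -> list[str]:
    """Return the substrings in `prompt` that match a known secret pattern."""
    hits = []
    for pattern in SECRET_PATTERNS:
        hits.extend(match.group(0) for match in pattern.finditer(prompt))
    return hits


def safe_to_send(prompt: str) -> bool:
    """Gate a prompt before it is forwarded to an external chatbot."""
    hits = find_secrets(prompt)
    if hits:
        print(f"Blocked: prompt contains {len(hits)} potential secret(s).")
        return False
    return True


if __name__ == "__main__":
    prompt = "Fix this config for me: api_key = sk-live-123456"
    if safe_to_send(prompt):
        pass  # only now forward the prompt to the chatbot
```

Pattern matching like this only catches known formats; it complements, rather than replaces, the judgement of not pasting confidential material in the first place.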

So, next time you use ChatGPT, ask yourself whether the prompt contains confidential data that should not be shared outside the company. If the answer is yes, do not send it.