How Contact Center Agents Put Your Sensitive Data at Risk

Published May 13, 2023
Contact Center Agents May Leak Sensitive Information to ChatGPT and Other Automation Tools

Contact center agents handle sensitive customer data every day, making them privy to confidential information that poses a security risk if leaked. One of the biggest concerns right now is the use of large language models (LLMs) such as ChatGPT, which agents might use to generate auto-responses to customer queries or complaints. Without proper data security measures in place, the risk that agents paste confidential data into these models is substantial.

To make matters worse, employees are increasingly using ChatGPT and its competitors to automate tasks and simplify their jobs, even as the risk posed by unauthorized third-party call recorders rises. Cyberhaven research found that 3.1 percent of employees copy confidential data into ChatGPT, indicating the potential for information leakage. The study also highlights the severe risk that hacking could expose personally identifiable information (PII) and verbatim text sequences. As employees embrace these automation tools and security and compliance processes are bypassed, contact centers face a heightened risk of exposure.

The rise of LLMs in the workplace has given employees opportunities to augment their work processes, so it is now crucial for companies to work with agents to implement authorized self-automation tools. Contact centers need to look beyond ChatGPT alone when creating a mitigation strategy for illicit agent productivity tools. Unauthorized tools like LLMs will only become more prevalent: Gartner forecasts that 30 percent of agents will be using such solutions by 2026.

Several companies are already taking steps to mitigate the risk of information leakage through unauthorized LLMs. Businesses such as Walmart, Amazon, and Microsoft have issued warnings to employees, and JPMorgan Chase has moved quickly to restrict its workers' use of LLMs, citing compliance concerns.

Contact centers need to provide authorized alternatives to mitigate the risk of exposing confidential customer data. Self-automation has been happening for a while in the software space, and customer service reps now have better access to automation tools like LLMs. Emily Potosky, Director of Research at the Gartner Customer Service & Support Practice, said that "organizations that not only allow but authorize self-automation will become more competitive than those that don't, as reps will notice and correct inefficiencies that leaders are unaware of."
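One practical form an authorized alternative can take is automatic redaction of customer PII before any text leaves the contact center, for example in a prompt sent to an external LLM. The sketch below is illustrative only, assuming a simple regex-based approach; the patterns and the `redact` helper are hypothetical, and production systems need far more robust detection (named-entity recognition, checksum validation, and so on):

```python
import re

# Illustrative patterns for a few common PII types. Real deployments
# need broader coverage and validation than regexes alone can provide.
PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "CARD": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "PHONE": re.compile(r"\b\d{3}[ .-]?\d{3}[ .-]?\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace detected PII with labeled placeholders so the redacted
    text, not the raw customer data, is what reaches an external tool."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

message = "My card 4111 1111 1111 1111 was charged twice; email me at jo@example.com"
print(redact(message))
# → My card [CARD] was charged twice; email me at [EMAIL]
```

Routing agent prompts through a gateway like this lets the contact center log what was redacted, which supports the compliance reviews the unauthorized tools bypass.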

Contact centers that work with agents to implement new productivity tools will create more efficient and engaging ways of working. The best approach is to maintain close communication with agents, reviewing self-automation opportunities together and pinpointing appropriate solutions.