OpenAI's First Public ChatGPT Data Leak
After generating immense excitement with its highly human-like chatbot, ChatGPT, OpenAI faced its first public data leak, disclosed on March 24, when a software bug exposed some users' information to other active users. Despite ChatGPT's novel capabilities, the data security and governance risks it presents are largely the same risks that have always existed.
OpenAI revealed that the glitch enabled certain users to view other active users' chat history titles and that it inadvertently exposed payment-related information for 1.2% of ChatGPT Plus subscribers during a specific nine-hour window. As a result, affected users' first and last names, email addresses, payment addresses, and the last four digits and expiration dates of their credit cards were visible to others.
In a blog post addressing the incident, OpenAI reiterated its commitment to user privacy and data protection and admitted that it had fallen short of that commitment and of users' expectations.
Given the chatbot's unprecedented adoption rates, the data leak serves as a reminder of the data security risks associated with new technologies and of the importance of robust governance strategies.
Inadvertent disclosures of proprietary and confidential information aren't new, as the accidental e-discovery disclosures of 2022 showed. The risks of using a free chatbot like ChatGPT with confidential information are comparable to those of forwarding sensitive files to personal email accounts, storing confidential data in personal Dropbox folders, or downloading files to separate personal devices.
While technology and market conversations about use cases have matured, governance discussions have lagged. Some organizations have banned ChatGPT for professional use, while others have adopted more nuanced approaches. ABA guidelines provide adequate guidance for lawyers interested in using such technology. As the technology evolves, firms must consider how these innovations fit into their governance strategies. What matters is a broader strategy and governance approach for the use of large foundation models and the applications they power, rather than a focus on individual applications.