What Are The Privacy Risks Of ChatGPT-4o? “They Stress That the Training Information Is Not Used to Profile People, or to Learn About Them”

May 18th, 2024

Via: Forbes:

The privacy implications of ChatGPT are two-pronged, says Oliver Willis, partner at BDB Pitmans. “From a user’s perspective, how does ChatGPT collect and use data about you when you are using it? From everyone else’s perspective, was ChatGPT trained on information about you and what will it tell users about you?”

In its privacy policy, OpenAI acknowledges that the information used to train ChatGPT includes personal data, says Willis. “They stress that the training information is not used to profile people, or to learn about them, but some people will see the use of this data as inherently intrusive.”

OpenAI also acknowledges that using personal data to train ChatGPT means responses sometimes include information about individuals. “OpenAI offers a mechanism for restricting the use of their data to train ChatGPT, but it is less clear what OpenAI will do for someone who objects to it disclosing their personal data in a chat response,” says Willis.

ChatGPT collects all the data inputted by a user and will retain that information indefinitely to train its models unless you opt out—which isn’t easy to do, says Matthew Holman, partner at Cripps LLP.

“In practice it is really hard for individuals to exercise GDPR rights against large language models (LLMs) such as ChatGPT,” says Holman. For example, he says, it can “create inaccurate information or hallucinate; it is able to change information without explanation and it can be almost impossible to have your data erased once it is inputted into the LLM.”
