Chatbots and Personal Data: Safeguards for North American Users


Chatbots such as ChatGPT can handle information in ways that users might not expect, especially when it comes to personal data. To minimize the risk of data leakage, each conversation should be approached as if speaking with a stranger, and sensitive details should be kept private. This precautionary stance is advised by Zakhar Bushuev, a programmer, developer, and co-founder of the IT firm i2crm.

According to the expert, a bot can retain everything shared with it during a session, and that stored information could inadvertently surface in dialogues with other users. This potential for leakage makes it important to consider both what is shared with the bot and how it is phrased.

To reduce exposure, a straightforward set of digital hygiene practices is recommended. Users should recognize the importance of personal data and avoid sharing passwords, bank card numbers, home addresses, or any details tied to private individuals, including friends and family members. Even what appears to be common knowledge, like a spouse’s maiden name, can be leveraged to infer more sensitive data about someone else.

When bots are used to draft legal documents, it is prudent not to disclose addresses, monetary figures, or the names of any companies or individuals involved in a dispute. Using fictional identifiers in these scenarios is a safer approach that prevents real data from being exposed or misused.
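The substitution approach described above can be sketched in a few lines. This is a minimal illustration, not a production anonymizer: the dispute details, placeholder names, and helper functions below are hypothetical, and a real workflow would need more robust matching than plain string replacement.

```python
# Sketch: swap real dispute details for fictional placeholders before
# sending a draft to a chatbot, then restore them in the bot's reply.

def pseudonymize(text, mapping):
    """Replace each real value with its fictional placeholder."""
    for real, placeholder in mapping.items():
        text = text.replace(real, placeholder)
    return text

def restore(text, mapping):
    """Swap placeholders back to the real values in the bot's output."""
    for real, placeholder in mapping.items():
        text = text.replace(placeholder, real)
    return text

# Hypothetical dispute details -- the left-hand values never leave the machine.
mapping = {
    "Acme Logistics LLC": "Company A",
    "John Smith": "Party B",
    "$48,500": "[AMOUNT]",
    "12 Oak Street, Toronto": "[ADDRESS]",
}

draft = ("Acme Logistics LLC owes John Smith $48,500 "
         "for goods delivered to 12 Oak Street, Toronto.")
safe_draft = pseudonymize(draft, mapping)
print(safe_draft)
# Company A owes Party B [AMOUNT] for goods delivered to [ADDRESS].
```

Only `safe_draft` would be sent to the bot; the local `mapping` lets the user reinsert the real names and figures into whatever the bot returns.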

For organizations that rely on bots as a daily working tool, establishing clear internal rules is essential. This includes defining what constitutes personal data, protecting trade secrets, and specifying the categories of information that should never be transmitted to any application or bot.

There have been notable instances where corporate communications with AI tools have led to the inadvertent leakage of confidential information. These events underscore the importance of implementing robust policies, secure data handling practices, and ongoing education for employees about data privacy and the risks associated with automated assistants.
