Russians given cybersecurity rules for using chatbots

Chatbots like ChatGPT are unfamiliar with the concept of “personal data” and do not know how to handle it, so you should talk to them as you would to a stranger in order to avoid leaks. That is according to Zakhar Bushuev, a developer and co-founder of the IT company i2crm, speaking to socialbites.ca.

“The bot keeps everything it receives in its memory, so there is a possibility of information leakage. This can happen if the bot decides to use one person’s data in dialogues with other users,” the expert explained.

To prevent such incidents, Bushuev recommended following a few rules of digital hygiene. Above all, be aware of the value of your data: never tell the bot app passwords, bank card codes, or addresses, nor personal details such as the names of friends and relatives.

“This may seem trivial at first glance, but extra information (say, a wife’s maiden name) can often be used to find and exploit other data,” the expert said.

Generative bots like ChatGPT are sometimes used to draft legal documents. In that case, according to Bushuev, the addresses, monetary amounts, and names of the companies and individuals involved in a dispute must not be shared; he says it is better to use fictitious identifiers.
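
To illustrate that advice, here is a minimal Python sketch (not part of the article; all names, values, and patterns below are invented for illustration) of how a document could be pseudonymized locally before being pasted into a chatbot, with the real values restored in the answer afterwards:

```python
import re

# Mapping of real identifiers to fictitious placeholders (all values
# here are invented). The table is built per document and kept offline,
# so the real values never leave your machine.
REPLACEMENTS = {
    "Ivan Petrov": "PERSON_1",
    "OOO Romashka": "COMPANY_1",
    "12 Tverskaya St., Moscow": "ADDRESS_1",
}

# Crude pattern for monetary amounts such as "150000 rubles" (illustrative only).
AMOUNT_PATTERN = re.compile(r"\b\d[\d\s]*\s?(?:rubles|RUB)\b")

def pseudonymize(text: str) -> tuple[str, dict]:
    """Swap known identifiers and amounts for placeholders.

    Returns the sanitized text plus a reverse map, so the real values
    can be put back into the bot's answer later.
    """
    reverse = {}
    for real, placeholder in REPLACEMENTS.items():
        if real in text:
            text = text.replace(real, placeholder)
            reverse[placeholder] = real
    for i, amount in enumerate(AMOUNT_PATTERN.findall(text), start=1):
        placeholder = f"AMOUNT_{i}"
        text = text.replace(amount, placeholder, 1)
        reverse[placeholder] = amount
    return text, reverse

def restore(text: str, reverse: dict) -> str:
    """Re-insert the real values into the bot's response, locally."""
    for placeholder, real in reverse.items():
        text = text.replace(placeholder, real)
    return text

if __name__ == "__main__":
    draft = ("Claimant Ivan Petrov demands that OOO Romashka, located at "
             "12 Tverskaya St., Moscow, pay 150000 rubles in damages.")
    safe, mapping = pseudonymize(draft)
    print(safe)  # only this sanitized text would be sent to the chatbot
    # answer = send_to_bot(safe)         # hypothetical call to the bot
    # print(restore(answer, mapping))    # real values restored offline
```

The key point is that the mapping between real and fictitious identifiers stays on the user’s machine; only the sanitized text ever reaches the bot.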

Where bots are used as a working tool in a corporate environment, the expert suggested drawing up internal rules that define personal data, trade secrets, and any other information that must never be passed to such an application.

Earlier, socialbites.ca wrote that Samsung employees’ conversations with ChatGPT led to the leak of some of the company’s confidential data.
