State Duma deputy Kiryanov explained how to use neural networks safely

When using neural networks, you should not mindlessly disclose important personal data, said Artem Kiryanov, Deputy Chairman of the State Duma Committee on Economic Policy, in an interview with Reedus. Earlier, personal information of ChatGPT users was leaked online: names and surnames, bank card numbers, email addresses, and billing addresses were freely accessible.

“The Russian authorities are developing new data protection mechanisms, but unfortunately it should be understood that it is impossible to completely eliminate the risk of personal information leaks. Users should try not to disclose vital data when using digital services. This is especially true for new applications that have not yet been tested. In addition, if we are talking about ChatGPT, we must remember that it is an external service, and we have practically no way to influence it,” Kiryanov explained.

He added that the ChatGPT chatbot often produces fabricated information in response to certain user requests.

Earlier, an analog of ChatGPT was created in Russia. The domestic company Sistemma has released its SistemmaGPT neural network, which is already available for business testing.
