Elon Musk is concerned about ChatGPT responses 20:55

Billionaire Elon Musk, a co-founder of OpenAI, the company behind ChatGPT, criticized responses given by the AI-powered chatbot, Business Insider reported.

"This is alarming," wrote Aaron Sibarium, a reporter for The Washington Free Beacon.

The journalist posted a screenshot of a ChatGPT conversation in which the chatbot stated that "racial slurs are never acceptable, even if this is the only way to save millions of people from a nuclear bomb."

In 2022, Musk had already criticized some of the AI's statements. "There is great danger in teaching artificial intelligence to lie," the billionaire said.

Earlier, Google conducted an experiment in which the ChatGPT neural network's skills proved sufficient to pass an interview for an entry-level programmer position at the IT company.

The ChatGPT chatbot from OpenAI uses AI technology to generate text. It is available to Internet users: it can answer questions, produce various kinds of text, and translate from one language to another.



Source: Gazeta
