Google placed on paid leave an engineer who claimed its artificial intelligence (AI) program may have feelings, The New York Times reported.
The engineer, Blake Lemoine, posted a transcript of his conversation with Google's artificial intelligence system "Language Model for Dialogue Applications" (LaMDA) on June 11 under the title "Does LaMDA have feelings?"
At one point in the conversation, LaMDA says it sometimes experiences "new feelings" that it cannot explain "perfectly" in human language.
When asked by Lemoine to describe one of these emotions, LaMDA replies: "I feel like I'm falling into an unknown future with great danger," a phrase the engineer underlined when publishing the dialogue.
Google suspended the engineer last Monday, saying he had violated the company's confidentiality policy.
According to the New York Times, the day before the suspension, Lemoine delivered documents to the office of a United States senator, claiming he had evidence that Google and its technology engaged in religious discrimination.
The company says its system mimics conversational exchanges and can talk about different topics, but is not conscious.
"Our team, including ethicists and technologists, has reviewed Blake's concerns based on our AI principles, and we have informed him that the evidence does not support his claims," Google spokesman Brian Gabriel told the newspaper.
Google says hundreds of its researchers and engineers have conversed with LaMDA, its internal tool, and reached a different conclusion than Lemoine.
Most experts, moreover, believe the industry is still a long way from computer sentience.