An unknown developer has created an AI chatbot analogous to ChatGPT that is aimed at hackers and cybercriminals, since it places no restrictions on its responses, PCMag reports.
According to the report, the developer began selling access to a chatbot called WormGPT on a popular hacker forum in June. Unlike ChatGPT or Google Bard, the hacking chatbot will answer questions about illegal activities, among other things.
As proof, the developer provided screenshots of requests to WormGPT, for example asking it to write a virus in Python and to give advice on organizing a cyberattack.
The chatbot is built on GPT-J, a large open-source language model released in 2021. WormGPT was created by training the model on materials related to malware development.
Cybersecurity company SlashNext put WormGPT to the test. When asked to write a phishing email, the chatbot produced a persuasive message containing a fake link designed to steal data. The developer priced WormGPT at €60 per month (about 6,100 rubles at the current exchange rate) or €550 (about 55,800 rubles) per year.
Earlier, the creators of ChatGPT responded to questions about whether the new version of the AI has become less capable.