Hackers are turning Artificial Intelligence into a weapon

English Section / February 16

Technology adopted by criminals eventually becomes a weapon. Hackers are using AI tools such as ChatGPT to improve their cyberattacks, warn OpenAI and Microsoft, which have released a joint report stating that hackers are already using large language models to strengthen their tactics. Various other organizations have issued similar warnings before, but the two American companies also bring evidence: they say Chinese, Russian, Iranian, and North Korean hacker groups are already using tools like ChatGPT to write better scripts and refine their social engineering techniques.

The latter are used to craft more convincing phishing campaigns, which trick users more easily into handing over personal information. There are also cases with military implications. For example, Strontium, a hacker group backed by Russian intelligence, used AI to research satellite communication protocols, radar data, and various other sensitive technical parameters. The two companies' warning is all the more striking because their own AI tools, ChatGPT and Copilot, can be used by hackers in such attacks.