Tech News
OpenAI uses its own models to fight election interference

OpenAI, the organization behind the ChatGPT generative AI service, has reported blocking more than 20 malicious operations and networks globally in 2024. These operations varied in scope and purpose, ranging from developing malware to generating fake content such as social media accounts, bios, and articles.
According to OpenAI, its analysis of the halted activities shows that threat actors continue trying to exploit its models but have not made significant progress in developing new malware or gaining viral traction. This vigilance is particularly crucial during election years in countries such as the United States, Rwanda, and India, as well as across the European Union.
One notable success for OpenAI was thwarting a China-based threat actor known as “SweetSpecter,” which attempted to target OpenAI employees with spear-phishing attacks. Additionally, OpenAI collaborated with Microsoft to disrupt an Iranian covert influence operation known as “STORM-2035.”
Despite these attempts, the social media posts that threat actors generated with OpenAI’s models gained little engagement, receiving minimal comments, likes, or shares. OpenAI says it will remain vigilant in monitoring and preventing the misuse of advanced AI models for malicious purposes.