Beware of Edit Pro AI: pretends to be a real AI video generator, but actually installs infostealer malware
If you happen to see an ad or commercial for Edit Pro AI, stay as far away as possible. It pretends to be a real AI-powered image and video creation tool, but in reality it installs infostealer malware.
Cybersecurity expert @g0njxa found out that somebody has been advertising an AI image and video editor called Edit Pro AI.
In an ad that is still available on X today, Edit Pro AI shows off its ability to create realistic, professional-looking deepfake videos. The commercial shows US President Joe Biden and President-elect Donald Trump eating ice cream cones, drinking wine, riding a motorcycle, horseback riding, fishing, and hiking in the woods together.
“Start your visual journey right now!” the ad says.
The video footage looks stunning and professional, but is nothing more than a deepfake designed to impress and persuade gullible people to download the video editing software.
However, as soon as they download the software, they’re in for a surprise.
Instead of receiving the video editing software, they install Lumma Stealer or AMOS. Lumma Stealer is infostealer malware aimed at Windows users, while AMOS was designed to target macOS users.
In both cases, the malicious software steals cryptocurrency wallets, cookies, credentials, saved passwords, credit card details, and browsing history from Mozilla Firefox and from Chromium-based browsers such as Google Chrome and Microsoft Edge.
According to @g0njxa, the malware sends the stolen data to a panel, from which the threat actor can retrieve it at a later time.
If you’ve downloaded this software, chances are that all your saved passwords have been compromised. The best course of action is to change the passwords for all your online accounts and enable multi-factor authentication (MFA).