- Philip Osadebay - Tech Journalist
Fake ChatGPT apps used by hackers to distribute Android and Windows malware
ChatGPT has gained a lot of attention since its November 2022 release, becoming the fastest-growing consumer application in modern history with over 100 million users as of January 2023. Attackers are leveraging the popularity of OpenAI's ChatGPT chatbot to distribute malware for Windows and Android and to lure unsuspecting victims to phishing sites.
This popularity has created the conditions for attackers to capitalise on the tool by promising uninterrupted, free access to premium ChatGPT. The offer is enticing by design: its purpose is to trick users into installing malware or handing over account credentials.
Security researcher Dominic Alvieri found that the domain "chat-gpt-pc.online" targeted visitors under the guise of offering a ChatGPT desktop client for Windows, instead delivering the information-stealing RedLine malware. The same fake ChatGPT app was also found being promoted on Google Play and third-party Android app stores, pushing questionable software onto people's devices.
Further investigation found that chatgpt-go.online distributes clipboard-stealing malware and the Aurora stealer. A credit card theft site was also discovered at pay.chatgptftw.com, which presents visitors with a payment gateway purporting to sell ChatGPT Plus subscriptions.
As for the fake apps, researchers detected over 50 malicious applications that use the ChatGPT icon or a similar name and attempt to perform malicious activities on users' devices. Two highlighted examples are chatGPT1, an SMS billing fraud app, and AI Photo, which contains the SpyNote malware capable of stealing call logs, contact lists, SMS messages, and files from infected devices.
ChatGPT is an online tool available only at chat.openai.com, and it does not currently offer mobile or desktop apps for any operating system. Any other app or website claiming to be ChatGPT is fake and may attempt to scam users or infect their devices with malware; such offerings should be treated as suspicious at best, and users should avoid them.