Bitdefender Labs warns of a fresh phishing campaign that uses a copycat ChatGPT platform to defraud users
ChatGPT, the AI-powered chatbot developed by research lab OpenAI, rocketed to fame within just four months of its launch.
Unfortunately, the success of the viral AI tool has also attracted the attention of fraudsters, who leverage its popularity to conduct highly sophisticated investment scams against unwary internet users.
According to Bitdefender Antispam Labs, the latest chapter of “AI-powered” swindles begins with a simple unsolicited email.
Subject lines include:
- ChatGPT: New AI bot has everyone going crazy about it
- ChatGPT: New AI bot has everyone in shock from it
- New ChatGPTchatbot is make everyone crazy now – but it’ll very soon be as mundane a tool as Google
- Why is all people panic about ChatGPT bot?
What appears to be a benign marketing lure quickly caught the attention of our researchers, who uncovered an intricate fraud scheme that threatens the wallets and identities of participants.
For now, the scheme targets Denmark, Germany, Australia, Ireland and the Netherlands.
The email appears to offer recipients little information, unless they access the embedded link, of course.
How the scam works
Although fake ChatGPT apps have surfaced on both the Google and Apple app stores in the past couple of weeks, offering users monthly or weekly subscriptions to use the tool, the scammers behind this particular scheme go above and beyond to dupe consumers.
Upon accessing the link in the email, users are directed to a copycat version of ChatGPT that lures them with financial opportunities paying up to $10,000 per month “on the unique ChatGPT platform.”
As you most likely know, the official ChatGPT is an AI-driven language processing tool that enables human-like written conversations with a chatbot that can answer questions, help compose emails, essays and much more. It is not an investment or financial platform meant to help you earn money.
The phony platform’s “chatbot” begins with a short intro to its role in analyzing financial markets, which it claims can allow anyone to become a successful investor in global stocks. We agreed to play along and allow the “automatic robot created by Elon Musk” to help us get rich. Before we could begin our investment journey, the chatbot needed to calculate our daily income.
After being asked how satisfied we were with our current income, the fake ChatGPT bot asked us to verify that we are “real” by entering an email address. We obliged, and the “AI” was more than willing to help us out.
Nothing seemed too shady until we reached the next part. Without receiving any personal financial information that could help it generate a credible estimate of a “monthly user income,” the bot miraculously calculated a daily income of $420, and said the amount “could get even bigger” within a week. We were then asked to provide additional contact information so our very own “personal assistant” could help us activate a WhatsApp account dedicated to our earnings.
We couldn’t resist, and provided the chatbot with a valid phone number, and waited patiently to be contacted by a representative.
Within 10 minutes, we received a call from a lovely young woman who spoke Romanian. She was kind enough to give us more information on how we could start earning money very quickly by investing in “crypto, oil, and international stock.”
The woman was very polite and curious about just how much our researcher earns. She asked how much money he was able to invest that day, alluding to the fact that the minimum is 250 euros.
While she kept insisting on switching to WhatsApp to begin a financial analysis and set up an account, she said that we would need to provide the last six digits of a valid ID card. We skipped this step, asking her to send us an email containing the link we needed to access to transfer the 250 euros and begin our new financial journey as rookie investors.
As a side note, our researcher could hear many voices speaking on the phone to other victims, in what sounded like a call center-style environment.
After a couple of minutes of speaking with the employee, she asked one of her colleagues to send us the form we needed to fill out to begin investing.
We were advised to enter the amount of 250 EUR and send a screenshot showing that the payment was still processing.
As you can see, the form requires a variety of personal information, including first and last name, date of birth, physical address and payment method.
We used a made-up credit card number to see what would happen next.
The “payment” did not go through, and the woman advised our researcher to repeat the process, making sure there were no typos. She also told him to be very careful when typing to prevent the bank from blocking the credit card.
A few interesting notes our researchers pointed out:
- The employee who spoke with our researcher said she was working for Import Capital, a London-based company. Our investigation revealed that the domain importcapital[.]cc (as seen in our email correspondence with the company) also appeared in an alert from the Financial Conduct Authority (FCA) stating that the firm is not authorized to conduct business in the UK.
- Antispam Lab researchers also spotted a couple of variations of the phishing campaign leveraging the names of Google and Meta. Similar to the ChatGPT scam, these messages asked recipients to fill out a form with their contact information.
- The payment website lacks any descriptive text mentioning Import Capital, ChatGPT or any receiving entity whatsoever. Moreover, the payment form is freely accessible without any prior verification or security measures, directly from the domain’s main page.
- The fake version of ChatGPT was accessible via an already blacklisted domain, https://timegaea[.]com. The copycat version did not allow our researcher to freely interact with the “chatbot”; only predefined answers could be selected.
- The employee asked about any family members that might have a larger income and seemed highly intrigued when our researcher said yes.
- The employee also queried our researcher about any savings accounts.
- Users could select from multiple currencies before filling out the payment form, including USD, EUR, GBP, AUD and CAD.
- Payment methods include debit cards, credit cards and Klarna.
- The phone call received by our researchers came from the UK and the employee said she was working out of London.
- The scam operation seems to be targeting users around the globe.
- The WhatsApp verification process requires users to give the last six digits of their ID. Paired with the information from the payment form (e.g. date of birth), this could allow the scammers to put together a full ID number and commit identity theft.
How users can protect against this and similar scams leveraging ChatGPT’s name
Scammers using new viral internet tools or trends to defraud users is nothing new. If you’re looking to test out the official ChatGPT and its AI-powered text-generating abilities, do so only using the official website.
Don’t follow links you receive via unsolicited correspondence, and be especially wary of investment ploys delivered on behalf of the company; they are a scam.
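To illustrate the “official website only” advice, here is a minimal Python sketch of how a link checker might compare a URL’s hostname against an allowlist of legitimate domains. The allowlist entries are assumptions for illustration, and real phishing defenses (like the ones in security products) are far more sophisticated; the key point shown here is that exact hostname matching matters, because scam domains often embed a trusted brand name as a substring.

```python
from urllib.parse import urlparse

# Illustrative allowlist of legitimate domains (assumed for this example).
OFFICIAL_HOSTS = {"openai.com", "chat.openai.com"}

def looks_official(url: str) -> bool:
    """Return True only if the URL's hostname is an allowlisted domain
    or a subdomain of one. A naive substring test is not enough:
    'openai.com.evil.example' contains 'openai.com' but is not OpenAI."""
    host = (urlparse(url).hostname or "").lower()
    return any(host == h or host.endswith("." + h) for h in OFFICIAL_HOSTS)

print(looks_official("https://chat.openai.com/"))         # True
print(looks_official("https://timegaea.com/chatgpt"))     # False
print(looks_official("https://openai.com.evil.example/")) # False
```

Note that the lookalike domain in the last example would pass a careless `"openai.com" in url` check, which is exactly the kind of trick phishing campaigns rely on.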
If you need more peace of mind when roaming the world wide web and interacting with apps or unsolicited correspondence, take a look at Bitdefender’s all-in-one plans. Our comprehensive security solutions protect you against malicious attacks, phishing and scam websites. And if you do fall victim to identity thieves, our identity theft protection service will help you recover via identity and credit monitoring, plus reimbursement of funds through up to $2 million in insurance coverage, depending on your chosen plan.