
Cyberwarfare / Nation-State Attacks, Fraud Management & Cybercrime, Social Engineering

Vovan and Lexus Pose as Ukrainian Officials to Play Pranks on Kremlin Critics

Anviksha More (AnvikshaMore) • March 7, 2023

Vladimir “Vovan” Kuznetsov, left, and Alexei “Lexus” Stolyarov during a conference hosted by the Eastern Economic Forum in Vladivostok in September 2022 (Image: RIA Novosti)

A Russian threat actor headed by two prank callers whose targets for duplicity coincide with Kremlin state interests has for a year now leaned heavily into using email to schedule video calls with high-profile North American and European officials and executives who voice support for Ukraine or make public statements about Russian disinformation.


Researchers from Proofpoint track the group as TA499, although its members are more commonly known as Russian comedians Vladimir Kuznetsov and Aleksei Stolyarov, aka Vovan and Lexus. Kuznetsov’s last name may also be Krasnov.

The duo uses Ukrainian email provider Ukr.net to craft emails purportedly from Ukrainian Prime Minister Denys Shmyhal and his assistant. The emails invoke entities such as Kyiv’s embassy in Washington by using addresses such as [email protected], according to research Proofpoint published Tuesday.

Apparently inspired by mid-August concerns from International Atomic Energy Agency Director General Rafael Grossi over Russian military shelling of the Zaporizhzhia nuclear plant, Kuznetsov and Stolyarov also obtained the [email protected] address to target senior government officials.

Proofpoint says the duo tends to cycle through various email addresses tied to different fake identities, such as emails purporting to come from the domain oleksandrmerezhko.com, which matches the name of a member of the Ukrainian parliament, and from the domain navalny.team. The latter refers to Russian opposition leader Alexei Anatolievich Navalny, who is serving an 11.5-year prison sentence in a high-security penal colony. Amnesty International called his most recent trial, in 2022, a “sham” that violates “international human rights law and clearly deprives Navalny of his right to a fair trial.” Navalny nearly died in 2020 after being poisoned with the nerve agent Novichok by the Russian Federal Security Service.

Through the emails, which do not contain malware, the duo attempts to solicit information and entice recipients into agreeing to a remote video or phone call. The calls themselves might attempt to lead subjects into voicing Russian opposition themes and typically degenerate into antics. The Russian men post heavily edited footage from the videos onto YouTube and other outlets including RuTube, Telegram and VKontakte.

The duo says it’s been banned from YouTube multiple times, and Stolyarov told Russian state-owned news agency Sputnik on Monday that YouTube again revoked their access.

In the past year, the group has targeted former Chancellor of Germany Angela Merkel, former U.S. President George W. Bush, an Australian Senate committee considering tougher government actions against the Kremlin, and U.K. Defense Secretary Ben Wallace. The British government called on YouTube to remove the Wallace video, expressing confidence that YouTube “would not wish to be a conduit for Russian propaganda.”

Past victims also include Canadian Prime Minister Justin Trudeau; U.S. Sen. Lindsey Graham, R-S.C.; members of the U.K., Latvian, Estonian and Lithuanian parliaments; Elton John; and author J.K. Rowling.

In a shift from their activity before Russia’s February 2022 invasion of Ukraine, Kuznetsov and Stolyarov last year almost exclusively focused on topics relating to the Russia-Ukraine war.

Some reporting has suggested the Russian men use artificial intelligence to create deepfake doppelgängers of real-life individuals opposed to the Kremlin, but they deny it. Proofpoint says the pranksters “do not appear to be using any voice modulation, primarily focusing on the targets’ lack of familiarity with the contact and the element of surprise.”

The two Russians also denied working for the Kremlin in a 2016 interview with The Guardian.
