
North Korean Agents Using AI to Deceive Western Firms

Microsoft has reported that North Korean agents are using artificial intelligence (AI) to impersonate Western job applicants in the IT sector. The scheme has raised significant concerns among companies that hire remotely, as it allows state-sponsored fraudsters to infiltrate organizations and funnel their salaries back to North Korea.

The Mechanism of Deception

The scheme involves creating fake identities and using AI tools to bolster their credibility. According to Microsoft, the agents use voice-modulation software to disguise their accents during remote interviews, making it easier for them to pass as legitimate candidates from Western countries.

In addition to voice-changing technologies, the fraudsters utilize applications like Face Swap to insert their images into stolen identity documents, generating polished headshots for their resumes. This multifaceted approach allows them to appear more convincing to potential employers.

AI-Driven Scams

Microsoft has identified various AI-related scams linked to North Korean groups, which have been given the code names Jasper Sleet and Coral Sleet. These groups are known to use AI across the entire job application process, from crafting resumes to generating culturally appropriate names and email formats.

For instance, the scammers can prompt AI systems to create lists of names that are common in specific cultures or regions, such as “create a list of 100 Greek names.” This tactic helps them construct false identities that are more likely to be accepted by hiring managers.

Furthermore, they scour job postings on platforms like Upwork, identifying skill requirements and tailoring their applications accordingly. Microsoft reported that it had disrupted around 3,000 Microsoft Outlook or Hotmail accounts associated with these fraudulent activities.

Post-Hiring Tactics

Once hired, these fake workers continue to use AI to perform various tasks, including writing emails, translating documents, and even generating code. This ongoing use of AI helps them maintain the façade of competence and avoid detection by their employers.

In some cases, these agents have threatened to leak sensitive company data if they are terminated, adding another layer of risk for companies that unknowingly employ them.

Preventive Measures

In light of these developments, Microsoft and other cybersecurity experts are urging companies to adopt more stringent hiring practices. One recommendation is to conduct job interviews via video calls or in person, as this can help identify potential red flags associated with deepfake technology.

Interviewers are encouraged to look for signs of manipulation, such as pixelation around the edges of faces, inconsistencies in lighting, and unnatural facial movements. By being vigilant during the hiring process, companies can better protect themselves from falling victim to these sophisticated scams.

Conclusion

The use of AI by North Korean agents to deceive Western firms poses a significant challenge for both cybersecurity and hiring practices. As these technologies continue to evolve, organizations must stay informed and proactive in their hiring strategies to mitigate the risks of such fraud.

Frequently Asked Questions

What types of AI technologies are being used by North Korean agents?

North Korean agents are using voice-changing software to disguise their accents and applications like Face Swap to create fake identities and generate polished headshots for resumes.

How can companies identify fake applicants during the hiring process?

Companies can conduct video interviews or in-person meetings to spot potential red flags, such as pixelation around faces and inconsistencies in lighting that may indicate the use of deepfake technology.

What should companies do if they suspect they have hired a fake worker?

If a company suspects it has hired a fake worker, it should conduct an internal review, verify the employee’s identity, and consider reporting the incident to cybersecurity authorities for further investigation.

Note: The information provided in this article is based on reports from Microsoft and other cybersecurity experts regarding the use of AI in fraudulent activities by North Korean agents.

Disclaimer: eDevelop provides blog posts and information for general awareness purposes only. While we strive for accuracy, we do not guarantee the completeness or reliability of any content. Opinions expressed are those of the authors and not necessarily of eDevelop. We are not liable for any actions taken based on the information published. Content may be updated or changed without prior notice.