
How to ward off fraudulent job seekers propped up by AI


COMMENTARY: Hiring qualified people has become a growing issue in the cybersecurity industry. Thousands of people are looking for new job opportunities, eager to pursue available security positions across a diverse range of industry sectors.

Yet it has become increasingly difficult to hire candidates safely and securely. Not only do hiring teams need to contend with extensive processes and vetting for multiple candidates at once, but managers now also face added pressure from new security threats. For example, work scams surged by 118% this past year.


As job applicants increasingly use AI to assist with applications, candidate impersonation and fraud have surged, making robust talent verification across industries more crucial than ever.

How AI drives the job scam surge

Candidate impersonation, also known as the proxy candidate scam, has become one of the most detrimental forms of job fraud. It’s a classic bait-and-switch pattern: the people applying for a job aren’t who they claim to be. Candidate impersonation happens when individuals rely on technology such as AI, or hire other people, to falsify or embellish their identity, achievements, certifications, or skills during the hiring process.

This may include generating false or synthetic work experience, qualifications, and skills as part of a resume. It may also include using AI on the fly to answer questions during an interview to appear knowledgeable and experienced. Another attack vector is using AI tools to impersonate individuals’ voices and appearances at a much wider scale. Candidate impersonation may also extend to using a third party to represent the applicant during interviews, when taking tests, or when providing sample projects.

Over the past few years, companies have started taking hiring verification more seriously. Many organizations have implemented technology to help address this challenge in the hope that it will weed out the bad or “fake” candidates from the good or “real” ones. In fact, earlier this year, LinkedIn announced another iteration of its verification program, the “Verified Recruiter” badge, to further filter out fraudulent job seekers.

While measures such as this one help filter out fraudulent applicants, there’s still more work ahead if enterprises want to combat AI-generated job fraudsters. Verification and deepfake detection methods and tools, such as those delivered with decentralized identity products, give organizations an extra layer of security.

Verifying identity to secure true talent

The saturation of the job market has increased the risk of candidate fraud as job seekers look for new ways to fabricate their backgrounds and deceive employers during the hiring process.

Organizations can safeguard their hiring process and prevent candidate impersonation by using decentralized identity and verifiable credentials. With a decentralized identity approach, companies can carefully control authentic verifiable credential issuance, delegation, and portability for each individual. Credentials are unique to the individual and cannot be shared or transferred. Moreover, verifiable credentials and the data stored in them are verified out-of-band from the channel used to engage the candidate, such as a videoconference or phone interview.
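To make the issue-then-verify pattern concrete, here is a minimal sketch of an issuer-signed credential check, assuming the Python `cryptography` package and hypothetical DIDs and helper functions for illustration. It is not Ping Identity’s implementation or the full W3C Verifiable Credentials data model; real systems add DID resolution, revocation, selective disclosure, and holder binding. The point it illustrates: an employer can verify a candidate’s claims against the issuer’s signature out-of-band from the interview itself, so tampered or borrowed credentials fail verification.

```python
# Illustrative sketch only: issuer signs a candidate's claims with Ed25519;
# the employer later verifies the signature independently of the interview channel.
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)
from cryptography.exceptions import InvalidSignature


def canonical(claims: dict) -> bytes:
    """Serialize claims deterministically so signer and verifier hash the same bytes."""
    return json.dumps(claims, sort_keys=True, separators=(",", ":")).encode()


def issue_credential(issuer_key: Ed25519PrivateKey, claims: dict) -> dict:
    """Issuer (e.g., a university or certifying body) signs the candidate's claims."""
    return {"claims": claims, "signature": issuer_key.sign(canonical(claims)).hex()}


def verify_credential(issuer_pub: Ed25519PublicKey, credential: dict) -> bool:
    """Employer checks the issuer's signature out-of-band from the interview."""
    try:
        issuer_pub.verify(
            bytes.fromhex(credential["signature"]), canonical(credential["claims"])
        )
        return True
    except InvalidSignature:
        return False


if __name__ == "__main__":
    issuer_key = Ed25519PrivateKey.generate()  # held by the issuing institution
    credential = issue_credential(
        issuer_key,
        {
            "subject": "did:example:candidate123",  # hypothetical candidate DID
            "issuer": "did:example:university",     # hypothetical issuer DID
            "degree": "BSc Computer Science",
        },
    )
    print(verify_credential(issuer_key.public_key(), credential))  # True

    credential["claims"]["degree"] = "PhD Computer Science"  # tampering is detectable
    print(verify_credential(issuer_key.public_key(), credential))  # False
```

Because the employer trusts the issuer’s public key rather than whatever the applicant presents on a call, an AI-assisted impersonator cannot fabricate or alter credentials without breaking the signature check.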

Decentralized identity reduces the possibility of fraud and account takeovers by helping ensure the person behind the credential is who they claim to be. It shrinks the attack surface so fake applicants can’t get in. And because each person manages and stores the personal data presented and verified in the transaction, there’s no centralized trove of information for cybercriminals to attack and exploit.

Decentralized identity has become especially crucial in today’s digital landscape, since new and emerging innovations like AI have complicated both hiring and security for enterprises. Consider the role AI now plays in making the authenticity of individuals less obvious. This should push organizations to adopt decentralized identity and to layer additional authentication methods to ensure people are who they say they are before a credential is issued.

Businesses must combat job scams as AI heats up

While fabricating work experience may seem harmless, AI-backed applicants cost businesses time, money, and productivity, all while degrading the quality of their workforces. The potential pitfalls of candidate impersonation are profound: long-term damage to brand reputation, severe risk to enterprise networks and systems, the hiring of unqualified candidates, additional training and onboarding costs, and even legal consequences.

Organizations need to decide whether to invest in tools that identify potential job scammers and fake applicants now or several months from now, but be warned: waiting could mean the damage has already been done. Businesses can take control of their security posture by focusing on talent verification and investing in the right technology.

It’s important to note that even with robust security measures, threat actors often find ways to infiltrate operations and protocols. In light of this, organizations must stay one step ahead of attackers by continuously evaluating and investing in tools to thwart potential risks.

Adjusting hiring methods to address the influx of fraudulent candidates is no longer a want for businesses, but a need. While there’s no foolproof way to spot fake applicants every time, the right safeguards and practices can make the difference between dodging a hiring mishap and bringing a qualified new team member on board.

Darrell Geusz, product lead, PingOne Neo, Ping Identity

SC Media Perspectives columns are written by a trusted community of SC Media cybersecurity subject matter experts. Each contribution has a goal of bringing a unique voice to important cybersecurity topics. Content strives to be of the highest quality, objective and non-commercial.
