News Spotlight

Entry-level jobs are being automated. AI is competing with college graduates for jobs as labor-market conditions have declined since the start of the year, even for M.B.A. grads (The Atlantic).

AI helps job seekers change careers. New tools give users insight into how skills in one field can transfer to a different one (The Wall Street Journal).

Pay-for-performance incentivizes productivity. Google, Amazon, and other companies are rewarding top performers with bigger payouts and paying lower performers less (Inc.).

Stat of the Week

A new study finds that 64% of employers use skills-based hiring, 38% screen candidates based on GPA, and only 11% use AI. The strong adoption of skills-based hiring shows that employers value demonstrable abilities over traditional credentials. HR should refine job descriptions, implement effective skills assessments, and train hiring managers to evaluate competencies.

The continued reliance on GPA suggests an opportunity for HR to educate hiring teams on the limitations of this metric in predicting job performance and to advocate for more skills-focused evaluations that broaden the talent pool. And the surprisingly low adoption of AI in hiring points to a significant area for HR to explore, since AI tools can potentially streamline sourcing, screening, and matching candidates based on skills, ultimately improving efficiency and reducing bias in the recruitment process.

Deep Dive Article

How to Combat Deepfake Candidates and Hiring Scams in the Age of AI

A New Hiring Threat

In a labor market already shaped by economic uncertainty, skill shortages, and remote work, employers now face a new and deeply unsettling challenge: deepfake job applicants. These aren't just embellished résumés or minor exaggerations; they are fraudulent candidates using AI to fabricate entire professional identities, fool interviewers, and infiltrate organizations. With remote roles increasing, these AI-assisted scams are on the rise, threatening not only hiring integrity but also corporate cybersecurity.

Generative AI tools now allow bad actors to generate synthetic voices, forge professional headshots, fake employment histories, and even create deepfake video avatars. Some scammers have gone as far as using ChatGPT in real-time interviews to generate responses to recruiters' questions. Others substitute a qualified proxy for the actual candidate in live interviews. What begins as a fake hire can end in data breaches, ransomware attacks, or the theft of proprietary company information.

And these aren't isolated incidents. According to Gartner, by 2028 one in four job candidates could be fake, thanks to the rapid accessibility and sophistication of generative AI. The implications of this trend are far-reaching, affecting not only HR and recruiting teams but the entire digital trust infrastructure of organizations worldwide.

A Cybersecurity Crisis in Disguise

Hiring scams today go far beyond lying on a résumé. Fraudulent candidates are now part of coordinated efforts, sometimes even linked to foreign governments, to infiltrate businesses for financial or strategic gain. For example, the U.S. Department of Justice has uncovered multiple cases of North Korean operatives using fake identities to secure remote jobs with U.S. companies. The goal? Channeling funds back to the North Korean government and its nuclear weapons program. These candidates present sophisticated digital profiles: fake LinkedIn pages, AI-generated headshots, cloned voices, and polished personal websites.
Once onboarded, these impostors may install malware, demand ransom, or siphon sensitive data. What's most concerning is that they can slip past even experienced hiring teams, particularly when screening for remote roles where in-person meetings aren't feasible.

Cybersecurity firms and tech startups are seeing a surge in fake applicants because of the valuable data and infrastructure they possess. One case involved a security firm that discovered its new remote hire was a deepfake, mimicking a real job seeker with eerily realistic credentials and behavior. The warning signs are often subtle: laggy video, inconsistent backstories, or an overdependence on scripted answers. But in a fast-moving hiring process, these can be easy to overlook.

How AI Tools Enable Deception at Scale

What makes this threat especially dangerous is that AI allows job scammers to scale their deception. Generative AI can now be used in nearly every step of the hiring process, from writing résumés and cover letters to generating headshots, cloning voices, and scripting interview answers in real time.
The result is a pipeline of applications that look too good to be true, because they are. Legitimate candidates may get edged out by fakes using AI to meet or exceed every hiring benchmark. This "arms race" of digital deception puts hiring managers at a disadvantage and creates an uneven playing field for honest applicants.

The Real-World Impact on Businesses and Teams

Bringing on a fraudulent employee doesn't just waste time and salary dollars; the consequences can be devastating. Once inside, bad actors can access sensitive client information, intellectual property, or internal systems. In some cases, they've installed backdoors into company infrastructure or initiated ransomware attacks shortly after being hired.

There's also a long-term cost in terms of trust. Businesses risk reputational damage, client loss, and legal exposure. Moreover, the presence of fake employees can erode morale among real team members, especially if coworkers sense something is "off" but feel powerless to escalate concerns.

HR leaders are now finding themselves on the front lines of cybersecurity. It's no longer just about screening for cultural fit or technical skill. It's about ensuring the person on the other end of the Zoom call is who they claim to be.

Best Practices for Verifying Candidate Authenticity

So, how can businesses protect themselves in this new environment? The good news is that several effective strategies can help detect fraud before it's too late:

1. Scrutinize LinkedIn Profiles
2. Ask Cultural or Location-Based Questions
3. Insist on Live or In-Person Interviews
4. Check ID Thoroughly
5. Implement Two-Factor Verification for Onboarding

The Numbers Game: How Fake Applicants Hurt Real Ones

Beyond security risks, there's a broader labor market issue. The rise in fake applicants is flooding job pipelines, making it harder for legitimate candidates to break through. Automated systems struggle to distinguish genuine résumés from AI-generated ones, and overwhelmed hiring teams may unintentionally dismiss real talent.

This flood also creates unnecessary delays, burdens screening teams, and increases the risk of burnout among recruiters. Worse, the presence of fakes can distort hiring metrics, making it appear as if quality candidates are applying but failing in later stages. That illusion can mislead decision-makers about the effectiveness of their sourcing strategies.

Conclusion: Building Trust in a Deepfake World

As AI evolves, so do the threats facing today's workforce. Fake job seekers powered by generative AI are a growing challenge, blurring the line between convenience and deception, efficiency and exploitation. But by staying vigilant, adopting new verification practices, and understanding the tools scammers use, businesses can fight back.

Hiring in the AI era requires more than good instincts; it requires a cybersecurity mindset. With the right balance of human insight and technological checks, companies can protect their teams, customers, and reputations from the growing tide of deepfake deception.

Thanks for reading! Be sure to join the conversation on LinkedIn and let me know your thoughts on this topic.

Quote of the Week

"Always remember that you are absolutely unique. Just like everyone else."
Check out the previous issues of the Workplace Intelligence Insider newsletter below and subscribe now to get new articles every Monday.