AI in Recruiting: An AI Fable and a Practical Playbook for Helping the “Joshes” Among Us
I want to tell you a story about someone I’ll call Josh. He doesn’t exist as a single person; he’s an amalgam of conversations, quiet encounters at conferences, overheard comments in cocktail lounges, and private phone calls. Josh’s story is about work, loss, anger, and the complicated human fallout when technology moves faster than our lives can adapt. It’s also about what people who care about AI—especially those of us working in hiring, learning, and workplace design—can do to prevent talented people from being left behind. Along the way I’ll talk about practical interventions, how tools can help, and why conversations matter. And yes, this will include how AI in recruiting can be part of both the problem and the solution.
Meet Josh: A Modern Worker Facing a Modern Displacement
Josh graduated with a journalism degree in 2018. He wanted to tell stories. He imagined growing into a celebrated columnist. He landed a job at a small internet newsroom—exactly the kind of scrappy beginning that feels like the right first step for a storyteller. Then circumstances intervened. COVID. Economic shifts. And then AI—tools that could automate content creation, scale copy production, and monetize at speed—arrived. The small newsroom shut down. The whispered, then public, layoffs hit. Josh went from building a career trajectory to wondering how to pay rent.
He moved back to his parents’ home in Vermont. He took some online courses. He tried to make contacts. He tinkered with projects. But every time someone told him to “learn to code” or “pivot into data,” it felt like a betrayal of why he studied journalism in the first place. The frustration hardened into a bitter shorthand: “AI took my job.”
Why this story matters beyond journalism
Josh’s story is not about a single industry. It is a human story repeated in dozens of forms across marketing, product roles, customer research, and yes, areas tied to hiring and talent. When tasks are automated, the jobs that relied on those tasks often evaporate—or at least, they change dramatically. Those changes have emotional, financial, and social consequences. If we—AI practitioners, hiring managers, recruiters, and teammates—ignore those consequences, we risk widening the divide between people who feel empowered by technology and those who feel dispossessed by it.
That’s where conversations and practical help come in. And where the systems of talent—especially where AI in recruiting is applied—can either deepen inequality or help bridge it.
The Emotional Landscape: Resentment, Distrust, and the “No” Reflex
One important pattern I’ve observed: people like Josh often reject the reskilling narrative when it’s offered too casually. They’re suspicious of quick fixes. They resent the advice that sounds like “just learn to code” as if that were the only path forward. Sometimes they refuse help outright, because accepting help can feel like admitting defeat.
“AI took my job.”
That sentence carries a weight that’s equal parts literal and symbolic. Literally, automation replaced the work Josh was hired to do. Symbolically, it feels like an erasure of the dream he had for himself—his raison d’être as a storyteller. That’s why empathy must precede advice. You cannot parachute a curriculum into someone’s life and expect transformation; you must first meet them where they are emotionally.
Why listening beats lecturing
Begin by listening. Listen without trying to solve. Let Josh voice his loss, his anger, and his skepticism. Then, gently, ask if he’s open to a conversation about how the world has shifted and where his core skills might still matter. Not everyone will say yes—but many will, and that’s where tangible change begins.
Dream Flexibility: Change the Game Board, Not the Person
The core question isn’t “Can Josh be retrained?” It’s “Are Josh’s dreams flexible?” Too often we equate career advice with identity change: “You must become a developer, or you’re finished.” That’s both cruel and false.
Work is increasingly organized around problem spaces rather than rigid job titles. The silos that used to define careers—“marketing person,” “reporter,” “store manager”—are dissolving into cross-functional problems that require adaptable, curious people. If you frame the change properly, you can invite Josh to think of his skills as transportable assets rather than obsolete labels.
“Are you open to your dreams shifting?”
That question reframes failure as iteration. It doesn’t demand betrayal of values; it invites a restatement of purpose. A journalist who wants to tell human stories can do that in product narratives, user research, community building, documentation, or policy writing. The medium changes. The core impulse—to illuminate human experience—remains.
Personal Reinvention: Evidence That Change Is Possible
It’s not theoretical. People have repeatedly reinvented themselves in response to tech shifts. Consider a career that moved from nonprofit grant management to e-commerce to product—roles that were each remade by automation and tooling. The point is not to suggest everyone will or should follow the same zigzag. The point is to show that reinvention—though messy, uncomfortable, and often nonlinear—is possible.
Technology historically rearranges work: certain tasks become automated, while the higher-order problem solving and human judgment become more valuable. But the transition period is real and painful for many. That’s why we need systems—community-based reskilling, apprenticeships, hiring experiments—that reduce the friction for people who want to try something new.
Practical ways to support reinvention
- Start with microprojects: Encourage small public work that demonstrates capability (a short essay series, a newsletter, a customer interview reel).
- Build bridges, not walls: Pair displaced workers with mentors inside companies who can open up networks and opportunities.
- Create low-friction pathways: Apprenticeships, paid internships, and fellowship programs that don’t require perfect resumes but do require curiosity and persistence.
- Accept the scrappy: Early-stage work might be kludgy—allow that and look for growth signals, not polished deliverables.
Where AI Helps—and Where It Hurts
This is the complicated part. Tools that apply AI in recruiting, such as candidate matching, resume parsing, and automated interview bots, can streamline hiring and reduce bias, but they can also entrench new inequalities. Automated systems trained on historical hiring data can reproduce past exclusions; automated content tools can commoditize creative labor.
So how do we reconcile these tensions? The answer is not to halt innovation but to adopt guardrails and redesign processes intentionally.
How AI in recruiting can be part of the solution
- Personalized upskilling: Use AI to generate custom learning paths for individuals like Josh, focusing on transferable skills—storytelling, research, interviewing, synthesis.
- Skills-based matching: Instead of relying on title-based filters, use AI to identify candidates with relevant problem-solving patterns and potential to grow into roles.
- Augmented hiring processes: Use AI to reduce administrative friction (scheduling, screening paperwork) so recruiters can spend time building relationships and coaching candidates.
- Bias auditing: Build routine audits into hiring tools to flag and correct patterns that disadvantage people from nontraditional backgrounds.
When applied thoughtfully, AI in recruiting can turn the hiring funnel into a development pipeline. Instead of serving only as a gatekeeping mechanism, it can identify promise and create pathways into roles.
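To make the skills-based-matching idea concrete, here is a minimal sketch in Python. It scores candidates by overlap between demonstrated skills and a role’s required skills instead of filtering on past job titles. All skill names, candidate labels, and the scoring rule are hypothetical illustrations, not a description of any real product.

```python
# Minimal sketch of skills-based matching: rank candidates by overlap
# between their demonstrated skills and a role's required skills,
# rather than by whether they have held the target job title before.
# All names and data below are hypothetical examples.

def skill_match_score(candidate_skills, role_skills):
    """Fraction of the role's required skills the candidate demonstrates."""
    candidate = {s.lower() for s in candidate_skills}
    required = {s.lower() for s in role_skills}
    if not required:
        return 0.0
    return len(candidate & required) / len(required)

role = ["storytelling", "interviewing", "research", "synthesis"]

candidates = {
    "ex-journalist": ["Storytelling", "Interviewing", "Research", "Editing"],
    "title-match only": ["Product Management"],  # right title, little skill overlap
}

# Rank candidates by skill overlap, highest first.
ranked = sorted(
    candidates,
    key=lambda name: skill_match_score(candidates[name], role),
    reverse=True,
)
```

In this toy example the ex-journalist ranks first despite never holding the target title, which is exactly the kind of signal a title-based filter would discard.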
Where AI in recruiting can cause harm
- Opaque scoring: When candidates are screened by opaque models without human oversight, promising people get lost.
- Title bias: Systems that map old titles to hiring fit will exclude people who’ve had nonlinear careers.
- Commoditization of craft: Automated tools that replicate creative output can reduce demand for entry-level roles that once served as training grounds.
The fix requires transparency, human-in-the-loop design, and an organizational commitment to moving beyond crude efficiency metrics in hiring.
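The bias-auditing commitment can start very simply. One widely used check is the “four-fifths rule” from the US Uniform Guidelines on Employee Selection Procedures: flag any group whose selection rate falls below 80% of the highest group’s rate. The sketch below, with hypothetical group labels and counts, shows how small such a routine audit can be.

```python
# Minimal sketch of a routine hiring audit using the "four-fifths rule":
# flag any group whose selection rate is below 80% of the best group's
# rate. Group labels and counts below are hypothetical.

def selection_rates(outcomes):
    """outcomes: {group: (num_selected, num_applied)} -> {group: rate}"""
    return {g: selected / applied for g, (selected, applied) in outcomes.items()}

def four_fifths_flags(outcomes, threshold=0.8):
    """Return {group: True} for groups failing the four-fifths check."""
    rates = selection_rates(outcomes)
    best = max(rates.values())
    return {g: (rate / best) < threshold for g, rate in rates.items()}

outcomes = {
    "traditional resumes": (40, 100),  # 40% selected
    "nonlinear careers": (10, 100),    # 10% selected
}

flags = four_fifths_flags(outcomes)
# "nonlinear careers" is flagged: 0.10 / 0.40 = 0.25, well below 0.8
```

A real audit needs statistical care, intersectional breakdowns, and human review of flagged patterns, but even this crude check run on every pipeline would surface the title-bias problem described above.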
Practical Playbook for Managers, Recruiters, and Colleagues
If you’re reading this and you know a Josh—someone who feels displaced, resentful, or stuck—here is a practical set of steps you can take. Many of these are small, human-centered actions that don’t require vast budgets, only attention and willingness.
- Listen first. Schedule a conversation that’s explicitly non-diagnostic. No pitches. Let the person talk about their feelings and hopes.
- Validate the loss. Acknowledge that technology can take away work and that anger and grief are reasonable responses.
- Ask permission to explore options. Framing helps: “Would you be open to brainstorming what parts of your work you loved most?”
- Map transferable skills. Identify storytelling, interviewing, curiosity, empathy, analysis—skills useful across product, policy, search, and community roles.
- Design low-risk experiments. Set a 4–8 week microproject with a clear output—an article series, a portfolio of user interviews, a curated newsletter—that demonstrates capability.
- Leverage AI as an assistant, not a replacement. Use tools to accelerate learning: practice interviews with models, draft outlines, annotate texts. Demonstrate how AI in recruiting can free up time to focus on growth rather than gatekeeping.
- Build introductions. The most valuable resource is social capital. Make warm intros to people who can advise or hire.
- Create pathways inside your company. If you’re a manager or recruiter, experiment with hiring people into stretch roles or apprenticeship programs and measure outcomes.
How to Use AI as a Practice Partner (Without Alienating People)
One of the most overlooked benefits of modern models is their role as a practice partner. Want to rehearse interviews? Use a model to role-play. Want to learn an unfamiliar textbook? Ask the model to summarize and explain diagrams. These are low-cost ways to compress learning cycles.
But if you’re dealing with someone who resents AI, you must be sensitive about how you introduce these tools. Start by framing AI as a utility rather than a threat: “I know this tool is part of what hurt you, but here’s a tiny way it can act as a coach. Want to try, no strings?”
There are concrete exercises that are nonthreatening:
- Mock interview practice: The person controls the session length and topic.
- Story refinement: Draft a short piece, then ask the model to suggest structural edits—retain human voice, keep the author in control.
- Diagram explanation: Take a photo of a complex diagram and ask the model to walk through it in plain language.
Organizational Policy: From Band-Aids to Structural Support
Individual acts of kindness help, but systems matter. Companies building and deploying AI, and those that hire people impacted by automation, should adopt a few structural commitments:
- Invest in bridge programs that offer paid transition roles rather than unpaid internships.
- Audit recruiting pipelines that use AI in recruiting to ensure they’re not closing doors on people with unconventional careers.
- Fund apprenticeships that prioritize mentorship and network access over polished resumes.
- Offer internal mobility pathways explicitly designed for people shifting careers.
- Measure outcomes in human terms: job retention, wage growth, satisfaction—not just time-to-hire.
These are not low-cost in the short term, but they are investments in a stable, inclusive workforce and a healthier society.
Hard Truths and a Call to Empathy
It’s tempting to offer technocratic solutions: more courses, free bootcamps, self-paced learning. Those help, but they’re insufficient when the emotional piece is unaddressed. Josh does not need only a curriculum; he needs people who will vouch for him, who will listen, and who will create room for messy reinvention.
“We’ve taught the sand to think, and now everything is different.”
That line captures the trade-off we live with. The systems we designed to automate tasks now think in some limited ways. That is astonishing and opens enormous possibility. It is also destabilizing. The key challenge is to make that revolution livable for everyone.
Three commitments to make today
- Listen to the displaced: Start a program in your company to have recruiters and managers meet with displaced workers to understand barriers.
- Design hiring with empathy: Reconfigure job descriptions to emphasize skills and potential over precise past titles.
- Use AI in recruiting to expand opportunity: Redirect automation gains to create time and budget for apprenticeships and paid transitions.
Parting Thoughts: Don’t Leave the Joshes Behind
There will be more stories like Josh’s. Some will be angrier, some quieter, some hopeful. The choices we make now—about how we design hiring systems, how we introduce AI into workflow, and how we treat people who feel left behind—will determine whether our future is fractured or shared.
AI in recruiting can be a tool that locks people out, or it can be a scaffolding that helps people in. It can be a gatekeeper or a bridge. The difference lies in design choices and human behaviors: whether we center empathy, whether we measure success by people’s lives and not just efficiency metrics, and whether we invest time into helping someone translate passion into a new kind of work.
If you know a Josh, do something small today: listen, validate, and ask if they’re open to exploring a different path together. Help them build one public piece of work that demonstrates value. Introduce them to someone, even over email. If you’re a recruiter or manager, rethink job specs and pilot an apprenticeship. If you build tools that embed AI in recruiting, bake transparency and human oversight into the product from day one.
The story of technology is also the story of community. We’ve taught the sand to think. Now let’s teach ourselves how to care for each other as the sand learns to speak.
Key Takeaways
- Job displacement is real and painful—acknowledge it before prescribing solutions.
- Dreams can and should shift; help people map what matters to new problem spaces.
- AI in recruiting can help by identifying potential and creating learning pathways—but it can also harm if implemented without empathy.
- Practical interventions (microprojects, apprenticeships, mentor introductions) matter more than one-size-fits-all bootcamps.
- Design hiring systems that measure human outcomes and enable second acts.
Cheers to the hard conversations and the small actions that add up. If you care about building a future where technology lifts more people than it displaces, start by talking to the Josh in your life.