
    By Graham Thornton

    Stop Hiring for AI Roles That Are Actually Skills

    Most "AI roles" are skills your existing team should learn, not jobs to fill. Learn how to train instead of recruit and avoid wasting millions on unnecessary roles.

    The AI Workforce Gap: Why Most Companies Are Hiring for Jobs That Won't Exist in 3 Years

    Last month, a Fortune 500 client asked us to help them hire an "AI Trainer." When I asked what the role would actually do, they sent me a job description that could have been written for a data analyst, a prompt engineer, an ethics specialist, or a QA tester—depending on which paragraph you read.

    They didn't know what the job was. They just knew the board was asking about AI hiring.

    This is the pattern we're seeing everywhere: companies feeling pressure to hire "AI people" before they've defined what work needs to be done, whether existing teams could do it with training, or what success even looks like. The job description becomes a Frankenstein document: part data analyst, part prompt engineer, part ethics specialist. Why? Because they're hiring to check a box, not solve a problem.

    It's what Harvard Business Review calls "workslop": work created for the sake of appearing productive rather than driving actual outcomes. Except in this case, it's not just busywork. It's a six-figure role with a six-month hiring process.

    Companies everywhere are posting jobs for "Prompt Engineers" and "AI Ethics Specialists" (roles that didn't exist two years ago and may not exist two years from now) without defining what success looks like, what skills actually matter, or whether they even need a dedicated person.

    At Talivity, we help organizations with strategic workforce planning—the unglamorous work of figuring out what jobs you'll actually need before you start hiring for them. Here's what we're seeing: AI isn't just changing job descriptions. It's exposing that most companies have little idea what work actually requires a human.

    This article breaks down the new roles emerging from AI adoption, the skills that actually matter (vs. the ones everyone's talking about), and the workforce planning process that helps you get ahead of the curve instead of constantly reacting to it.


    The New Roles Everyone's Hiring For (And What They Actually Do)

    If you've been in talent acquisition meetings this year, you've heard these titles thrown around. Here's what they actually mean—and whether you need them:

    AI Trainers & Ethics Specialists

    What everyone thinks they do: Teach AI systems to be less biased and more ethical.

    What they actually do: Annotation, labeling, and quality control on training data.

    Do you need one? Probably not as a dedicated role. If you're building proprietary AI models, yes. If you're implementing vendor tools, this is a feature you buy, not a person you hire.

    Prompt Engineers

    What everyone thinks they do: Craft magical prompts that make AI do amazing things.

    What they actually do: Structured query design and output optimization. Basically, translating business requirements into clear instructions for AI systems.

    Do you need one? Not as a standalone role. Prompt engineering is a skill your existing analysts, researchers, and domain experts should learn, not a new department.

    AI Quality Assurance Analysts

    What everyone thinks they do: Test AI outputs for accuracy.

    What they actually do: Validation, edge case testing, and compliance checking. They make sure AI-generated outputs meet standards before they go into production.

    Do you need one? Yes, if you're deploying AI in high-stakes environments (healthcare, finance, legal). But it's probably an expansion of your existing QA function, not a net-new role.

    Data Curators & Labelers

    What everyone thinks they do: Organize data for AI training.

    What they actually do: Clean, tag, and structure data so AI models can learn from it. Think of it as librarian work for machine learning.

    Do you need one? Only if you're building custom AI models. Most companies are buying pre-trained systems where this work is already done.

    AI Product Managers

    What everyone thinks they do: Build AI products.

    What they actually do: Bridge the gap between what AI can do and what the business actually needs. They translate technical capabilities into user requirements and ROI.

    Do you need one? Yes—but not as many as you think. One strong AI Product Manager can guide implementation across multiple teams. Don't hire a whole department before you've deployed your first AI tool.

    Digital Twin Specialists

    What everyone thinks they do: Create virtual replicas of physical systems.

    What they actually do: Model, simulate, and optimize complex processes in virtual environments before implementing them in the real world. Requires deep expertise in IoT, data analysis, and system design.

    Do you need one? Only if you're in manufacturing, logistics, or infrastructure. For most companies, this is a consulting engagement, not a hire.


    The Pattern: Many "AI Roles" Are Skills, Not Jobs

    Here's what we're seeing in workforce planning diagnostics: companies are creating job titles for what should be training programs.

    Prompt engineering? That's a skill your marketers, analysts, and researchers need to learn. Not a new department.

    AI ethics? That's a governance framework your legal and compliance teams need to understand. Not a standalone role (unless you're building foundation models).

    Data curation? That's an expansion of your existing data team's responsibilities. Not a net-new function.

    The companies navigating AI hiring well aren't the ones posting 50 new "AI [Insert Job Title]" roles. They're the ones asking: "What does our current workforce need to learn to do their jobs in an AI-augmented environment?"

    That's a training question, not a hiring question.

    The Exception That Proves the Rule: When AI Roles Actually Make Sense

    Not every AI title is unnecessary. Some roles genuinely require dedicated focus because they bridge multiple teams, require deep technical knowledge, and create compounding value across the organization.

    Take Emily Mabie's role as AI Automation Engineer at Zapier. Her job isn't just "using AI tools." She embeds with teams to map real workflows, builds AI-powered automations that multiple departments can use, creates repeatable solutions that become customer-facing products, teaches teams to spot their own automation opportunities, and measures business outcomes like hours saved, error reduction, and cycle time.

    Notice what makes this a role instead of a skill: She builds for HR, but the solutions scale to customers and other teams. She needs to understand both AI capabilities and HR workflows deeply. Her work creates measurable outcomes (Remote's Marcus Saito in a similar role saved $500K in hiring costs by auto-resolving 27.5% of IT tickets). And this isn't "implement AI and move on." It's continuous optimization and enablement.

    The difference? Companies like Zapier, Remote, and ActiveCampaign didn't post a job requisition for "AI Automation Engineer" because a Gartner report told them to. They identified specific, measurable problems (IT ticket volume, sales workflow inefficiency), piloted AI solutions to prove value, recognized the need for someone to scale and maintain those solutions across the org, and then created the role.

    That's the opposite of what most companies are doing. Most are posting "AI roles" before they know what problem they're solving.

    Think of It Like Email in 2005

    Imagine if companies in 2005 had responded to the rise of email by creating a new department: "Email Communication Specialists."

    Their job? Craft emails, manage inboxes, train people on Outlook, and ensure email etiquette compliance.

    It sounds ridiculous, because email wasn't a role. It was a tool that everyone needed to learn.

    That's what's happening with AI right now across many organizations.

    "Prompt Engineer" is the 2025 version of "Email Communication Specialist." It's a skill your marketers, analysts, and researchers need to develop, not a separate department.

    The exception? If you're building Gmail (the platform itself), then yes, you need email engineers. If you're building foundation AI models, you need prompt engineers.

    But if you're a company using AI tools (not building them), prompt engineering is a skill to train, not a role to hire.

    The test: If you can't articulate the specific business outcome the role will drive in the first 90 days, you're not ready to hire for it. You're reacting to board pressure, not solving a business problem.


    The Skills That Actually Matter (And How to Build Them)

    Spoiler alert: AI isn't a skill. It's a tool. Let's forget the buzzwords and focus on what actually differentiates high-performing teams in AI-augmented environments:

    1. Complex Problem Solving (Not Just "Critical Thinking")

    AI can analyze data and suggest solutions. Humans still need to decide which problems are worth solving.

    What this looks like: A recruiter using AI to screen resumes still needs to know when the algorithm is optimizing for the wrong criteria. A finance analyst using AI forecasting still needs to spot when the model's assumptions don't match market reality.

    How to build it: Scenario planning exercises. Root cause analysis training. Post-mortems on decisions where AI recommendations were wrong.

    2. Emotional Intelligence (Especially in High-Stakes Conversations)

    As AI handles transactional interactions, the work that remains is relationship-heavy: negotiation, conflict resolution, change management, leadership.

    What this looks like: AI can draft the rejection email, but a human needs to deliver bad news to a senior candidate. AI can flag a performance issue, but a manager needs to have the difficult conversation.

    How to build it: Coaching, role-play simulations, and feedback loops. This isn't a workshop—it's ongoing development.

    3. Digital Collaboration (Beyond "Using Zoom")

    Remote work and AI tools have made async collaboration the default. High performers know how to communicate clearly in writing, manage projects across time zones, and use tools to amplify their work without creating workslop: AI-generated content that looks polished but lacks substance.

    What this looks like: Knowing when to use AI to draft the first version vs. when to write it yourself. Knowing which decisions need a meeting vs. which can be handled async. Knowing how to give feedback that improves outcomes, not just compliance.

    How to build it: Communication frameworks (BLUF, pyramid principle), tool training, and ruthless process simplification.


    Three Questions Before You Post Another "AI Role"

    Before you create a new job requisition, answer these:

    1. Is this a role or a skill?

    If the work can be taught to your existing team in 3-6 months, it's a skill. Don't hire for it—train for it.

    2. Will this role exist in 3 years?

    AI capabilities are evolving fast. "Prompt Engineer" might be obsolete by 2027. If the role is tied to a specific tool or technique (not a business outcome), think twice.

    3. What does success look like?

    If you can't define measurable outcomes, you're not ready to hire. "Help us with AI strategy" isn't a job description. It's a consulting engagement.


    How We Help Organizations Navigate This

    When a client tells us they need to "build an AI workforce," we start with a diagnostic. Here's what happens:

    We map your workflows

    Not what the process doc says. What people actually do.

    We assess your AI readiness

    What tools do you already own? What's actually being used? Where are the gaps vs. the shelfware?

    We define your human-value work

    The work that requires judgment, relationships, and strategic thinking. The stuff AI can't do (or shouldn't).

    We build your talent roadmap

    Not a hiring plan. A mix of training, reorganization, and selective hiring tied to specific outcomes.

    Recent results:

    • Financial services client: Avoided $3M in "AI hiring" by upskilling existing teams

    • Manufacturing client: Reorganized workforce around AI-augmented roles, improving productivity 35%

    • Retail client: Built flexible talent strategy (core + contractors) that adapted as AI capabilities evolved


    Start Here

    Many companies will spend millions hiring "AI specialists" over the next 18 months. Half of those roles won't exist in three years. The other half will sit unfilled for six months because no one knows what success looks like.

    If you're an HR or business leader who suspects your AI hiring pressure is actually a training problem, or you're simply tired of posting job reqs for roles you can't define, we should talk.

    We help you figure out what work actually needs doing, whether your existing teams can do it with training, and when you genuinely need to hire versus upskill.

    Schedule Your Strategy Session