Part 1 – How AI Affects Recruiting | mOp-Ed

By Mark Oppenheim


Part 1 describes the pitfalls of over-reliance on AI and algorithmically driven recruiting platforms.


Mid-interview, we began to suspect that the candidate hadn’t created their own resume.  So I asked…

“Did you compose this resume?”
“How was it created?”
“Through a consultant who used ChatGPT and other AI tools, then I edited it – looks good, doesn’t it?”

The AI-generated resume indeed “looked good” and had all the right buzzwords.  The candidate had great presentation skills – charismatic, articulate, high in emotional intelligence, connecting well and making listeners (me and a colleague) feel good about ourselves and about the candidate.  Written correspondence was fine.

Then we put aside our positive impressions to examine evidence and unpack the workflows required by the various past roles described in the resume… and the story began to fray.  Resumes in general don’t present workflow detail – it’s left to recruiters to know those workflows, unpack the detail, and analyze a candidate’s answers in the context of how different organizations function.  In this case the workflows didn’t seem connected to the claimed outcomes.  There were process gaps.  The operating attributes and constraints that inform real-world decision-making didn’t seem to connect to the detail on program, financial, revenue, development and operating decisions.  The positive results claimed by the candidate seemed to materialize magically rather than through planning.  There were too many unconnected dots.

So I asked, “Did you compose this resume?”

This first happened during a search for the CEO of a large arts organization.  We’ve since had similar encounters in CEO, CFO/COO and development searches for human services and education nonprofits, and we expect it to happen with increasing frequency in other sectors.  In the similar cases we’ve experienced, attributes described in candidate resumes that at first seemed aligned to the needs of our clients have turned out to be hollow or less than claimed.  We determine this through detailed, iterative questioning, and by carefully collecting data on candidates the old-fashioned way: through person-to-person intelligence gathering.  We never assume that online resume data is accurate or presented in context.

How recruiting platforms work

Recruiting platforms driven by algorithms and AI increasingly encourage users to shape resumes in ways that serve platform interests.  They do so guided by data amassed from two sources: other people’s resumes and job descriptions for particular roles.  These same platforms then respond to queries from organizations by selecting those candidates whose resumes have the AI-favored structure and content.

A consequence is that resumes and job descriptions for different people and organizations start to look similar.  They use similar keywords, phrases and points, and the platforms encourage users to avoid data that is too individual or contextualized, which would make it harder for the platforms to deliver large clusters of resumes and job descriptions in response to queries.

User searches for jobs and candidates are also designed to maximize intelligence-gathering interactions.  “Give me more intelligence on you, and I’ll give you more data that seems useful.”

Over time, these platforms tilt outcomes to those using the platforms most, and there is a subtle shift in the employee recruiting and retention ecosystem to favor platform interests. It is in the platform’s interest to incentivize users to equate success with the highest possible volume of data exchanged.  It is also in the platform’s interest to disadvantage organizations and individuals who share less data with the platform.

If a platform shapes and then selects resumes, does this kind of recruiting yield the best leaders for organizations and the best career options for candidates?

What Are Algorithms and What Is AI?

Algorithms are rules.  It’s that simple.  The rules are used to solve a problem, provide an answer or make a decision.  A rule or algorithm can take different forms.  When you click on a YouTube link, the advertisement you see is based on intel collected about you and what advertisers pay for your gaze.  Algorithms determine which advertisement to display, and how long the advertisement plays before you can skip to what you really wanted to see.

AI combines many rules and large amounts of data nearly instantaneously, with the program’s selection of rules and data guided by other algorithms.  AI can select which algorithms to use and the kinds of logic and math to apply, and can create new rules and logic on the fly.  When AI software is initially trained, or when it later responds to a query entered by a user, the data and instructions it receives are in addition to the original programming.  But the original programming is central.

There are four things to keep in mind about AI and algorithmically driven recruiting platforms:

  • Your EVERY interaction with a platform is optimized for the profit or advantage of the investors, organization or country that controls the platform.
  • Platforms interacting with users self-adjust to further incentivize users to behave as the AI’s owners want users to behave. We’re all being conditioned by these platforms to respond as their programmers wish us to respond.
  • Answers that algorithms and AI present to users do not need to be true or in a user’s best interest.  An AI can be programmed to yield precise, fuzzy, incorrect or manipulative answers, whatever best advances the interests of platform owners.
  • A lot of energy goes into convincing users that the AI is working on their behalf and to their advantage. Highly developed AIs can even use different paths to deliver completely different answers to the same question posed multiple times.  This makes the AI more relatable to users, but every answer provided is intended to be consistent with the AI owner’s objective for user behavior.

In other words, the platforms are specifically designed to get us to think certain things and do certain things that advance the interests of platform owners.

Certain tech platforms are trying to use algorithms and AI to control how recruiting and retention unfold in the marketplace.

A movement is underway by large tech companies to reshape recruiting so that algorithms and AI are at the center of recruiting, retention and other HR processes.  If computers are placed at the center of all HR activities, it can be tremendously profitable to those who own the computers.

This isn’t “Software as a Service” or SaaS, although it is often positioned as such.  Buyers of SaaS programs purchase software to perform functions they need, on data they own, for purposes they define.  In contrast, a new generation of recruiting platforms has a financial model in which the interests of platform owners and users diverge in specific ways that are hidden from users.

First, the platforms collect, aggregate, analyze, sell and otherwise monetize your business and personal data.  Their workflows are not designed to be efficient time-savers for business and individual users; rather, they are optimized to collect as much intel as possible for resale.  Once the data on individuals and organizations is collected – their use patterns, interests, personal information, relationships with others, hiring needs, etc. – the collected data is controlled and monetized by the platforms.

Second, the platforms have a vested interest in optimizing employee turnover in ways that maximize platform profitability, not the profit or career paths of users.  Guided by a platform’s profit model, platform algorithms can determine when an individual has been in a position long enough to optimize platform profitability, can prompt that individual to consider alternatives while sharing more of their data with the platform, and can connect that individual to businesses that pay the platform most.  The platform can also collect information on business behaviors and their employees, analyze hiring patterns to predict business strategies and needs, identify competitors, and can sell intel from each competitor to others in various forms.

In Part 2, we’ll discuss these and other impacts that AI and algorithmically driven recruiting platforms have on organizations, careers and the kinds of talent that are advanced.


mOp-Ed pieces reflect diverse opinions about the nonprofit world and we welcome yours. If you would like to be a guest writer for mOppenheim.Org, please contact us for more information.
