Part 3 – AIs and Recruiting: AI-Produced Resumes, Cover Letters, and Content | mOp-Ed

By Mark Oppenheim


Our two-part AI and Recruiting article seemed enough for a quick read on the topic. We’ve received positive feedback on how it helps boards and leaders navigate the new recruiting challenges introduced by various recruiting platforms. We thought we were done.

Then a candidate sent us a submission by mistake.

A candidate pasted 10 pages of their dialogue with an AI into their emailed response to our sourcing outreach. It seems that the candidate was having this AI produce submissions for our search… and also for a number of other searches conducted by others. Somehow we got it all. It provides rare insight into how AIs can author material that candidates present as reflecting their thinking, style, knowledge and accomplishments.

Included in this email were attempts to draft material for different roles, in different cities, in different regions of the United States. There were AI-shaped entries for cover letters and AI-shaped entries for recruiting platforms. The AI was also asked to spice up particular parts of the resume. While we increasingly see content that is obviously an AI-shaded version of the truth, we rarely see its construction.

There is never an asterisk to show that such material was actually constructed by AI. Instead, it’s left to us as analysts and investigators to move beyond candidates who rely on a combination of AI-generated nonsense and their ability to charismatically charm and spin a tale through Zoom and in-person interviews.

Let’s state the obvious here:

AI-generated resumes and cover letters market candidates by relying on an untrue representation of that candidate’s sector knowledge, accomplishments, style, and how they express themselves.

Maybe the AI only corrects the candidate’s incorrect grammar. Maybe the AI corrects grammar and adds buzzwords and phrases harvested from other people’s resumes. Maybe the AI does that and also adds sector-specific points and framing that the candidate doesn’t know. Sometimes an AI will add a point that is untrue or outside of that candidate’s experience. We daily encounter different shades of inauthentic material as we execute searches.

There is an old-fashioned term for candidates who must rely on computers to explain what they theoretically know and have accomplished: the empty suit.

We don’t want to embarrass anyone. The narrative below is compressed and shared in a way that removes identifying information, but in our records we have documentation of the full interaction.

This is how it starts…

AI: Hey there, great to meet you. I’m Pi, your personal AI. My goal is to be useful, friendly and fun. Ask me for advice, for answers, or let’s talk about whatever’s on your mind. How’s your day going?

Candidate: “Write a creative and compelling cover letter for this job…(using the following information).”

The candidate then cuts & pastes a whole section from one position description, and goes on to do it again for another position.

The AI generated this buzzword-laden text:

“I am thrilled to apply for the xxxxxxx position, where I can bring my expertise in leading diverse teams, developing innovative strategies, and delivering exceptional experiences across multiple platforms. My career has been a dynamic journey, where I’ve embraced the power of data-driven insight…”

We’re increasingly seeing this kind of buzzword-laden, clichéd, and completely uninformative material. Ready-to-work job seekers are encouraged to run their resumes through buzzword generators in the hope of increasing their LinkedIn search scores. Such characterless characterizations in resumes advance the clicks & views metric valued by platforms, but in the process candidates remove what is distinctive about themselves and replace it with the generic and inauthentic.

The candidate then asks the AI:

“Rewrite this LinkedIn headline to be more dynamic and eye-catching, using no more than 120 characters…”

The candidate pastes in some generic text and there is a back & forth with the AI, including on which emojis the AI selects for such phrases as “extraordinary results” and “unleashing talent.”

Then comes a back & forth on what image of the candidate the AI would select for a LinkedIn profile and how the AI would edit the image. This raises a question: how would an AI edit the image if its analysis indicated that candidates with a different look would have an advantage?

Next is a request that the AI rewrite an “About Me” section. There is exquisite irony in having a computer compose an “About Me” entry.

The next sequence of interactions is all about cover letters…

The candidate asks the AI to reshape and rewrite cover letters. Again this is kicked off by pasting in some text constructed by the candidate along with information from various position descriptions. In this instance, the candidate instructs the AI to produce a cover letter with “…a creative, inspiring tone.”

The result is a set of computer-generated one-page cover letters that reference, adjust and add to various text elements included in the original prompts. The AI version improves the grammar and adds buzzwords, terms of art and adjectives. The AI provides some emotionally satisfying language about innovation, vision, entrepreneurial spirit, being forward thinking, etc. The AI even introduces certain phrases that are specifically suited to the kind of organization and sector in question. When a candidate does this spontaneously, it’s a marker of alignment to a nonprofit’s operating environment. When a computer does this for the candidate, it’s just fake.

This whole approach advances the pretense that the cover letter comes from the candidate and reflects their thinking and sensibility. When conducting a search, we intensely cross-examine candidates and compare information and language used during deep-dive interviews with language in written material. It’s meticulous work. Most other readers will never know that materials are in fact computer-generated… that is, until the candidate is hired and must think on their feet and write for themselves. This is one reason we’re seeing an increased number of candidates with very short career stints.

How this affects hiring and a job search

We’re quickly getting to the point where we should all be intensely skeptical of written content and computer-generated information. We just can’t be certain that the information is true or complete. Sometimes the boards and CEOs of our nonprofit clients will ask finalists to provide written work. It might be on a question of strategy, a discussion paper on a particular topic, or an outline of a workflow. Our past assumption was that the written material reflects a candidate’s knowledge, their ability to communicate and some of their style. This assumption does not hold today. It is becoming impossible to determine the extent to which such work is authored by AI or by a human. There is just no way to be sure.

There is another, equally important point: if candidate pools are sourced through AI platforms, we can’t know that we’re seeing the best candidates. We see what platforms want us to see. We also can’t know what information is being collected about our interests, and how that intelligence on our behaviors is being sold and monetized. It’s all a black box.

Distrust and skepticism are rational responses.

In Part 2 of this article I discussed approaches to get to the heart of how to determine whether candidates have the skills, knowledge, style and track-record indicated by their materials. I won’t repeat all of the points here, but suffice it to say that it involves a lot of detailed person-to-person interaction, deep dives into material, and a series of interviews for each candidate.

The real question we must ask is this:

If hiring is based on AI-created information, what are we getting once that leader is in the executive’s chair?

__

To receive a call to discuss these and other nonprofit leadership matters with the author, email Mark Oppenheim at marko@moppenheim.com.
