We’re seeing more AI-generated résumés and job descriptions than ever, and frankly, it’s not pretty.
Used thoughtfully, AI can be a smart and efficient starting point. It can help generate a first draft or polish specific sections. But when it’s used too heavily, or without a human hand guiding it, it creates real risk. Strip out the human element, and you’re often left with documents that are inflated, inaccurate, or buried in a vague word salad of corporate-speak.
For both sides of the hiring table, the stakes of an AI-only approach are high.
When the Algorithm Goes Rogue
Recently, we reviewed a job description for a company hiring its first in-house lawyer. The opening was strong and well-defined. But the next section directly contradicted the goals the company had discussed with us. By the third paragraph, the description had turned into a wall of keyword-stuffed fluff. It was clear AI had been left to fill in the blanks without a strategic eye guiding the outcome.
We saw a similar issue from the candidate side just last week. A résumé was packed with generic language that felt disconnected from the candidate’s actual career path. When we asked about it, she admitted she had used AI to “optimize” for automated screeners. It may have helped her get past an algorithm, but it fell flat with us, the humans responsible for helping shape her next career move.
The Pye Perspective
AI is a powerful tool, but it still needs your brain and your specific story to finish the job.
Whether you are drafting a job description to build a new team or updating your résumé for a move into the in-house market, remember: AI can help you start, but only you can make it real.