Ready Or Not

There’s been a lot of discussion lately about the proliferation of AI. Certainly, the ongoing development of machines and software that can learn from and apply their data, speed up problem-solving, and attempt to mimic human behavior isn’t going anywhere anytime soon. There’s no denying that AI is reaching mass acceptance, due in part to its rapid growth and daily use in a wide variety of applications, though not always with great success.


In some ways, AI is a bit like the 800-pound gorilla: it needs to be fed. The more relevant content you feed an AI system to analyze, compare, and use to define the criteria necessary for its task, the more accurate and believable the results. Some of the time. Garbage in, garbage out. But it takes more than feeding the system content; it takes ongoing tweaking to help the system understand exactly what you want it to extract from all that content.


AI may be a great career path for some. Companies in this field are excited by the opportunities for growth, and newcomers to the industry are opening up shop daily.


But can you depend on AI in your job search?


As of this writing, most reasonably intelligent and astute HR professionals should be able to detect resumes and cover letters that have been created using AI tools such as ChatGPT. But the human ability to recognize the sometimes not-so-subtle differences between documents created with AI and those created by living, breathing humans is diminishing by the day.


Resume filtering systems, rudimentary AI by today’s standards, have long been used to help employers determine which applicants most closely match a defined set of criteria for specific positions. Submitted resumes and cover letters were electronically scanned for a particular set of keywords and phrases related to the jobs being filled. These systems were mostly used by large companies because of the number and variety of positions they needed to fill and the time saved over staff doing the same tasks manually. Today, AI systems can sift through thousands of submissions, comparing them against entire job descriptions and any other details an employer deems important in its hiring. It remains questionable, however, whether either approach does a great job at the task.
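For illustration only, here is a minimal sketch, in Python, of the kind of keyword matching an early filtering system might perform. The keywords, the sample resume text, and the scoring are all hypothetical, invented for this example; real applicant tracking systems are proprietary and considerably more involved.

    # Hypothetical keywords an employer might lift from a job announcement.
    JOB_KEYWORDS = {"project management", "budgeting", "scheduling", "stakeholder"}

    def keyword_score(resume_text, keywords):
        """Return the fraction of keywords found in the resume text."""
        text = resume_text.lower()
        hits = sum(1 for kw in keywords if kw in text)
        return hits / len(keywords)

    sample_resume = "Led project management and budgeting for a ten-person team."
    print(f"Matched {keyword_score(sample_resume, JOB_KEYWORDS):.0%} of the keywords")
    # Prints "Matched 50% of the keywords" for this sample resume.

A filter this naive simply rewards resumes that parrot the job announcement, which is part of why keyword matching has always earned mixed reviews.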


For a job seeker to really benefit from the use of AI in preparing their documents, they would need to “train” the system by uploading thousands of quality documents focused on the same or similar positions they are seeking. There is absolutely nothing practical in this approach. The time-suck required to pull it off is prohibitive, and the results may or may not reflect an applicant’s ability to perform the role the employer is looking to fill. A serious job seeker should be able to turn out a respectable resume in less than two hours, especially if they are using one of the many trusted non-AI resources on the subject as a guide. And once you have at least one decent resume to work from, any additions, changes, or deletions for your next submission should take minutes, not hours.


Further complicating reliance on AI for a job search is that two employers, though hiring for ostensibly the same type of position, may use very different descriptive language to convey what they want. The fine-tuning required for AI to produce resumes that accurately reflect each posting would be a distraction from the actual job search. Job announcements are usually, though not always, the “best” summary of what an employer is looking for and the language that should be used to focus an applicant’s submission.


To illustrate that AI systems are far from perfect, researchers at the University of Washington demonstrated that some AI tools used by employers have shown bias against people with disabilities. Resumes that mentioned disability-related training, certifications, and awards were frequently downgraded, reducing otherwise qualified applicants’ chances of being hired. If AI systems routinely misinterpret or omit valuable information, they should not be used in an employer’s assessment of a candidate’s worthiness.


There have also been reports of AI chatbots being used to interview prospective hires. In such a scenario there is no opportunity for the human interpretation of vocal inflection, eye contact, or body language (yet, but that will happen eventually!). Human interaction is necessary for an employer to thoroughly discover an applicant’s value. There are still too many variables that AI systems can’t understand for them to be the last word in hiring.


Admittedly, I’m not crazy about the trend toward machines deciding who gets to fill which jobs. There is so much more to every candidate than what’s in their documents. No matter how strong their work history, how good their skills, or the quality and prestige of their education, no job seeker should be assessed by what’s on paper alone. Nor should that assessment be made by a machine that can’t weigh the content and context of what’s on paper together with the value of the human behind it.


While the evidence is clear that AI may not be ready for full-time use by either employers or job seekers, its inroads over the last few years are significant and gaining traction. The next few years of AI development will certainly put these tools in the hands of more job seekers, and of the HR departments that will come to depend on them for candidate selection. Ready or not, here they come!


#      #      #

