Make Job Posts Shine
of business leaders are dissatisfied with the speed of their company’s hiring process
Source: Harvard Business Review Analytic Services
THE PROBLEM
Hiring requires around-the-clock paperwork for recruiters, and crafting text from scratch is tedious—even more so if you want stellar talent to take notice. Plus, content that misses the mark can exacerbate inefficiencies in a hiring pipeline: A too-generic job post could invite a slew of unqualified applications, while a poorly written one risks disenchanting prospective candidates and drawing out the timeline.
HOW AI HELPS
Wherever there’s a text box, AI can aid hiring teams by producing a first draft in an instant. “It solves the blank-sheet-of-paper problem by helping you brainstorm,” says Stross. Take a new job post: AI saves the recruiter time by ideating skills or personality traits best suited to an open role, perhaps proposing edits to make the description more enticing to target applicants or even preparing relevant interview questions for later on.
Greenhouse’s software suite includes similar content-generation functionality: With its Sourcing Automation tool, for example, HR professionals can ask AI to help draft an engaging job description or finely tailor outreach emails to specific roles, candidate subsets or business goals—all in a consistent brand voice.
Strip Resumes Of Bias-Inducing Details
of employers believe their resume-parsing system screens out highly qualified candidates
Source: Harvard Business School
THE PROBLEM
Recruiters use “resume-parsing” software to automate the scanning and sorting of applications. But conventional parsing tools rely on outdated recognition technology that can’t account for nuance and formatting discrepancies in inbound raw data. This makes it difficult to compare people fairly and risks overlooking strong applications: The software might reject an ace resume that lists an obscure college, for instance, or one that lacks the specific job-related keywords it’s programmed to recognize.
HOW AI HELPS
“While at first it may seem counterintuitive, AI can actually reduce bias in hiring,” says Henry Tsai, Greenhouse’s chief product officer. AI’s advanced natural language processing provides the deeper context required to accurately interpret each resume, ensuring job seekers are judged equitably rather than being unfairly excluded due to unusual layouts or industry-specific jargon.
“Greenhouse AI-powered data analysis captures the candidates’ intent behind the resume terminology, leading to higher categorization accuracy across similar roles and harmonizing job skills across resumes,” Tsai explains. For a more standardized and inclusive evaluation cycle, Greenhouse’s AI pulls from a large language model (LLM) to categorize and distill an avalanche of applicant data—regardless of resume formatting, language or complexity—and its Resume Anonymization feature promotes more uniform comparison of candidates by obfuscating specific personal details that could inadvertently influence recruiter judgment.
Solve Scheduling Puzzles In Seconds
of leaders believe optimizing hiring with automation and AI is key to long-term business success
Source: Harvard Business Review Research
THE PROBLEM
A recruiter’s daily grind includes an assortment of repetitive tasks, which leaves little room for “the more strategic aspects of their role that require their expertise and interpersonal skills and ultimately drive impactful cross-functional alignment,” says Ariana Moon, Greenhouse’s vice-president of talent planning and acquisition. Scheduling a series of candidate interviews, for example, can feel like an impossible Tetris grid requiring hours of tricky calendar negotiations.
HOW AI HELPS
Swap a manual HR workflow for an AI-assisted environment and a hiring manager can prompt AI to tackle basic administrative duties, perhaps identifying open interview slots across multiple calendars and time zones and sending invites to applicants in a flash.
Greenhouse’s AI product suite includes a similar scheduling feature and offers other automation capabilities to reduce the manual burden on recruiters. Its Report Assistant tool, for example, gleans intelligent patterns from the recruiting data an organization houses on the Greenhouse hiring platform and generates on-demand reports to answer stakeholder questions, forecast hiring needs and analyze recruiting performance.
AI Plays Sidekick While You Call The Shots
“AI shouldn’t be used as a one-stop shop for making hiring decisions, nor should it completely remove human intervention and interaction.”
Ariana Moon, Vice-President, Talent Planning & Acquisition, Greenhouse Software
THE PROBLEM
AI unlocks once-in-a-generation possibilities in hiring, but it’s not a panacea. “AI algorithms are only as good as the data they’re trained on,” Moon says. “If data is incomplete, biased or incorrect, AI’s recommendations could perpetuate and amplify existing bias and misinformation.” For ethical AI to thrive, it requires intentional implementation that leaves room for humans to oversee end-to-end hiring practices.
HOW AI HELPS
“In recruiting, AI should be viewed as an assistant that works alongside human intervention and decision-making, rather than replaces it,” Moon says. “The nuanced assessment of interpersonal dynamics, cultural alignment and a candidate’s ability to integrate into an existing team requires a hiring team’s expertise, oversight and judgment.”
Stross explains Greenhouse might use “some magic under the hood” to help employers determine how to narrow down an applicant pool—by brainstorming 10 ways to filter 500 applications, for instance. “AI might look at a job description and say, ‘It looks like you care most about these three things; do you want to use those as filters?’ But we’re not going to have a secret algorithm making judgments about people,” he says. Recruiters should remember their AI assistant is still in training: Which candidates to pick ultimately rests on the human’s shoulders.
Let AI Take Notes So You Can Focus
of HR professionals cited time constraints as a reason for not using more data in their decision-making
Source: myHRfuture
THE PROBLEM
Data is the centerpiece of smart hiring, but it needs to be aggregated first. Even a single hiring cycle generates a vast amount of information for businesses to collate and analyze. While recruiting teams have countless data signals at their fingertips, rarely do they have the bandwidth to draw actionable insights from them.
HOW AI HELPS
AI connects the dots for recruiters and acts as a sidekick by summarizing stacks of data throughout an applicant’s journey. Interviewers can tap into AI-powered transcription to focus less on taking notes and more on being present with interviewees and absorbing their responses. Post-interview, a hiring team could ask AI for transcript summaries and objective candidate feedback.
AI’s ability to quickly synthesize conversations improves not only the candidate experience but fairness too. Tsai says Greenhouse’s software will eventually leverage AI’s sentiment analysis to flag subjective statements or problematic language during interviews or in internal debriefs about applicants—like a recruiter judging someone based on ethnicity or gender. “We’re exploring the various ways summarization of interview feedback can assist interviewers in conducting effective roundups,” Moon says, “whether by removing recency bias, surfacing feedback gaps or misalignment or making recommendations on offer packages.”
Treat Candidates Like Customers
“There’s an opportunity in this moment for companies,” says Stross. “If you treat candidates fairly, if you’re transparent and you don’t ghost them, you’re prepared and ask good questions, you’re considerate, ask how to pronounce names, what their pronouns are … you stand out.”
Solve With Structure
To mitigate traditional hiring’s flaws with AI, organizations need to get concrete about talent goals. Form a rubric for what you’re looking for, advises Stross. “What are the important skills, personality traits, experiences, credentials and values? Create a clear interview plan to assess that” and put each candidate through the same journey, he says. “The way bias slips in is by running an unstructured process.”
Observe And Flex
AI’s evolving tech calls for adaptable policies. Be prepared to iterate alongside AI legislation and continually monitor and adjust AI models, says Moon. “At Greenhouse this means regularly auditing the performance of our AI tools and assessing impact on outcomes like hiring metrics and candidate experience.” Stross also recommends collaborating with legal and security teams to define guidelines that “unleash innovation in a safe way and create a culture of experimentation.”
Explain Your Approach
“One of the most foundational ways an employer can uphold a thoughtful candidate experience is by being transparent about how they are and aren’t incorporating AI within their hiring processes,” says Moon. Demonstrate ethical AI principles with “explainable reasons for how you’re narrowing the candidate pool,” adds Stross, pointing to Greenhouse’s own guidelines as an example of how companies can be crystal clear about their practices.