Wednesday, May 30, 2007
It's one of those business books that's both entertaining and enlightening. As you know, there is a LOT of "fluff" out there--this is not one of those books. The authors are all about using good data and experimentation to discover what really works, not just what sounds good or what someone else recommended.
Case in point: the "War for Talent." This phrase gained popularity in the late '90s through several McKinsey consultants and their book of the same name. Those authors argued that the best-performing companies had a deep commitment to (obsession with?) finding and promoting talented individuals, and offered data that claimed to support a link between this mindset and firm performance. But as Pfeffer and Sutton point out, a closer look at the data raises some eyebrows. Specifically, talent management practices were measured AFTER performance measures, resulting in a classic case of correlation-causation confusion.
Certainly Pfeffer and Sutton aren't the only ones to raise concerns about a talent obsession, but they do so in a very accessible and thorough manner. They highlight three poor decision-making practices that apply to talent management (as well as many other issues):
- Casual benchmarking (for example, the failure of "Shuttle by United" to copy Southwest Airlines' success, or U.S. automotive companies attempting to copy Toyota's success). We see this in our field when folks want to know "how other people are recruiting" or "what test everyone else is using." Good information to know, but look before you leap.
- Doing what (seems to have) worked in the past (for example, using incentive pay in your new organization because it seemed to work at another one). The best example of this is managers who just know their interview questions about favorite books and deserted-island companions work--even though they have no data to support that view. In my experience, about 20% of managers are good interviewers (and I place a lot of the blame on HR).
- Following deeply held yet unexamined ideologies (for example, equity incentives, the so-called "first-mover advantage", and merit pay for teachers). In our area this includes things like believing applicant tracking systems always result in improvement, or that integrity tests are more discriminatory than other types of tests.
So how do we apply these lessons to recruitment and assessment? Here are just a few ways:
1. Be a critical thinker. We know we're supposed to eye HR metrics with some skepticism, but do we? Do we adopt "best practices" without thinking about how our organization might differ in important ways? Are we lured by shiny new pieces of technology without asking ourselves whether we might be better off without it? On the flip side, do we resist new ways of doing things without even considering the possibilities?
2. Know the evidence. HR is not guess work--we know a lot about what works and what doesn't. Every HR practitioner and manager should read Rynes, Brown, and Colbert's "Seven common misperceptions about human resource practices", with a more detailed analysis here.
3. Push back when you hear something that sounds too simple or too good to be true--it probably is. Two examples: behavioral interviewing does not solve all of our assessment problems, and social networking sites will not solve all our recruiting problems.
4. Model evidence-based decision making. Make it clear that you are making decisions based on the best data you can find or gather, and that this is an expectation for everyone. Rather than rushing into a decision, take the extra time to gather whatever information you can get your hands on--as long as it doesn't lead to paralysis by analysis.
5. Do experiments whenever possible. Include an assessment instrument as a research-only tool and see if it predicts performance. Try out different advertising methods and mediums and track applicant numbers and quality. Did you know Yahoo! typically runs about 20 experiments at any time, changing things like colors and the location of text? We can't all be Yahoo!, but we can all be experimenters.
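That last point about experimentation can be made concrete. Here's a minimal sketch of how you might compare two job-ad variants--say, two different postings for the same position--using a simple two-proportion z-test. All the applicant counts below are invented for illustration:

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-proportion z-test: did ad variant B draw a higher
    rate of qualified applicants than variant A?"""
    p_a, p_b = success_a / n_a, success_b / n_b
    # Pooled proportion under the null hypothesis of no difference
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical numbers: variant A drew 40 qualified applicants out of 200,
# variant B drew 65 out of 210.
z = two_proportion_z(40, 200, 65, 210)
print(round(z, 2))  # z above ~1.96 suggests a real difference at p < .05
```

Under these made-up numbers the z comes out around 2.5, which would suggest the difference is unlikely to be chance. The point, though, is the habit of testing, not the particular statistic.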
Some of my favorite quotes from the book...
"If doctors practiced medicine the way many companies practice management, there would be far more sick and dead patients, and many more doctors would be in jail."
"The fundamental problem is that few companies, in their urge to copy--an urge often stimulated by consultants who, much as bees spread pollen across flowers, take ideas from one place to the next--ever ask the basic question of why something might enhance performance."
"Instead of being interested in what is new, we ought to be interested in what is true."
"There is really only one way around this reluctance to confront the hard facts, and that is to consciously and systematically understand the psychological propensity to want to both deliver and hear good news and to actively work against it."
Tuesday, May 29, 2007
First post here.
Second post here.
Highly recommended for employers covered by OFCCP laws/regulations.
Monday, May 28, 2007
What is it? It's a free browser plug-in/toolbar (for IE or Firefox) that serves several purposes. One is simply as a quicker way to access LinkedIn content. But the much cooler feature is that when you're looking at jobs on Monster, CareerBuilder, HotJobs, Craigslist, SimplyHired, Dice, or Vault, a separate window comes up that notifies you if anyone in your LinkedIn network works for the organization and allows you to contact them to help with making the right connections.
Let's look at an example. I went to SimplyHired and looked up jobs working for Apple in Sacramento, CA. When I click on any of the jobs that come up the JobsInsider window pops up and tells me 207 people in my LinkedIn network work for Apple, and two are friends of my connections. I can click on the link and it takes me to a description of those people. Click on any one person and it tells you how you're linked to that person. Here's what it looks like:
Not only that, but (at least with SimplyHired), when job search results come up, you can click on "who do I know?" for each position to have LinkedIn search your network.
Pretty nifty, huh? So why do we care, other than it being a nifty little piece of technology?
For one, it's another reason to be a member of LinkedIn--at least if you're interested in being contacted by applicants. Given the choice between pursuing a job somewhere where I don't know anyone and a place where I can make a contact, I'll take the latter.
Second, it's a good way to double- (or triple-) check credentials of applicants. Most of these networking sites strongly encourage you to put in your educational background and job experience. If what's listed here doesn't match the resume or application they submitted to you, that's something to follow up on. Could be a simple explanation, could not be.
Finally, another reason to care about this is it's likely a sign of things to come. With meta-people search sites like ZoomInfo out there, and ones like Spock coming on board, we need to be very comfortable with our on-line identities and understand how they link to other people.
One last cool feature of the LinkedIn toolbar. When you open an e-mail in Gmail, Yahoo! Mail, MSN Hotmail, or AOL, you automatically have the option to get someone's LinkedIn information or invite them to your network. In fact there's even a tool that will do the same thing for your Outlook mail.
Happy Memorial Day!
Friday, May 25, 2007
There are so many recruiting blogs that there are lists devoted to keeping track of them, and new ones are announced practically every day.
There are so few blogs focused on assessment that I have a hard time thinking of them. The only 100% "true" blog (besides this one) is Jamie Madigan's SelectionMatters--and he just started back up again (thank goodness) after a long hiatus. Okay, Dr. Michael Mercer has one but it doesn't allow comments. Yes, Michael Harris does a great job with EASI-HR Blog, but assessment is just one of the topics he covers. Ditto for Alice Snell's Taleo Blog. Charles Handler has the awesome Rocket-Hire newsletter, but that's periodic and doesn't allow for (as far as I know) reader comments.
Then there's PIOP, a message board that's been around for a while, but it's focused primarily around students. Perhaps the recently established iocareers.com will go somewhere but it too seems focused on students--which isn't a bad thing, but it limits the reach.
So what's up? Why the difference? I have some theories:
1. Recruiters like to hear/read themselves. Yes, that's undoubtedly true. But do you know any assessment professionals that DON'T?
2. Recruiters use them to market themselves. I think this is a pretty compelling possibility. Still, if true, why aren't more assessment professionals? There are plenty of assessment consulting outfits.
3. Recruiters are earlier adopters of technology. Assessment professionals are, IMHO, a conservative bunch when it comes to technology adoption. They've seen things come and go, and don't want to jump wholeheartedly into a possible flash-in-the-pan. Prime example: SIOP's excellent news articles are not syndicated.
4. Recruiters are more focused on the business side of things, assessment folks more on research. This seems likely for I/O Ph.D.s, but what about the rest of the assessment community? Are we still stuck trying to apply law and rules rather than creating networks and sharing best practices?
5. Assessment folks don't know about the blogosphere. I think there is some truth to this, as well. As a group, assessment professionals just don't seem to have taken to electronic forms of information sharing (the IPMAAC listserv being a notable exception), so I don't see why blogs would be any different. They seem to prefer conferences.
6. Assessment folks don't see the value of blogs. And some in the recruiting field don't either.
7. Recruiters are more extroverted than assessment types. I have absolutely no evidence to support this claim, which hasn't stopped me in the past and won't stop me this time.
Is it just me or is something (not) going on here?
Thursday, May 24, 2007
According to this post, changes include:
1 - Simplified search. It's less technical now and easier to find what you're looking for.
2 - Enhanced results--searches now include posts, blogs, photos, videos, podcasts, etc.
3 - Improved user interface (and it's still being tweaked), including a ticker at the top that tracks popular searches.
Why do we care about blog search engines?
- They're a great way to keep up on the latest news, research, articles, etc. in your field of interest. Blogs are oftentimes updated more frequently than other forms of media and are very targeted.
- They're a good way to identify potential applicants--people who are passionate about their interests and keep up to date on developments.
- For those of you considering starting a blog, or trying to differentiate your blog from others out there, they can be used to canvass the existing landscape and make comparisons.
- They can be used to see what's popular--blogs and searches.
- They can, sometimes, be used to gather information about applicants you're considering for hire. You may be able to find work samples or opinions/thoughts expressed by people you're considering--and you can use that information to follow up/verify in person.
- They're a great way to make connections and become part of the larger community.
Oh, and don't forget about other blog search engines (some people have a preference), such as Google Blog Search.
Wednesday, May 23, 2007
Employment interview structure and discrimination litigation verdicts: A quantitative review
Pool, McEntee, and Gomez analyzed 31 federal court cases from 1990 to 2005 (27 claims of disparate treatment, 7 of adverse impact; some cases included both) to see if there was a relationship between the amount of interview structure and verdicts in employment discrimination cases. Most cases (73%) were brought under Title VII and involved promotional decisions (65%). Race discrimination was the most common allegation (47%) and the vast majority of cases (84%) involved a single plaintiff. For both types of claims, the strongest factors associated with a victory for the defendant (the employer side) were having interviewers who were familiar with job requirements and having a guide for conducting the interview. In disparate treatment claims, defendants were more likely to prevail if they also had standardized questions and identical interviews for each applicant. In disparate impact cases, defendants fared better when they had evidence of validity (which makes sense given the burden shifting in these cases). Similar results to Williamson et al.'s 1997 study, but good data to have--see, we're not just saying standardize those interviews because we're sadistic HR folks.
Recruiting through the stages: Which recruiting practices predict when?
This meta-analysis by Uggerslev and Fassina of 101 studies looked at the impact that various "recruitment predictors" (e.g., job-person fit, job/organizational attraction) had on various outcome criteria (e.g., job pursuit intention, acceptance intentions). Results depended somewhat on the criterion, but perceived fit between the individual and the job/organization was across-the-board the strongest predictor. The only criterion that matched perceived fit was job characteristics, which tied for predicting acceptance intentions. The strength of the correlations varied, from a low of .15 between perceived fit and job choice to .47 between perceived fit and recommendation intentions. So how do we use this? The authors suggest efforts to increase the appearance of a good fit between the values and goals of applicants and those of the organization may pay off (I'm thinking, say, by focusing on aesthetics and message customization or clearly indicating what you're looking for).
Meta-analysis on the relationship between Big Five and academic success
Okay, so it's not directly about recruitment or assessment, but it's still interesting. The title pretty much says it all--the presenters (Trapmann, Hell, Hirn, and Schuler) were looking here at the relationship between Big Five personality traits and academic success. Results? As you might expect, it depends what you mean by "success." Neuroticism was related to academic satisfaction (hey, that's why they're neurotic, right?) while Conscientiousness correlated with grades and retention. The other three factors (Extraversion, Openness, and Agreeableness) were not related to success.
That's probably the end of my review of 2007 SIOP presentations, unless I manage to obtain more presentations. Stay tuned for reviews from the upcoming IPMAAC conference!
Monday, May 21, 2007
Using several different experiments, the researchers found that even a brief glimpse of the color red can lower scores on achievement tasks. For example, one of the experiments involving nearly 300 U.S. and German high school and undergraduate students found that simply looking at a red participant number (versus black or green) prior to completing an IQ test resulted in a performance decrease.
The authors hypothesize that the color red evokes an anxiety response which in turn interferes with the ability to complete the task. Where does the anxiety come from? Some possibilities, according to the authors, include:
- Evolution: we may be hardwired to respond to red (think of the association between red and aggression in nature)
- Daily life: red is often associated with warnings or commands (e.g., stop lights, stop signs, dash lights)
- School: who didn't cringe a little when they saw red marks on their essays or tests in school? Maybe you even have a supervisor who does this?
Lesson: be careful with using red in testing material. There's enough error out there being introduced in testing situations without worrying about color.
So...did this article make you nervous?
Hat tip: SIOP.org
Thursday, May 17, 2007
Several issues were discussed, including potential problems with specific screening methods (e.g., cognitive ability tests, credit checks), how the EEOC can better serve employers, and steps employers need to take in order to meet professional and legal guidelines (e.g., gathering validity evidence, investigating alternative methods with less adverse impact). Not for the first time, speakers emphasized that the Uniform Guidelines on Employee Selection Procedures need to be updated.
Speakers included EEOC staff members, plaintiffs in two of the more discussed recent cases (EEOC v. Dial Corp. and EEOC v. Ford Motor Co.), attorneys, and professionals in the field of assessment, including James Outtz and Kathleen Lundquist, who have frequently been retained as expert witnesses in employment discrimination cases.
Said Richard Tonowski from the EEOC:
"A mature technology of testing promises readily-available methods that serve as a check against both traditional forms of discrimination as well as the workings of unconscious bias. If that is the promise, then the threat comes from institutionalizing technical problems not yet fully addressed, the undermining of equal employment opportunity under the guise of sound selection practice, and the unintended introduction of new problems that will require resolution to safeguard test-takers and test-users."
Personality testing was mentioned prominently as an increasingly common practice among employers, but it appears (contrary to my earlier fears) that the focus was on those tests that could be considered "medical tests" under the ADA (such as the original MMPI), which leaves out many products, including the HPI, 16PF, and PCI.
Hopefully I'll have the slides from the presentation to post soon. In the meantime, check out this excellent summary from an attendee, and you can view the EEOC press release here. Statements of the speakers, along with their bios, can be found here, and it looks like the meeting transcript will be available there as well.
Wednesday, May 16, 2007
Membership is limited to "individuals who are serious about the field of I/O Psychology" but during the beta of the site, posting jobs is FREE. (After that it's $250 a pop).
Check it out.
Tuesday, May 15, 2007
As part of the appeal of MSPB's decision, the individual claimed that his "guaranteed right to fundamental fairness" was violated when the deciding official Googled his name and came across information regarding his work history (essentially termination) with a previous employer. The individual claimed this unduly influenced the decision to remove him.
The court disagreed. The three-judge panel found that because information regarding the previous job loss did not influence the official's decision to remove the appellant, it did not show prejudice. Additionally, they found no due process violation.
The unanswered question here is what would have happened if the prior work history information HAD influenced the termination decision. And what if it was a hiring or promotion situation rather than a termination where voluminous information already existed regarding bad behavior? It will be interesting to see how this area of law evolves.
The appeals court decision (nonprecedential) is here.
Good string of articles about using the Internet to gather background check information can be found here.
Saturday, May 12, 2007
Data gathered from 47 companies indicated:
- 55% of hires were made from online sources (+8% from last year).
- Employee referrals were the largest single source (21% of hires), followed closely by the organization's website and general job boards.
- Employee referrals also generated the highest quality candidates (82% rated favorable), but niche job boards and search firms tied for second, with campus recruiting a very close third. General job boards were rated favorable by only 22% of respondents.
- The largest percentage, by far, of recruitment/advertising budget went to general job boards (34%). Referrals, the source of the highest quality candidates, received 6% of the budget.
- Putting these numbers together, the source value (cost/hire) was highest, by a large margin, for referrals, followed by the organization's web site and, perhaps surprisingly, social networking technology.
Comments and follow-up conversations indicated a growing frustration with general job boards (especially for IT jobs) as well as a growing reliance on sources of passive candidates, such as social networks, blogs, and search engine optimization.
Read the full report for a much more detailed analysis and insights. Thanks to Rocket-Hire for making this available.
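To make the "source value" idea concrete, here's a rough sketch of the calculation: the share of hires a source delivers divided by the share of the recruiting budget it consumes. The referral percentages and the job-board budget share come from the survey above; the remaining figures are invented placeholders to round out the example:

```python
# "Source value" sketch: hires delivered per unit of budget consumed.
sources = {
    # source: (share of hires, share of budget)
    "employee referrals": (0.21, 0.06),  # both figures from the report
    "general job boards": (0.18, 0.34),  # budget share from the report; hires share assumed
    "company web site":   (0.19, 0.12),  # both figures assumed
}

for name, (hires, budget) in sources.items():
    value = hires / budget  # > 1 means the source over-delivers for its cost
    print(f"{name}: {value:.2f}")
```

Even with placeholder numbers, the pattern in the report falls out: referrals over-deliver by a wide margin, while general job boards consume far more budget than the hires they produce would justify.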
Thursday, May 10, 2007
Wednesday, May 09, 2007
Looking for ways to snazz up your postings?
Then read this post over at jobs2web. Check out the graphic.
How close are your postings and/or career portals to this? Are they even in the same ballpark?
How hard would it be to add things like:
- links to a webinar/job preview video
- RSS feed
- subscribe to similar jobs
Answer: not hard. Let's hurry up and get there!
Tuesday, May 08, 2007
Legal risks and defensibility factors for employee selection procedures
Posthuma, Roehling, and Campion analyzed nearly 600 federal district court cases and came up with some very interesting results:
- Employers are most likely to win (by far) when defending tests of math or mechanical ability. Employers also fare well when defending assessments of employment history and interviews.
- Employers did worst when defending physical ability tests and medical examinations. Tests of verbal ability and job knowledge were also more likely to result in a plaintiff win.
Predicting Internet job search behavior and turnover
Using a sample of 110 nurses in Texas, Posthuma et al. found using longitudinal survey data that (among other things) Internet job search behavior was related to turnover--folks weren't just surfing for fun. This suggests that organizations need to pay close attention to job searching behavior among employees; not necessarily to curtail it but instead to figure out why high performers want to leave.
Gender differences in career choice influences
After analyzing survey data from nearly 1,400 fourth-year medical students from two U.S. schools, Behrend et al. found a gender difference in preferred career: specifically, female medical students valued "opportunities to provide comprehensive care" when choosing a specialty much more than men. This is consistent with other work that has showed women to be more "relationship-oriented" than men when it comes to choosing a career.
Portraying an organization's culture through properties of a recruitment website
In this study of 278 undergraduate students, Kroustalis and Meade found that inclusion of pictures on a website that were intended to portray a certain organizational culture did so--but only for certain cultural characteristics. Specifically, pictures that implied a culture of either innovation or diversity had the intended effect--but pictures representing a team orientation did not. Interestingly, "employee testimonials" designed to emphasize these cultural aspects failed to do so for any of the three aspects studied. Finally, individuals who perceived a greater fit between themselves and the organization (in terms of the three cultural aspects) reported being more attracted to the organization.
Recruiting solutions for adverse impact: Race differences in organizational attraction
Last but definitely not least, Lyon and Newman gathered data from nearly 600 university students on their reactions to 40 hypothetical job postings...and came away with some very interesting results. For example:
- Conscientious individuals were more likely to apply to postings that explicitly stated a preference for conscientious applicants.
- Conscientious individuals were more likely to apply to postings that described the company as results-oriented.
- Black applicants with higher cognitive ability were more likely to respond to ads seeking conscientious individuals while White applicants with higher cognitive ability were less likely to do so.
- When a company was described as innovative, Black applicants high on conscientiousness were more likely to apply; this was not the case for White applicants.
Monday, May 07, 2007
Seem risky? Well, it doesn't appear to be hurting them. In fact, the company loses only 0.4% of revenue to theft, much less than is typical for big retailers (1.5%). Says Zimmer:
"I don't trust the U.S. justice system to get it right...I'd rather make my own decisions, and I believe in giving people a second chance."
This policy is particularly interesting given efforts by various jurisdictions to limit criminal history checks in employment screening as well as the EEOC's renewed focus on criminal history checks as part of its new E-RACE Initiative.
Friday, May 04, 2007
"Studies reveal that some employers make selection decisions based on names, arrest and conviction records, employment and personality tests, and credit scores, all of which may disparately impact people of color."
The citations for this sentence include a study on criminal records and one on names.
This triggers a couple questions for me...
1) What is the difference between an "employment" test and a personality test? Is this just redundancy or was it intentional?
2) More importantly, where is the evidence that personality tests have disparate impact? The research I'm familiar with indicates that differences between subgroups are relatively small.
As far as I know, cognitive ability tests are still the biggest 'offender' when it comes to racial differences in test scores (although this can be reduced by focusing on aspects of cognitive ability, such as short-term memory). Seems like this is where the EEOC would want to focus, along with background checks?
Wednesday, May 02, 2007
In an article in this month's Inc. Magazine, another similar type of service is described--Protuo.
Protuo's job matching system is probably the most detailed I've seen, which is good or bad depending upon how you look at it. Good in that the matches that are generated are hypothetically much more likely to be on target. Bad because given most people's attention span, I question how many people will actually fill out the entire profile.
The candidate/job profile has three broad categories:
* Personal skills and knowledge
* Social skills and training
* Business and analytical skills
Within each category is another set of fillable tabs. For example, under "Personal skills and knowledge" there are separate categories for Personal Skills (e.g., creativity, initiative) and Arts, Language, and Science (e.g., math, biology). Individuals can create their own web pages to highlight their qualifications (example here)--with full HTML editing capability and blog included, which are nice touches.
Aside from time considerations, there's another potential problem with drilling so deep. People aren't known for their ability to describe themselves particularly accurately. This is not helped by the fact that people are asked to describe their skill level in general terms that are not tailored to the category and for which no instructions are provided (e.g., Basic, Intermediate, Advanced, Expert). My guess is the match percentage will suffer for it.
On the other hand, these types of systems are still developing. Credit goes to Protuo for combining job search with personal pages. Expect much more in the way of tailored application and assessment processes that try to maximize speed and quality.
Tuesday, May 01, 2007
Last year, while roaming the halls of the 2006 EMA and SHRM conferences, I was surprised by how little basic knowledge test and assessment providers were prepared to offer attendees. They were at it again this year. While attending the SHRM Staffing Management Conference, vendors told me things like:
- Our test is 97% accurate at identifying the right person.
- Our test was validated by the EEOC. It only takes five minutes; you are going to love this.
I was so surprised by this comment that I was not quick enough to ask about their selection project at the EEOC. I regained my composure and asked how they establish job relevance. Their answer was, “Over a million people have taken this.”
When I asked how they help users adhere to the Uniform Guidelines on Employee Selection Procedures, they were clueless. When I asked about their various norm groups and which types of job criteria were predicted, they were clueless. When I asked if they would share the results of their validation analysis, they had no response.
Again, stuffing a questionnaire at me they implored me to “Take it, you’ll love it.” Explaining I was not interested, I wandered away to find another booth offering assessments.
Once again this may be a case of an ill-prepared service provider expecting to meet uneducated consumers in the marketplace. It is actually pretty scary. I asked myself: "What type of staffing professional would actually believe that a five-minute adjective checklist would be 97% accurate at identifying the right candidate?" Perhaps someone who was more hopeful than thoughtful.
After settling in with some small talk at my next target, I began my investigation. "How do you help users adhere to the Uniform Guidelines on Employee Selection Procedures?"
I got the most honest and candid answer. “No one has ever asked me that before. Let me tell you about our approach.” After a thoughtful exchange I left and drew a conclusion:
Prepared, but not proactive.
Why are testing and assessment providers NOT proactively taking the initiative to educate their consumers? I drew another conclusion:
Possibly because an educated consumer might not buy their product.
This thought took me back a few years to a time when I walked the SHRM Conference Exhibit Hall with a professional colleague, Bob Eichinger. He is an I/O psychologist, co-founder of Lominger and formerly on staff at PepsiCo and Pillsbury. We stopped to look at a booth with a long line of conference attendees waiting to take a five minute “wonder test.” Bob looked at me with a smile of disbelief, shook his head and said: “Shit sells.”
People and their behavioral tendencies are pretty complex. Predicting job-fit in five minutes is a claim that carries more than a bit of hyperbole. Caveat emptor--buyer beware.
The trend in compliance auditing is shifting from examining the content of historical candidate searches to a review of the job relevance of assessments. Auditors want to know about the job analysis process used to establish the content validity of the evaluation method, the date of the last validation analysis and to what degree the job may have changed since the last validation.
Organizations using an assessment that was “Validated by the EEOC” may be in for a big surprise if they are faced with an audit.
Take time to become an educated consumer of assessment products and services. You and your company will be glad you did.
Joseph P. Murphy
Shaker Consulting Group, Inc.