Saturday, October 31, 2009

New Job Simulations Report

The U.S. Merit Systems Protection Board (MSPB) just released a great, easily digestible report on job simulations.

The report covers several topics, including:

- Job simulations defined and advantages/disadvantages

- Types of job simulations (SJT, work samples, etc.) and concrete examples

- Benchmark data on satisfaction with candidate quality as well as how federal agencies currently use simulations (just don't look at GPA compared to job knowledge tests in Figure 2)

- Survey data on why simulations aren't used more often in the federal government (time and expertise, sadly, were the top reasons)

- A 5-step strategy for using job simulations

- References to support the use of simulations (and good selection in general)

A great, free resource for anyone wanting to learn more about one of the best selection mechanisms you can use. And particularly relevant as more and more organizations move to using training and experience (T&E) questionnaires as their first (quick but not particularly valid) hurdle.

Thursday, October 29, 2009

Personality tests: Situation matters


Is a personality test the right selection mechanism for your needs? In trying to answer that question, an important consideration is: Does the job allow for the expression of personality facets?

There's a concept in psychology called situation strength. It refers to how the "strength" of a situation impacts the display of personality via behavior, and it's something important to remember when using measures of personality to predict job performance.

A situation's "strength" refers to the environment under which personality aspects are displayed; think of it like a rulebook. If Job A is described perfectly in exacting detail with very little room for deviation ("Place container A over part B..."), does one's personality really matter when it comes to successfully performing the job? Or thought of another way, if the rulebook repeatedly emphasizes using an aspect of personality (e.g., extraversion), how will people without the ability to express that behavior consistently fare?

If you're a software programmer, it probably matters little in the grand scheme of things how extraverted you are; analytical ability is likely much more important. Similarly, if you work in a call center and follow a script, differences in openness to experience probably don't mean much; extraversion is likely more important.

But what about something like conscientiousness--could the strength of a situation impact the relationship between this aspect of personality and job behavior? That's the question Meyer and colleagues set out to answer, and their answer appears in a recent issue of the Journal of Organizational Behavior.

Their approach to this puzzle was to conduct a meta-analysis at the occupation (rather than job) level. What did they find? A few things:

1) Uncorrected correlations between conscientiousness and performance varied widely, ranging from .06 to .23.

2) Correlations appeared slightly stronger when overall performance, rather than task performance, was used as the criterion (not surprising given previous research).

3) Stronger correlations were found in "weak" occupations.

What does this mean? Well, for one it reinforces the fact that the answer to "Do personality tests work?" varies greatly depending on what the job is and how you measure performance. But perhaps more interestingly, it suggests that at an occupation level we can expect that for jobs that come with built-in rules regarding behavior ("strong" occupations), measures of personality aspects such as conscientiousness may not predict performance as well as they do for jobs with more flexibility ("weak" occupations).

So the next time you're thinking about using a personality inventory for selection purposes, consider: To what extent will incumbents be allowed to express their personality?

Monday, October 19, 2009

Is recruiting using SNS discriminatory?

I keep reading/hearing about how recruiting using social networking sites (SNS) opens employers up to discrimination lawsuits because of who uses the sites. For the most part, this just plain isn't true.

A recent Pew study is the latest to show that when it comes to using SNS like Facebook, MySpace, and LinkedIn, you really should have one primary demographic concern when it comes to ensuring a diverse candidate pool: age.

Not gender, at least not in the traditional sense. While four years ago SNS users tilted slightly male (55%), the balance has essentially flipped today (54% female).

Not race: there simply do not appear to be generalizable differences among racial groups when it comes to these sites (in fact, I've seen some data suggesting the user base on these sites is more diverse)--but things change, and this may vary by site, so keep an eye on this one.

But when it comes to age, SNS users are disproportionately younger than the overall Internet population. In the words of the Pew report, "[this] doesn't mean that more older adults aren't flocking to SNS--they are--but younger adults are ALSO flocking to the sites, so the overall representation of the age cohorts in the SNS user population has actually gotten younger."

One demographic difference I don't see a whole lot about: disability status. Are individuals with disabilities more/less likely to use SNS? I think that's an important question we need to address if we're truly trying to diversify our candidate pools.

Tuesday, October 13, 2009

Myths about assessment


Despite plenty of evidence and commentary otherwise, several myths persist about personnel assessment:

1) Some tests are "objective", others are "subjective." This is a myth reinforced by no less than the U.S. Supreme Court on a regular basis. The reality is that even the choice to use certain selection methods is a judgment. Sure, certain methods involve more ongoing judgment, but a multiple-choice test can be highly "subjective" and an interview highly "objective," depending on how each is developed and used.

2) Only certain selection methods are legally considered "tests" and therefore vulnerable to legal scrutiny. Wrong. Anything you do to narrow down your candidate pool is technically fair game. This includes how you advertise, screen, and interview.

3) Good hiring is an art more than a science. Actually we have decades of research showing the opposite. Human judgment is full of flaws. Combine this with the fact that most people think they are experts, and you have a perfect storm of personal overconfidence. The time and effort spent creating standardized instruments targeting competencies relevant for a particular position will be well spent.

4) Even the best assessments can predict only a small fraction of job performance. It's true it's a fraction, but it's not small. Research indicates that close to 40% of the variation in performance can be predicted with assessments. That's nothing to sneeze at when you consider all the other things that impact performance (organizational climate, quality of supervision, reward structures, team composition, role clarity, resources, mood, etc.).
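As a rough check on that figure: the share of performance variance explained is the square of the validity coefficient, so a well-constructed battery of assessments with an overall validity in the low .60s (the specific coefficient here is my illustrative assumption, not a number from this post) accounts for roughly 40% of the variation in performance:

```python
# Variance explained by an assessment is the square of its validity (r^2).
# The validity coefficient below (0.63) is an illustrative assumption.
validity = 0.63
variance_explained = validity ** 2
print(f"r = {validity} -> r^2 = {variance_explained:.2f} "
      f"({variance_explained:.0%} of performance variance)")
```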

5) Good assessment will solve all your people problems. Yes, this is sort of the flip side of the above. Consultants like to pretend that with the right assessment instrument every person you hire will be the most productive, friendly, team-oriented person ever. The reality is that performance depends not only on what someone brings to the job, but also on leadership, organizational norms...all that stuff I mentioned in #4.

Do we have all the answers when it comes to hiring the right person? Nope. Is there enough best practice out there so that any hiring supervisor should be able to get the expertise they need to do significantly better than chance? Yep.

Tuesday, October 06, 2009

Latest EEO Insight


EEO Insight is quickly becoming a great resource for anyone interested in issues related broadly to equal employment opportunity. And this isn't just affirmative action plans--it's relevant to anyone interested in recruitment and assessment.

In the latest issue (v1, #3), you'll read about:

- Alternatives to RIFs (reductions in force), such as wage freezes and job sharing, and their EEO implications

- Analyzing layoff decisions for statistical evidence of adverse impact

- Using multiple regression to detect race and gender differences in compensation

- Ricci in retrospect and lessons learned

- Reaching out to veterans and individuals with disabilities

- Results of the EEO best practices survey and (very good) recommendations
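The adverse-impact analysis mentioned in the second bullet is commonly done with the EEOC's four-fifths rule: compute each group's selection rate and compare it to the rate of the most-selected group; a ratio below 0.8 flags potential adverse impact. A minimal sketch with entirely hypothetical numbers:

```python
# Hypothetical applicant data: number applied and number selected per group.
applicants = {
    "group_a": {"applied": 200, "selected": 60},
    "group_b": {"applied": 100, "selected": 20},
}

# Selection rate for each group.
rates = {g: d["selected"] / d["applied"] for g, d in applicants.items()}

# Four-fifths (80%) rule: compare each group's rate to the highest rate.
highest = max(rates.values())
impact_ratios = {g: rate / highest for g, rate in rates.items()}

for group, ratio in impact_ratios.items():
    flag = "potential adverse impact" if ratio < 0.8 else "passes 4/5ths rule"
    print(f"{group}: selection rate {rates[group]:.0%}, "
          f"impact ratio {ratio:.2f} ({flag})")
```

Note this is only a screening heuristic; in litigation, statistical significance tests (e.g., the two-standard-deviation test) are typically examined as well.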
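The compensation analysis in the third bullet is typically an ordinary least squares regression of pay on legitimate predictors (experience, grade, etc.) plus a group indicator; a meaningful coefficient on the indicator after controls is the statistical evidence of interest. A bare-bones, pure-Python sketch using fabricated, noise-free numbers (all figures hypothetical):

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for a small linear system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def ols(X, y):
    """Ordinary least squares via the normal equations: (X'X) b = X'y."""
    k = len(X[0])
    XtX = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    Xty = [sum(X[r][i] * y[r] for r in range(len(X))) for i in range(k)]
    return solve(XtX, Xty)

# Columns: intercept, years of experience, female indicator (hypothetical data
# built from an exact linear rule, so OLS recovers the coefficients exactly).
X = [[1, 2, 0], [1, 4, 0], [1, 6, 0], [1, 3, 1], [1, 5, 1], [1, 7, 1]]
y = [50000 + 2000 * yrs - 3000 * fem for _, yrs, fem in X]

intercept, per_year, gender_gap = ols(X, y)
print(f"base pay ~{intercept:.0f}, +{per_year:.0f}/year, "
      f"gender coefficient {gender_gap:.0f}")
```

In practice you'd use a statistics package rather than hand-rolled linear algebra, but the structure of the analysis is the same: the controlled group coefficient, not the raw pay gap, is what the EEO analysis examines.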

By the way, if you're interested in EEO issues and you're not already reading OFCCP Blog Spot, I highly recommend starting.