Friday, February 26, 2010

Book review: Strategy-Driven Talent Management

A thought-provoking collection of essays and ideas, but it won't solve all our problems.

The value of a book lies as much with the reader as it does with the content. A book about advanced programming does little good to the person who has problems turning a computer on. A collection of cooking recipes is largely useless to someone who exclusively uses a microwave.

The same is true about business and HR books. Depending upon who you are and where you're at in life, some books may help you, some may be beyond your reach. Such is the case with SIOP's latest entry into its Professional Practice series, Strategy-Driven Talent Management: A Leadership Imperative, edited by Bob Silzer and Ben Dowell.

The book (tome, actually, at nearly 900 pages) is full of thought-provoking pieces from a variety of authors, including some familiar faces such as John Boudreau and Allan Church. There are academics present, but the majority of authors are practitioners in private sector organizations, such as Aon, Ingersoll Rand, HP, Sara Lee, Merck, and Bristol-Myers Squibb.

The book is roughly broken up by major topic area, although the distinction can be hard to maintain. There are chapters on recruitment, executive onboarding, engagement, measurement, and global issues. There's even a 40-page annotated bibliography. But the editors do an admirable job of keeping the topics all related to the broad field of talent management (TM), which they define as, "an integrated set of processes, programs, and cultural norms in an organization designed and implemented to attract, develop, deploy, and retain talent to achieve strategic objectives and meet future business needs" (p. 18).

The book is described as a "comprehensive [collection of] state-of-the-art ideas, best practices, and guidance." It shines on the first two but fails on the last, although not for lack of trying. The problem is that the book is so long, and so full of ideas and case studies, that it's very easy to get lost and come away without any clear guidance based on a consensus of the authors. To some extent this is endemic to any collection of works by separate authors, but it's clearly a collection of "what you might do" rather than a solid prescription for "how to," although some authors do a better job than others.

Another problem is that many authors seem to presume that current TM practices are sub-optimized because they aren't linked to business strategy and results. This may be true if the process is based on non-validated assumptions, but as long as there is a link between job success and specific practices, we're already there. We just haven't made a particularly good link between job success and organizational success, which may explain the attraction to concepts like competencies (mentioned many times in the book).

But my main problem, and this goes for the book as well as the field, is that it treats the concept of talent management as a logical process to be managed. Somewhere in the transition from HR to TM, we lost the H--human. Talent management (and HR) is messy because it involves people. It's political. It changes every day. And you're dealing with emotions, not lines of code. The real challenge--which is discussed but to my mind not driven home--is how to get the talent mindset into the organizational DNA.

There is value to thinking broadly and philosophically about the topic. It helps us plan. But what people really need are concrete suggestions for establishing a self-sustaining high-performance system. In order to do this, we must address the fundamentals (the basic needs of Maslow's hierarchy, if you will), such as:

- HR must learn "the business" and stay close to their customers
- Supervisors must be selected and trained with their talent management role at the forefront
- Success in HR must be defined and measured. It must be communicated, understood, and valued
- Sustained attention to HR success and significant resources must be expended by both HR and line managers

The book does a passable job of presenting these, but you may have to dig for them. The bigger problem is that there seems to be an assumption that what keeps organizations from having a top-notch TM system is a lack of understanding, either of the organizational strategy or best practices in TM, rather than the very real daily troubles that organizations experience, such as:

- Supervisors who hire people they know/like rather than the most qualified person
- People placed into HR with little or no background, interest, or passion for it
- Insufficient resources devoted to TM/HR
- HR managers who are just that--managers--rather than real HR leaders (Avedon and Scholes present a great assessment in Chapter 2 that helps separate these)

Until organizations have these types of "minor"--but real--flaws ironed out, all the charts and good intentions in the world will have very little impact.

Finally, I was also disappointed that there wasn't more in here about evidence-based TM and HR (which may say more about the field than the authors/editors, who acknowledge this lack in Chapter 22). The field desperately needs more research to tie the hard science of assessment with the more anecdotal/consultant practices such as recruiting, retention, and performance management. This will require significantly more research using methods beyond surveys in order to show what works and what doesn't. There are some ties to good research in here, but the hole is significant.

To summarize, the book contains a lot to like, particularly for individuals already schooled in this area looking to optimize their shop, or for graduate students seeking to understand the big picture. But for most HR practitioners (and, I would expect, executives), this book is akin to a collection of recipes for advanced Italian cooking--fabulous for those used to making their own pasta, but beyond the reach of those struggling to make their own sauce.

Thursday, February 25, 2010

Webinar on 21st Century Assessment

Went to a pretty darn good webinar yesterday put on by HCI and featuring Ken Lahti (PreVisor) and Charles Handler (Rocket-Hire). The topic was 21st century assessment.

Some of the topics covered included:

- increased functionality and usability of testing platforms

- increased sophistication of security methods

- off-the-shelf tests and "I/O psychologists in a box"

- integrating assessment with your overall talent strategy

And my two favorites:

- advanced simulations (such as those using video game technology)

- candidate data that follows them

The webinar is going to be re-broadcast several times today and tomorrow; if you have a chance, check it out. You can also see a copy of the slides for free if you're an HCI member (membership is free).

Free, short, and full of information--that's my kind of training.

Saturday, February 20, 2010

Recruitment v. Assessment, Round 1: Fight!

I'm sure some of you are avid readers of ERE (the Electronic Recruiting Exchange), but for those of you who aren't (and don't receive my shared items), there's a lively discussion going on over there regarding the practical value of recruitment versus assessment practices.

It started with Wendell Williams' first post on how to identify a bad test (the second part is also worth reading). The comments begin relatively benignly, debating the strength of various predictors of performance (e.g., P-O fit versus behavioral interviews), but the thread turns into, let's say, a lively debate that includes a discussion of the Gallup 12, the limitations of assessments, the Uniform Guidelines, and a lot more. The most heated exchange occurs between Wendell and Lou Adler, where accusations and sarcasm fly.

Speaking of Lou, he continues to advocate his perspective with his next post on whether increasing interview accuracy increases quality of hire (yes, he's suggesting that's an open question). While the comments following are fewer in number, the debate continues regarding the value of assessment and the evidence used to support it (e.g., Schmidt & Hunter's 1998 piece).

Who said HR is boring?

Saturday, February 13, 2010

Latest IJSA: Emotional intelligence, multiple-choice formats, and lots more

The March 2010 issue of the International Journal of Selection and Assessment (IJSA) is out, and the research covers a wide variety of recruitment and assessment topics as well as being truly international:

- Unproctored internet-based testing (UIT) response distortion may be less than we fear (sample included cognitive and personality measures)

- What factors are most important to organizations when choosing a test? This study suggests applicant reactions, cost, and diffusion of the test type in the field.

- Personality (esp. core self-evaluation) is related to the type of work preferred, and hence P-O fit

- Career site features may differentially attract men and women

- Corporate images do matter when it comes to organizational attractiveness

- Who uses job-search websites and how to improve them (the sites, not the people)

- Support for performance-based (as opposed to self-report) measures of emotional intelligence

- Work samples, interviews, and ability tests are perceived best by employees (why? because they work, say the participants)

...and last but definitely not least:

- A "2 of 5" multiple-choice format seems superior to the traditional "1 of 6" (you just have to make sure you can score them that way!)
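For readers wondering what "scoring them that way" might look like, here's a minimal sketch of one plausible scoring rule for a pick-2-of-5 item. This is purely illustrative (the study may use a different rule): full credit for selecting both keyed options, optional partial credit for exactly one.

```python
def score_2of5(response, key, partial=0.5):
    """Score a 'pick 2 of 5' item (illustrative rule only).

    response -- the set of options the examinee selected
    key      -- the set of two keyed (correct) options
    partial  -- credit awarded for matching exactly one keyed option
    """
    hits = len(set(response) & set(key))
    if hits == 2:
        return 1.0   # both keyed options selected
    if hits == 1:
        return partial
    return 0.0
```

The point the authors make stands regardless of the exact rule: your scoring platform has to support non-dichotomous, multi-select items before you can use the format.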

Sunday, February 07, 2010

Feds' new jobs site is Googlish

The U.S. Government has revamped its jobs page, and in the process shown everyone else how it's done.

Take a look at their old site. Not horrible, but cluttered with lots of features that distracted from the main reason people visit the site: to look for a job.

This picture actually doesn't do it (in)justice; there was additional content below the bar.

Now look at their new website:

This new website is what I would call "Googlish": simple, lots of white space, no scrolling required, and a single search box. The design focuses less on being pretty and more on being functional. If you're interested in learning more about careers, or if you'd like information related to specific groups, like veterans or those with disabilities, it's still there. And there's even more functionality up top in the form of drop-down menus.

Job seekers don't need a magazine ad. They need to quickly and easily find information. And this new website fits the bill.

How does yours compare?

Thursday, February 04, 2010

Lessons from NYC Fire case - part 2

Part 2 of 2

Last time I discussed five important lessons we can take away from recent rulings in the Vulcan v. City of New York case. In this post I'll review the remaining lessons and also discuss the relief order.


6) The city failed to provide sufficient evidence that the exam(s) tested for a sufficient number of the critical KSAs. They also failed to explain why they chose not to measure several KSAs identified as critical.

Lesson: the courts do not require employers to measure every single critical KSA. But there is an expectation that employers attempt to measure a sufficient number that represent a significant portion of the job requirements. In this case, those included non-cognitive abilities such as resistance to stress, teamwork, and conscientiousness, which were not measured.

7) The city failed to adequately consider how to measure a significant number of essential KSAs. While some of their concerns were valid (e.g., structured interviews for all applicants would be an operational nightmare), there are many different forms of testing that should have been considered, including situational judgment tests (SJTs) and biodata, which can be used to measure non-cognitive components.

Lesson: triers of fact expect employers to be up on the various assessment methods available and be able to explain why they chose not to use certain ones. This includes tests that are relatively easy to develop (e.g., SJTs) as well as ones that require substantial resources and statistical expertise (e.g., biodata).

8) The city failed to conduct a reading level analysis on the exams to ensure that it was not "pointlessly high." The plaintiff introduced evidence suggesting the reading level was above 12th grade; in addition, it appeared to exceed the reading level of materials at the academy.

Lesson: never forget that every assessment method is in some sense measuring additional KSAs beyond those you intend. For written exams, reading comprehension is always a requirement (barring accommodation). It's quite easy to conduct a reading level analysis (MS Word has it built in) to ensure that the level is reasonable and matches other job-related material.
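For anyone without Word handy, a readability statistic is easy to approximate. Below is a rough Python sketch of the Flesch-Kincaid grade-level formula using a naive vowel-group syllable counter; real tools use pronunciation dictionaries and more careful rules, so treat the output as a ballpark figure, not a defensible measurement.

```python
import re

def syllables(word):
    # Naive syllable estimate: count runs of consecutive vowels.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text):
    # Flesch-Kincaid grade level:
    #   0.39 * (words/sentence) + 11.8 * (syllables/word) - 15.59
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syl = sum(syllables(w) for w in words)
    return 0.39 * (len(words) / sentences) + 11.8 * (syl / len(words)) - 15.59
```

Run it on the exam items and on the academy materials; if the exam scores several grade levels higher, you have exactly the "pointlessly high" problem the court flagged.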

9) The city failed to show that the cutoff scores (pass points) established for the exams were based on adequate rationale, namely "the necessary qualifications for the job of entry-level firefighter." Instead, the cutoff scores were based on operational need (the number of job openings expected). This is particularly important in multiple-hurdle selection processes such as in this case, where a failure on one exam component precludes an applicant from participating in the rest of the (potentially compensatory) assessment process.

Lesson: ultimately applicants have to pass the test(s) to be considered for employment. Cutoff scores should be established using the expertise of both SMEs and test developers and should be based on the minimum competency levels required upon entry to the job. At a minimum (and I would not rely solely upon this), the scores should be analyzed to identify any logical "break-points."
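One simple way to operationalize that last suggestion is to scan the sorted score distribution for unusually large gaps between adjacent distinct scores. Here's a quick sketch; the minimum-gap threshold is arbitrary and, as noted above, this should only supplement SME-based standard setting, never replace it.

```python
def break_points(scores, min_gap=2):
    """Flag candidate 'break-points': gaps of at least min_gap points
    between adjacent distinct scores in the distribution."""
    distinct = sorted(set(scores))
    return [(lo, hi) for lo, hi in zip(distinct, distinct[1:])
            if hi - lo >= min_gap]
```

A gap in the distribution is only a candidate cutoff; it still needs a content-based rationale tied to minimum competency.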


After ruling for the plaintiffs on both the adverse impact and disparate treatment claims, the judge issued a relief order on 1/10/10. In it, he imposes several things, including the following:

1) The city must develop a new testing procedure for entry-level firefighter in conjunction with the relevant parties. Following the development of the test, there will be a hearing to determine if this test should be used rather than the current test (developed in 2007 and not at issue in this litigation).

2) The court shall develop a process by which the approximately 7,400 applicants covered by this case can file a claim for monetary relief.

3) The city will identify 293 black candidates on the eligibility list and offer them priority hiring. (No quotas are being imposed, although the judge leaves this possibility open)

4) Retroactive seniority for those hired.

In addition, several other issues are up for debate, including the appointment of a special master or monitor, standards that will be relied upon in constructing the new exam, and the need for additional relief.


So what did we learn from all this? If you follow--fairly closely--best practices when developing and administering exams, you will be on solid ground defending them. If you don't, and your exam has a discriminatory effect, you may be called on it--and it's not a pleasant process. I'll leave you with this quote from the January ruling on disparate treatment:

"The history of the City's efforts to remedy its discriminatory firefighter hiring policies can be summarized as follows: 34 years of intransigence and deliberate indifference, bookended by identical judicial declarations that the City's hiring policies are illegal."