Why do so many managers continue to accept mediocre, second-rate results?
Hundreds of research studies have quantified the difference between having an “A” player versus a “C” player in a job, any job. Every one of them concludes that the best, most productive employees deliver anywhere from 20 percent to more than 1,000 percent greater return to the bottom line than average performers.
While I’ve never met anyone who disagrees with this data, most managers and organizations continue to keep “C” players on the payroll. This leads me to believe these managers:
- Are focused on the employee, not the expected or required results.
- Do not face any consequence for hiring or retaining “C” players.
- Are rewarded for low turnover rather than for retaining the right people.
- Work for employers that say they want only “A” players but are not committed to hiring and retaining them.
- Don’t know what “A” players look like.
- Don’t know how to recruit “A” players because most of them are not looking for jobs.
- Don’t know how “A” players think and make decisions (which is vastly different from employees who are looking for “just a job, any job”).
- Use a screening process designed to screen people out rather than ensure the right people get in.
- Don’t provide “A” managers to supervise “A” employees.
- Have an HR department that is reactive instead of proactive in helping managers build “A” teams, and that is risk-averse when it comes to letting people go.
This was originally published on Mel Kleiman’s Humetrics blog.