In the Rush to Big Data, Don’t Ignore the Legal Risks

Dear Littler: We are revamping our online job application. I asked our HR director if we should eliminate the question asking about an applicant’s hobbies. Not only does she think we should keep the question, but she says we need to gather more “big data” about our candidates. What the heck is “big data”? Are there any risks with relying on “extra” information about candidates?

Luddite in Lincoln

Dear Luddite in Lincoln,

“Big data” generally refers to the massive amount of data about individuals that organizations have the emerging ability to gather, analyze, and evaluate to help inform their decision-making. We realize that concept probably sounds like gibberish, but don’t panic! Let’s break it down.

Wrapping your head around big data

These days, nearly every individual creates a significant digital footprint in the course of their daily lives. This digital footprint is created by our interactions with various aspects of the digital world, including logging into and using work-related systems, playing games, buying things, selecting apps, using search engines to find information, participating on social networks, and so on. Depending on the particular website or app, and the individual’s settings, much of this activity is monitored, in part so that the various operations can work optimally. For example, a search engine might remember your last few searches, on the theory that you may return to them and will find them faster if it remembers your terms.

Of course, much of this information is used for commercial purposes. A search engine knows your interests and curiosities based on your browsing activity. It can sell advertising space to businesses that can target you specifically based on the information that the search engine has gleaned about you. That’s why you suddenly start seeing ads for furniture stores after you looked online for new couches. Similarly, a social network can access information about searches you conducted outside the app and then show you ads based on those browsing habits, after you log in to the social network. Result? You guessed it, more furniture ads!

But in addition to this type of online activity data, there is much more information available about individuals. Much of this information is hiding in plain sight. Moreover, it may not intuitively seem relevant to the question at hand. When someone applies for a loan, for example, the lender might consider facts other than a credit score, including tidbits such as whether the potential borrower wrote in all capital letters on the application or how long he or she reviewed the terms and conditions of the credit requested. All of these nuggets constitute Big Data. (And no, we don’t really know why everyone capitalizes “Big Data,” but we are going to do it anyway.)

Analytics: The key that unlocks the secrets in the data

So how would a lender know to care about capital letters, you ask? This is where things get interesting.

An industry has developed around Big Data to harness and analyze all of this random information about people, a process known generally as “data analytics.” It can be used to evaluate the status of an existing project or approach. Using sophisticated models and algorithms, organizations can also use data analytics to review data and predict certain outcomes. In the human resources context, this type of “predictive analytics” is often known as “people analytics.”

An example helps: A computer software company wanted to recruit software engineers. In addition to the application materials provided by actual applicants, the company used algorithms, designed for its needs, to scour the internet for programmers and gather more information about them generally. That additional information included:

  • Identifying useful responses posted by particular coders in the questions-and-answers feature of a popular coding forum; and
  • Reviewing the words used by coders on social media, with the understanding that certain phrases distinguished expert programmers from lesser candidates.

The company further knew that a coder’s interest in a specific Japanese manga (comic book) site somehow correlated to his or her abilities as a software engineer. Analyzing this data — combed from publicly available sources — allowed the company to identify proficient candidates who had not yet applied for its software engineering positions, but who should be the target of its recruiting efforts.

While the example focuses on candidate recruiting, employers can utilize people analytics throughout the employment life cycle. Analytics can inform hiring, performance evaluations, and advancement planning and decisions. A bank in Asia, for example, conducted an in-depth analysis of its workforce and learned that its highest-potential performers did not come from top academic institutions, as widely believed, but from a variety of schools, often with prior experience in certain “feeder” positions. As a further example, employers can use Big Data to combat attrition. By sifting through calendar entries and email headers, people analytics can indicate which employees are likely to leave within the next year. These types of insights can guide decisions on how to keep such at-risk employees, including, for example, whether restructuring a team or offering training might improve retention.
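To make the attrition example concrete, here is a minimal Python sketch of how a scoring model of this kind might work. The feature names, weights, and threshold are entirely hypothetical — invented for illustration, not drawn from any real people-analytics product:

```python
# Hypothetical attrition-risk scoring sketch. The signals, weights,
# and threshold below are invented for illustration only.

# Each feature is a calendar- or email-derived signal, normalized to 0-1.
WEIGHTS = {
    "meetings_declined_pct": 0.4,   # share of meetings declined last quarter
    "after_hours_email_drop": 0.35, # decline in after-hours email activity
    "one_on_ones_skipped": 0.25,    # share of skipped 1:1s with manager
}

def attrition_risk(features: dict) -> float:
    """Weighted sum of the signals; higher means greater flight risk."""
    return sum(WEIGHTS[name] * features.get(name, 0.0) for name in WEIGHTS)

def flag_at_risk(employees: dict, threshold: float = 0.5) -> list:
    """Return the ids of employees whose risk score exceeds the threshold."""
    return [emp for emp, feats in employees.items()
            if attrition_risk(feats) > threshold]

# Two hypothetical employees with very different signal profiles.
employees = {
    "E001": {"meetings_declined_pct": 0.8, "after_hours_email_drop": 0.7,
             "one_on_ones_skipped": 0.9},
    "E002": {"meetings_declined_pct": 0.1, "after_hours_email_drop": 0.2,
             "one_on_ones_skipped": 0.0},
}

print(flag_at_risk(employees))  # → ['E001']
```

Real systems use far richer statistical models, but the shape is the same: observable workplace signals go in, a risk score comes out, and humans then decide what to do about the flagged employees.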

Risks and rewards of using big data

Without a doubt, Big Data can be a powerful tool for employers. In fact, it is becoming more common for employers — especially larger employers with sufficient needs and data points — to use people analytics in making employment and strategic decisions. People analytics involves organizations combining and analyzing information — both internally held (e.g., performance reviews, sales figures, test and survey results, whether someone is usually late to work, how someone’s efforts affect a team, where long-term employees studied) and externally obtained (e.g., social media information and public postings) — to accurately predict outcomes and increase the efficiency of their decision-making.

As your HR director probably heard, there are benefits to using people analytics. As we summarized above, Big Data can help employers make better-informed decisions. Those decisions, in turn, can increase corporate and individual performance, as well as transparency, accountability, and morale. Moreover, Big Data methods can eliminate some of the pitfalls associated with human decision-making, such as overt or unconscious bias.

It is possible for Big Data to remove such biases from an employer’s decisions because it focuses on objective data and on proven correlations. In the software engineer example above, it did not matter why great programmers liked a specific Japanese manga website. Nor did it matter what the ideal candidates looked like, where they went to school, where they were born, and so on. The correlation (even if inexplicable) between a coder’s proficiency and the manga website was all that mattered. The algorithms take these neutral pieces of information and factor them into the recruiting process, elevating objective criteria over subjective impressions. Data analytics thus may be used to improve equal employment opportunity.

As you suspected, however, there are risks associated with the use of Big Data in the employment context. The algorithms themselves may unintentionally perpetuate discrimination, depending on the specific data used to train the algorithms, how specific data points are weighted, and how the algorithms operate. That is, some element embedded in the algorithm could, in theory, produce a discriminatory effect on a group of people. For example, an algorithm might rely too heavily on an individual’s participation in social media, which varies based on a number of protected characteristics, including age. Or, “[h]ypothetically, an algorithm that relies on a correlation between fast programmers and those that drive fast cars will reject applicants who are too poor to own a car, who live within walking distance of their usual haunts, or who have a disability that prevents them from driving.” Additionally, demographic or sensitive information, such as genetic data, could be included in the analytics.

These concerns have not gone unnoticed. In 2016, the Equal Employment Opportunity Commission (EEOC) empaneled a group of experts to discuss the use of data analytics in the workplace. Based on the panel’s testimony, the consensus is that Big Data is here to stay but must be wielded with caution. Similarly, the Federal Trade Commission issued a report highlighting the strengths and weaknesses of using Big Data in numerous fields, such as marketing and human resources. Neither agency has yet indicated that it will issue formal guidance or regulations on the use of Big Data, but employers should be aware that this burgeoning topic is on the radar of regulatory agencies. And if algorithms unwittingly produce a disproportionately negative impact on certain subsets of people (e.g., women, racial or other minorities, or individuals with disabilities), an employer could be subject to scrutiny and litigation under federal, state, or local civil rights laws.
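One common first check for the kind of disproportionate impact the agencies worry about is the “four-fifths rule” from the EEOC’s Uniform Guidelines on Employee Selection Procedures: a selection rate for any group that is less than 80% of the rate for the group with the highest rate is generally regarded as evidence of adverse impact. Here is a minimal Python sketch of that check — the group labels and applicant counts are hypothetical:

```python
# Four-fifths (80%) rule check from the EEOC Uniform Guidelines on
# Employee Selection Procedures: a group's selection rate below 80%
# of the highest group's rate is generally treated as evidence of
# adverse impact. The applicant counts below are hypothetical.

def selection_rates(outcomes: dict) -> dict:
    """outcomes maps group -> (number selected, total applicants)."""
    return {g: selected / total for g, (selected, total) in outcomes.items()}

def adverse_impact_groups(outcomes: dict, threshold: float = 0.8) -> list:
    """Return groups whose rate falls below threshold * the best rate."""
    rates = selection_rates(outcomes)
    best = max(rates.values())
    return [g for g, rate in rates.items() if rate < threshold * best]

outcomes = {
    "group_a": (48, 80),  # 60% selection rate
    "group_b": (12, 30),  # 40% selection rate; 0.40 < 0.8 * 0.60 = 0.48
}

print(adverse_impact_groups(outcomes))  # → ['group_b']
```

A ratio below four-fifths does not by itself prove illegal discrimination, and statistically significant disparities can matter even above that line — but a simple audit like this is the sort of validation question employers should be asking of any analytics vendor.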

Some practical suggestions

Are you still with us, Dear Luddite? If you’ve made it through the technical exposition above, we have a few practical suggestions for you and your employer. We are also happy to help with more specific questions, when the time comes.

On the whole, employers should strive to become educated, responsible consumers and users of data analytics products and services. Before implementing people analytics, employers should carefully consider and specifically identify what questions they hope to answer and what problems they wish to solve using this type of technology. The more specifically the question or problem is understood and articulated, the more likely analytics can provide useful insight. Bear in mind that, at some point, human decision makers will need to be involved, especially because data scientists may not be familiar with equal employment requirements and other legal duties. Employers might want to evaluate legal risks prior to deploying these technologies, ideally in an attorney-client privileged setting.

In choosing a vendor and an analytics approach, employers should not simply follow their competitors. It’s important for organizations to think independently and to ask a lot of questions about the process, its purpose, and its reliability. Employers should ask, for example, what safeguards will be used to avoid discriminatory output and how the results can be validated. Employers should be vigilant, even if this requires uncomfortable pestering for detail, because, ultimately, the employer is responsible for ensuring its workplace is free from illegal discrimination.

This was originally published on Littler Mendelson’s website. © 2018 Littler Mendelson. All Rights Reserved. Littler®, Employment & Labor Law Solutions Worldwide® and ASAP® are registered trademarks of Littler Mendelson, P.C.

Aaron Crews

As Littler Mendelson's first Chief Data Analytics Officer, Aaron D. Crews brings an extensive background focused on the intersection of technology, business and the law.

Aaron manages the operations of Littler’s data analytics practice and leads the development and implementation of technology-based solutions that provide value to the firm and its clients. He also uses data analytics tools to facilitate process improvements and efficiencies, and assist in protocols for the secure handling of data.

He also has extensive experience conducting pay equity audits for all types of employers, from start-ups to Fortune 50 companies, and helped develop the Littler Pay Equity Assessment™. He counsels employers on a broad range of state and federal issues related to pay equity, from compliance to updating policies and job descriptions to training managers and recruiters.

Aaron previously served as one of the firm's eDiscovery counsel. In that capacity, he also assisted with developing strategies for efficient and effective data harvesting, review and production, and implementing cost-shifting and cost reduction strategies.

Marko J. Mrkonich

Marko J. Mrkonich is a shareholder in the Minneapolis office of Littler Mendelson. He focuses his practice on discrimination and other employment litigation, client counseling, and traditional labor law issues. He works primarily with clients in the manufacturing and retail industries.

Marko represents clients in federal and state jury and bench trials, in appellate courts and in administrative hearings before federal and state agencies, including the Equal Employment Opportunity Commission, the National Labor Relations Board, the Department of Labor and the Minnesota Department of Human Rights. He also handles labor arbitrations and has extensive experience in mediation and other forms of alternative dispute resolution.

Previously Littler Mendelson's president and managing director, Marko was also the founding member of the Minneapolis office where he formerly held the position of office managing shareholder.