Most of us encounter analytics in our everyday lives and give it little thought. Have you ever applied for a credit card or loan and been asked to provide a list of your outstanding financial obligations? Or perhaps you applied for health insurance and were required to provide a summary of your health history. Providers request this information to help determine whether you are creditworthy or insurable, based on analysis of others with similar histories. Welcome to the world of analytics.

But what about the use of analytics to manage the workplace? Imagine being able to predict which of several hundred job applicants are most likely to be successful on the job. Or being able to predict which employees are most likely to leave the organization in the future, or worse, file a charge. Analytics can be used to assess employee engagement, and it can even be used to optimize employee development initiatives.

Leveraging workplace analytics in this way may help companies streamline processes, saving time and money. But are there risks? Several reports from agencies such as the Federal Trade Commission and the White House have warned of the risk of making biased decisions based on analytics. Last fall, the Equal Employment Opportunity Commission even held a public meeting on the use of big data in employment, during which it examined the risks and benefits of big data analytics in the workplace.

Despite these risks, properly designed analytics platforms can yield a host of benefits and may significantly lessen the likelihood of liability. Of course, algorithms used by employers to make decisions could be tainted by bias; for example, race and gender could be incorporated into an algorithm used by company officials to determine who should be hired or promoted. Even if race and gender are not explicitly included, an algorithm could result in the unintentional disproportionate exclusion of a particular race or gender group, that is, disparate impact. But these concerns also exist absent the use of algorithms. Humans, by their very nature, bring unintentional biases reflecting their life experiences and intuition to everyday decisions. Humans also may bring inconsistency to the decision-making process. Properly designed analytics platforms based on neutral data science are highly consistent and efficient.

Indeed, algorithms should not be designed to explicitly incorporate protected characteristics such as race or gender, and employers must monitor their analytics use for evidence of disparate impact. Even the most effective of these platforms provide guidance only; employers should never rely on them alone when making decisions.
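One common first-pass screen for disparate impact is the "four-fifths rule" from the EEOC's Uniform Guidelines on Employee Selection Procedures: a selection rate for any group that is less than 80% of the rate for the most-favored group may be evidence of adverse impact. The sketch below is illustrative only (it is not the author's platform, and the group names and figures are hypothetical); it computes selection rates per group and flags ratios that fall below the four-fifths threshold.

```python
# Illustrative sketch of the EEOC four-fifths rule screen.
# Input: for each group, (number selected, number of applicants).

def selection_rates(outcomes):
    """Return each group's selection rate (selected / applicants)."""
    return {g: sel / total for g, (sel, total) in outcomes.items()}

def four_fifths_check(outcomes, threshold=0.8):
    """Return {group: (impact_ratio, passes)} where impact_ratio is the
    group's rate divided by the most-favored group's rate."""
    rates = selection_rates(outcomes)
    top = max(rates.values())
    return {g: (r / top, r / top >= threshold) for g, r in rates.items()}

# Hypothetical applicant data, for illustration only.
data = {"Group A": (48, 80), "Group B": (24, 60)}
print(four_fifths_check(data))
# Group A's rate is 0.60 and Group B's is 0.40, so Group B's impact
# ratio is about 0.67, below the 0.8 threshold and worth investigating.
```

A failed four-fifths check is not proof of discrimination; it is a trigger for closer statistical and legal review, which is why this kind of output should inform, not replace, human judgment.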

Eric J. Felsberg is a Principal in the Long Island, New York, office of Jackson Lewis P.C. and the National Director of JL Data Analytics Group.

As the National Director of JL Data Analytics Group, Mr. Felsberg leads a team of multi-disciplinary lawyers, statisticians, data scientists, and analysts with decades of experience managing the interplay of data analytics and the law. Under Mr. Felsberg’s leadership, the Data Analytics Group applies proprietary algorithms and state-of-the-art modeling techniques to help employers evaluate risk and drive legal strategy. Among other services, the team offers talent analytics for recruitment, workforce management, and equity and policy assessments through predictive modeling; partners with employers to design data-driven solutions that comply with applicable workplace law; manages and synthesizes large data sets from myriad sources into analyzable formats; provides compliance assessment and litigation support services, including damage calculations, risk assessments, and selection decision analyses; and offers strategic labor relations assistance, including determining the long-term costs of collective bargaining agreements, reviewing compliance with collectively bargained compensation plans, and assessing the efficacy of training programs. The JL Data Analytics Group designs its service delivery models to maximize the protections afforded by the attorney-client and other privileges.

Mr. Felsberg also provides training and daily counsel to employers in various industries on day-to-day employment issues and the range of federal, state, and local affirmative action compliance obligations. Mr. Felsberg works closely with employers to prepare affirmative action plans for submission to the Office of Federal Contract Compliance Programs (OFCCP), analyzing and investigating personnel selection and compensation systems in the process. Mr. Felsberg has successfully represented employers during OFCCP compliance reviews, OFCCP individual complaint investigations, and matters involving OFCCP claims of class-based discrimination. He regularly evaluates and counsels employers regarding compensation systems, both proactively and in response to complaints and enforcement actions.

Mr. Felsberg is an accomplished and recognized speaker on issues of workplace analytics and affirmative action compliance.

While at Hofstra University School of Law, Mr. Felsberg served as the Editor-in-Chief of the Hofstra Labor & Employment Law Journal.