By: Samantha Rhoads & Michael Cortes[1]

The COVID-19 pandemic has put a spotlight on statistical terms once familiar mainly to professionals such as statisticians and data scientists. Terms like "infection rate" and "flattening the curve" are now used as slogans and hashtags. This post offers a brief explanation of data analytics and the underlying math to help readers better understand statistics and data.

Inundated with information, mathematical models, and government projections, citizens can find it difficult to know which statistics to trust. Statistics helps make sense of data, but the accuracy of any insight is limited by the quality and accuracy of the underlying data. One data source may yield a COVID-19 fatality rate of 4.28%; another may yield a rate of 0.66%. There are two fundamental reasons for the difference:

  1. The numbers are based on different datasets that were collected using different methods.
  2. The statistical models applied to the data are based on different assumptions.
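The effect of the second point is easy to see in a short sketch. The numbers below are hypothetical, chosen only to show how the same death count produces very different fatality rates depending on which denominator a model assumes:

```python
# Hypothetical figures for illustration only -- not actual COVID-19 data.
deaths = 330

# Assumption 1: divide deaths by laboratory-confirmed cases only.
confirmed_cases = 7_700
case_fatality_rate = deaths / confirmed_cases

# Assumption 2: divide deaths by estimated total infections, including
# mild and untested cases (a modeling estimate, not a direct count).
estimated_infections = 50_000
infection_fatality_rate = deaths / estimated_infections

# The same outbreak, two defensible methods, two very different rates.
print(f"Confirmed-case denominator:      {case_fatality_rate:.2%}")
print(f"Estimated-infection denominator: {infection_fatality_rate:.2%}")
```

Neither rate is "wrong"; each answers a different question, which is why understanding the assumptions behind a statistic matters as much as the number itself.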

How do you know what to rely upon? This is where the human element comes into the picture. When presented with data and statistics, be skeptical. Apply common sense, trust your own thinking, and consult research from other sources to get a more complete picture. Understand the assumptions being made and how the data was collected.

People can mentally process trends in the data to take precautions and understand how the world is being shaped. Looking directly at the numbers is vital to understanding COVID-19's path, and applying good sense and judgment is critical.

While the data around COVID-19 is the critical issue now, the same statistical concepts apply to data sets in many other areas. For example, the same critical thinking being applied to COVID-19 trends should be applied to workplace trends and data. Work lives are more uncertain than ever, which makes defining patterns and understanding the nuances in the data critical.

A company's employee data should be harnessed carefully and explained. If a company wants to understand the diversity of its workforce, for example, it needs to be careful about how it computes those numbers: pay special attention to how the data is collected and what each variable means. It is easy to be led astray by computing the wrong numbers or interpreting the results incorrectly.
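A small, entirely hypothetical sketch shows how the definition of a single variable can change a headline diversity figure. Here the only difference between the two numbers is whether part-time workers are included in the denominator:

```python
# Hypothetical workforce records -- names, groups, and statuses are invented.
employees = [
    {"id": 1, "group": "underrepresented", "full_time": True},
    {"id": 2, "group": "majority",         "full_time": True},
    {"id": 3, "group": "underrepresented", "full_time": False},
    {"id": 4, "group": "majority",         "full_time": True},
]

def diversity_rate(people):
    """Share of the given population in the underrepresented group."""
    return sum(p["group"] == "underrepresented" for p in people) / len(people)

# Same data, two denominator definitions, two different "diversity rates".
all_workers = diversity_rate(employees)                                # 2/4
full_time_only = diversity_rate([p for p in employees if p["full_time"]])  # 1/3

print(f"All workers: {all_workers:.0%}; full-time only: {full_time_only:.0%}")
```

A report built on one definition is not comparable to a report built on the other, which is why the way variables are defined and collected has to be documented before the numbers are interpreted.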

While data can be used to make inferences, there are caveats. Numbers "don't lie," but they can be taken out of context, and the assumptions behind the calculations and the data are often not fully understood. Keep the reliability of the data in mind, and consider the circumstances surrounding a pattern before presenting it as something meaningful.


[1] Samantha Rhoads and Michael Cortes are Data Scientists in the Firm’s Data Analytics Group.


Eric J. Felsberg is a Principal in the Long Island, New York, office of Jackson Lewis P.C. and the National Director of JL Data Analytics Group.

As the National Director of JL Data Analytics Group, Mr. Felsberg leads a team of multi-disciplinary lawyers, statisticians, data scientists, and analysts with decades of experience managing the interplay of data analytics and the law. Under Mr. Felsberg's leadership, the Data Analytics Group applies proprietary algorithms and state-of-the-art modeling techniques to help employers evaluate risk and drive legal strategy. Among other services, the team:

  1. offers talent analytics for recruitment, workforce management, and equity and policy assessments through predictive modeling;
  2. partners with employers in the design of data-driven solutions that comply with applicable workplace law;
  3. manages and synthesizes large data sets from myriad sources into analyzable formats;
  4. provides compliance assessment and litigation support services, including damage calculations, risk assessments, and selection decision analyses; and
  5. offers strategic labor relations assistance, including determining the long-term costs of collective bargaining agreements, reviewing compliance with collectively bargained compensation plans, and assessing the efficacy of training programs.

The JL Data Analytics Group designs its service delivery models to maximize the protections afforded by the attorney-client and other privileges.

Mr. Felsberg also provides training and daily counsel to employers in various industries on day-to-day employment issues and the range of federal, state, and local affirmative action compliance obligations. He works closely with employers to prepare affirmative action plans for submission to the Office of Federal Contract Compliance Programs (OFCCP), analyzing and investigating personnel selection and compensation systems in the process. Mr. Felsberg has successfully represented employers during OFCCP compliance reviews, OFCCP individual complaint investigations, and matters involving OFCCP claims of class-based discrimination. He regularly evaluates and counsels employers regarding compensation systems, both proactively and in response to complaints and enforcement actions.

Mr. Felsberg is an accomplished and recognized speaker on issues of workplace analytics and affirmative action compliance.

While at Hofstra University School of Law, Mr. Felsberg served as the Editor-in-Chief of the Hofstra Labor & Employment Law Journal.