I’m Legal Counsel To Our Diversity Office; What Do I Need To Know About Data Analytics?

As if you don’t have enough on your plate already. You just heard in the lunchroom that your Chief Diversity Officer is making a presentation to a trade group on the company’s workforce demographics. Should you care?

An often-overlooked area of data analytics is in a company’s diversity office. I say “overlooked” because some diversity offices do not feel the need to seek the legal department’s advice when they embark on gathering data about, for example, the demographics of the company. Whether you have been asked for advice, or have decided to seek out the diversity office to see what data it is compiling, you should be aware of the types of analytics typically prepared — and the legal implications. Let’s look at a few examples.

Workforce Demographics: Most diversity offices track workforce demographics. These analyses typically include all job groups, like EEO-1 job categories, and may be as granular as job titles. Most include race/ethnicity and gender, some include age, and others may include such factors as veteran status, sexual orientation, and disability, in part depending on the availability and reliability of the data in the HRIS. The analyses may be snapshots for a specific date, or time-trends for comparison purposes: “Are we improving year-to-year in our representation of Hispanic females at the mid-level manager position?” Either way, they typically result in pie charts, graphs, and other visual aids clearly displaying percentages of representation by race/ethnicity, gender, etc., sometimes with red-shaded warning signs for “high-risk” (or “high-opportunity”) areas. Are you shuddering yet?

Source: Walmart Road to Inclusion 2017 Report
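To make the snapshot idea concrete, here is a minimal sketch of the kind of representation calculation a diversity office might run before building its pie charts. Every record below is invented purely for illustration; a real analysis would pull from the HRIS:

```python
# Hypothetical workforce demographics snapshot: percentage representation
# by gender within each job group. All rows are invented for illustration.
from collections import Counter

# (job_group, gender) rows
workforce = [
    ("manager", "F"), ("manager", "M"), ("manager", "M"), ("manager", "F"),
    ("manager", "M"), ("analyst", "F"), ("analyst", "F"), ("analyst", "M"),
]

by_group = Counter(group for group, _ in workforce)   # headcount per job group
by_group_gender = Counter(workforce)                  # headcount per (group, gender)

# Representation percentage of each gender within its job group.
snapshot = {
    (group, gender): round(100 * n / by_group[group], 1)
    for (group, gender), n in by_group_gender.items()
}
print(snapshot)  # e.g. ("manager", "F") -> 40.0
```

Run the same calculation against successive year-end snapshots and you have the time-trend comparison described above.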

Benchmarking: Diversity offices want to be able to answer their CEO’s question, “So how are we doing compared with our competitors?” Benchmarking is a relatively easy task so long as comparative data is available. That, however, is the rub. Although some do, most companies do not publish their workforce demographic data. Some periodicals, like DiversityInc, publish limited data on those recognized annually as their “Top 50 Companies for Diversity.” The list may include companies in your industry, and even if it doesn’t, it can be considered a “best in class” benchmark. There are other sources as well, such as industry groups, and the EEOC.

Adverse Impacts: Government contractors are accustomed to preparing annual Affirmative Action Plans that include statistical analyses of potential “adverse impacts” of various employment processes, such as recruiting/hiring, performance evaluations, terminations, compensation, and promotions. These analyses can be critical in proactively uncovering problems and addressing them. Particularly with non-government contractors, diversity and human resources offices often conduct such analyses to determine the fairness and equity of corporate policies and practices. Like other self-critical analyses, they present risks if disclosed.

So what should you do? First, be aware of the types of analyses conducted by your diversity office. If the office is not asking for your oversight or assistance, offer it.

Second, identify potential legal and reputational implications of these analyses, and address them. You know that such data may be useful to enforcement agencies (EEOC, OFCCP), plaintiffs’ attorneys, disgruntled employees, activist shareholders, and media critics. You should take steps to establish and preserve the attorney-client privilege for such analyses as you would any other area of legal advice. You can also probe the accuracy and reliability of the methods used and results. To the extent your diversity office wants to publish the results of analyses beyond a “control group,” ensure that this business decision considers the legal and reputational risks.

Of course, if the analytical results are all good news, that problem goes away. Feel free to let your diversity office brag about it, in the C-suite and beyond!

Worried About a Pay Gap in Your Organization? Taking Action May Be Less Daunting Than You Think.

Pay equity between men and women – and among different races – has long been a concern for employers who want to ensure they are paying people according to job-related reasons, in compliance with anti-discrimination laws, and in a way that aligns with the organization’s practices and philosophies. In the midst of the #MeToo and Time’s Up movements, pay equity has taken a front seat in the public dialogue.

More than ever, employers want to take a close look at their compensation. Employers may seek to identify and prevent Title VII liability; proactively analyze the location-specific data OFCCP would examine in an audit; or assess potential liability under a specific state's law amid the current patchwork of state legislation, for example.

But employers are not always sure where to begin.

The good news is the information needed for a robust pay analysis is usually already available, in the form of data collected for other business reasons. Examples of data that may be used include:

  • Hire date
  • Time in current position
  • Tenure
  • Geographic location
  • Exempt status
  • Job title
  • Job family
  • Job level
  • Cost center
  • Department
  • Number of direct reports
  • Whether acquired from another company
  • Base salary
  • Incentive pay
  • Bonus pay
  • Other pay

The precise type of pay analyses that might be conducted using some or all of these variables depends upon the employer’s unique issues and goals. Multiple linear regressions – the gold standard of pay analyses – are possible where groupings are sufficiently large, but other statistical tests can be effective as well.

Regressions can help employers identify unexplained pay differences, and identify monetary adjustments to address any gaps.
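To show the flavor of such a regression, here is a minimal sketch in Python. The employee records, pay figures, and control variables below are invented purely for illustration; a real analysis would use the HRIS data described above and far more careful model design:

```python
# Hypothetical multiple linear regression for a simple pay analysis.
# All data are invented for illustration only.
import numpy as np

# Columns: tenure (years), job level, female indicator (1 = female)
X = np.array([
    [2, 1, 0],
    [5, 2, 0],
    [3, 1, 1],
    [8, 3, 0],
    [6, 2, 1],
    [4, 2, 1],
    [7, 3, 1],
    [1, 1, 0],
], dtype=float)
y = np.array([52000, 68000, 50000, 90000, 64000, 60000, 85000, 48000], dtype=float)

# Add an intercept column and fit ordinary least squares.
X1 = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)

intercept, b_tenure, b_level, b_female = coef
# b_female estimates the pay difference associated with gender after
# controlling for tenure and job level -- the "unexplained" gap.
print(f"gender coefficient: {b_female:,.0f}")
```

If the gender coefficient is statistically significant after controlling for legitimate factors, that is the unexplained difference a remediation adjustment would target.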

Were Analytics the Real MVP of the Super Bowl?

As the Eagles readied to celebrate the franchise’s first Vince Lombardi trophy, an unlikely candidate basked in the glow of being declared the game’s Most Valuable Player. Surely Nick Foles, who on his way to upsetting one of the NFL’s elite franchises threw and caught a touchdown in the same big game, was the true MVP. But was he?

In the days leading up to the Super Bowl, the New York Times published an article about how the Eagles leveraged analytics to secure a Super Bowl berth. The team relied, in part, on probabilistic models that drew on years of play data to calculate likely outcomes, given a specific set of circumstances. They found that while enumerating outcomes and optimizing for success, the models would, in many cases, recommend plays that bucked the common wisdom. Indeed, we saw the Eagles run plays and make decisions throughout the season that, to the outside observer, may have seemed mind-boggling, overly aggressive, or risky. Of course, the outside observer did not have access to the play-by-play analytics. Yet, in many instances, these data-driven decisions produced favorable results. So it seems that analytics were the real MVP, right? Well, not entirely.

As we have written in the past, the most effective analytics platforms provide guidance and should never be solely relied upon by employers when making decisions. This analytics concept rings as true in football as it does in business. The New York Times article talks about how mathematical models can serve to defend a playmaking decision that defies traditional football logic. For example, why would any team go for it on fourth and one, deep in their own zone, during their first possession in overtime? What if the analytics suggested going for it was more likely to result in success? If it fails, well, the football pundits will have a lot to talk about.

Coaches and players weigh the analytics, examine the play conditions, and gauge on-field personnel’s ability to perform. In other words, the team uses analytics as a guide and, taking into account other “soft” variables and experience, makes a decision that is right for the team at that time. This same strategy leads to success in the business world. Modern companies hold a wealth of data that can be used to inform decisions with cutting edge analytics, but data-driven insights must be balanced with current business conditions in order to contribute to success. If this balancing act works on the grand stage of professional football, it can work for your organization.

Indeed, we may soon see a day when football stars raise the Super Bowl MVP trophy locked arm-in-arm with their data science team. Until then, congratulations, Mr. Foles.

Talent Software – Perfect Solution or Perfect Storm?

Using talent-finder software to simplify hiring decisions is all the rage. Hiring managers across the country love the idea that one of their most difficult tasks – hiring – can now be done through software. So, what is good and what is risky when using these new hiring tools to evaluate talent?

Many job applicants today expect that their resumes will go through an initial computer scan of some type, but many people are not prepared for the fact that their initial interview may involve gaming that also will be scored by software. Meet Knack, a Silicon Valley company that is revolutionizing how employers evaluate potential talent. Knack’s team of behavioral scientists, software developers, and game designers creates video games that provide insights into applicants’ “knacks” – their values, behaviors, career potential, and the like.

For example, as players engage in Knack’s app-based video game Wasabi Waiter, they are evaluated on their ability to deliver the right sushi order to the right customer as the restaurant becomes more crowded and the player also has to wash dishes, deliver menus, and keep patrons happy. Similarly, Dungeon Scrawl players are scored on their ability to navigate a maze and solve problems. The software records how players solve problems, how long they hesitate before taking action, persistence, ability to prioritize, and so forth. The end result reveals a mosaic of key attributes – empathy, perception, creativity, introvert or extrovert, ability to remain calm under pressure, risk taker or risk averse, and much more. Indeed, the longer one plays, the more complete the picture will be.

It is evident that with such data, organizations can better decipher who is likely to be a better fit for the requirements of a particular position. Do you want to hire an introvert for a customer-facing sales position? Should you hire a thrill-seeker as a police officer? Should a bus driver be someone who is prone to take risks? Many of these human characteristics are important to the positions, but very difficult to discern through resumes and traditional job interviews. As analytics testing develops, “knacks” will appear in social media profiles. Talent evaluation through such innovative software products is still in its nascent stage, however; as such tools are produced, they must be carefully vetted to screen out unlawful bias.

What should you be concerned about?  Disparate impact that disproportionately excludes protected groups, regardless of intent.

How should you protect yourself? Ask the software vendor what selection factors are used. The further the selection factors get from the job requirements, the more problematic it can be from an adverse impact perspective. In the discrimination context, you will need to show that the selection factors utilized were directly related to key elements of successful job performance. So, do your homework upfront and only screen for important job-related criteria. Make sure to fully test talent analytics products and determine which lawful prospective employee data you should consider before you adopt the latest hiring tools that have the potential to be game changers.

Charting Your Course on the Data Analytics Highway

Should data analytics be used as a tool to uncover new insights from company data, or should it be used to answer or solve a specific business inquiry or problem? This question is central to any analytics project design. But like most important questions, the answer depends on several factors and is not altogether clear.

As terms like “big data”, “data science” and “artificial intelligence” continue to bleed into many aspects of our lives, a lot of companies are excited and more than willing to jump onto the data analytics highway, often without clear direction or purpose. The notion is that even without clear direction, a seasoned data scientist can use sophisticated analytical tools to uncover powerful business insights from web-scale data and internal caches, and in turn, drive company success. While such an outcome is possible, it is not likely.

Many successful analytics endeavors begin with a specific business question or problem in hand. Quantitative tools are then used to efficiently and accurately answer the question or solve the problem. A more precise understanding of the business problem immediately informs the analytics team about what data are needed to arrive at a solution. This small bit of clarity not only helps direct data collection, data maintenance, and data generating initiatives, it also ensures that your company has the data required to quantitatively address some of the most important and pertinent business needs for the foreseeable future.

But for companies looking to use data as a way to ask new questions or to discover unexamined business problems, exploratory data analysis may be valuable. While perhaps risky, analytics projects and initiatives designed to generate more questions than they answer can lead to unexpected knowledge, and valuable business insight. For example, if a company is unaware of costly staffing inefficiencies, exploratory data analysis is one of the few ways to unexpectedly illuminate the issue, and at the same time provide a solution. If problems are never identified, then they can never be resolved.

So, should analytics be used to solve a specific problem, or should it be used to uncover new insights? Of course, the answer is that both avenues can be valuable; they serve different purposes and pose different risks and rewards. We can think of the dichotomy in the context of a highway; let’s call it our data highway. Imagine getting in your car and driving along the highway without a clear destination or purpose. Along the way, you might see some new and interesting things. A new restaurant, a new park maybe. You might also see nothing at all of interest, and your time might have been wasted. What’s worse is that you’ve already paid for the gas. If you had started your journey on the data highway with a clear destination, you might eventually get there. If there are roadblocks, at least you’ve identified them and can, as a consequence, chart alternative routes.

Lawyers and Data Analytics?

Recently, we presented a program at a well-known analytics conference and set up an informational booth to meet attendees. Several attendees, most of whom were data scientists, approached our booth with inquisitive looks on their faces and asked, “Why is a law firm at an analytics conference?” Good question.

We explained that Jackson Lewis, as a leading workplace law firm, started a data analytics group dedicated to helping employers manage their workplaces using data-driven solutions. Our group is a multidisciplinary team of lawyers who have long advocated on behalf of our clients using data analytics, as well as data scientists and statisticians who help our clients manage their workplaces by leveraging the data they already maintain. Combining our data analytics capabilities with our collective knowledge of workplace law, we are well-equipped to provide clients with industry leading representation. These services include advice and counsel around the proper design and implementation of workplace analytics platforms. Oh, and we possess the ability to cloak analyses in privilege and mitigate the risk of disclosure. The inquisitive look then turns to interest – tell me more.

Attorney-client privilege generally applies to communications between an attorney and a client concerning legal advice. The privilege generally does not apply to underlying facts or data. So, while the privilege may apply to the analyses, it would not apply to the underlying data. Without an attorney present, the communications are not subject to privilege. While the privilege is maximized using outside counsel, there are intermediate levels of possible protection when in-house attorneys are involved. So, why does it matter?

It matters because modern database systems allow employers to maintain a treasure trove of data that may be retrieved through a few simple keystrokes. While this information can prove incredibly valuable to employers trying to optimize operations, streamline hiring, and assess employee engagement, etc., these data can be fodder for a discrimination claim. Plaintiffs and enforcement agencies increasingly are asking for copies of analyses as part of suits and investigations. Additionally, shareholders and “activist investors” may demand publication of different data points about a company – e.g., diversity and inclusion statistics. Especially in the time leading up to litigation, possessing the ability to perform analyses while maximizing the protections against disclosure is incredibly powerful.

Yes, that was a good question.


Workplace Analytics: What Does the Law Say?

Hopefully by now you have recognized the benefits of workplace analytics and are becoming more comfortable understanding any associated risk. Does the law provide any guidance? As we discussed in a prior blog entry, some government agencies have issued reports, weighing in on the use of workplace analytics. The Equal Employment Opportunity Commission also held a public meeting last Fall regarding the use of workplace analytics. To be sure, the law around the use of analytics is still developing and it is likely that we will see additional guidance in the future.

For now, employers must look to the existing body of law when designing analytics platforms. They must consider issues of disparate treatment, disparate impact, prohibitions against certain pre-employment inquiries, and data security concerns, to name a few. There has been concern expressed by some, including the EEOC, about the presence of bias in employee selection algorithms. But there has been little recent guidance provided to employers about how to address this issue. And there may not need to be.

In 1978, a group of federal agencies, including the EEOC and Department of Labor joined to issue the Uniform Guidelines on Employee Selection Procedures (UGESP) to “incorporate a single set of principles which are designed to assist employers, labor organizations, employment agencies, and licensing and certification boards to comply with requirements of Federal law prohibiting employment practices which discriminate on grounds of race, color, religion, sex, and national origin. They are designed to provide a framework for determining the proper use of tests and other selection procedures.”

While UGESP may not have been drafted with modern analytics platforms in mind, it provides a roadmap for the employer using modern analytics programs to effectuate employee selection decisions. Use of an analytics platform to make selection decisions likely falls under UGESP as an “other selection procedure.” Generally, under UGESP, when a selection procedure has an adverse impact against a protected group, that selection procedure must be validated. While there are a few ways to properly validate a selection procedure (here, an algorithm), generally, to be valid, a selection procedure must be shown to be job-related and consistent with successful job performance. And employers must ensure that no alternative selection procedure would serve their needs with less adverse impact.
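For readers curious how the adverse-impact trigger is commonly screened in practice, here is a simplified sketch of the well-known “four-fifths rule” associated with UGESP, which flags a group whose selection rate falls below 80% of the highest group’s rate. The applicant counts are hypothetical, and the rule is a rule of thumb, not a definitive legal test:

```python
# Illustrative four-fifths (80%) rule check from the UGESP adverse-impact
# framework. All counts below are hypothetical.
def selection_rate(selected, applicants):
    return selected / applicants

def four_fifths_check(rates):
    """Compare each group's selection rate to the highest group's rate.

    Flags a group (True) when its rate is less than 80% of the highest
    rate -- the UGESP rule-of-thumb indicator of adverse impact.
    """
    highest = max(rates.values())
    return {group: rate / highest < 0.8 for group, rate in rates.items()}

rates = {
    "group_a": selection_rate(48, 80),   # 60% selected
    "group_b": selection_rate(12, 30),   # 40% selected
}
flags = four_fifths_check(rates)
print(flags)  # group_b's rate is about 67% of group_a's -> flagged
```

A flag like this is where the validation analysis described above would begin, not where it ends.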

So, while UGESP was drafted more than thirty years ago, it may just be the roadmap employers have been looking for when designing cutting-edge analytics platforms. In a sense UGESP is timeless.

Striking a Balance: Managing the Workplace with Data-Driven Solutions

Most of us encounter the use of analytics in our everyday lives and give little thought to its use. Have you ever applied for a credit card or loan and were asked to provide a list of your outstanding financial obligations? Or, perhaps you applied for health insurance and were required to provide a summary of your health history. Providers request this information to help determine whether you are credit worthy or insurable based on analysis of others with similar histories. Welcome to the world of analytics.

But what about the use of analytics to manage the workplace? Imagine being able to predict which of several hundred job applicants are most likely to be successful on the job. Or being able to predict which employees are most likely to leave the organization in the future, or worse, file a charge. Analytics can be used to assess employee engagement, and it can even be used to optimize employee development initiatives.
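To give a deliberately simplified taste of the predictive idea, the sketch below scores current employees by the historical attrition rate of similar past employees. The records are hypothetical, and a real analytics platform would use far richer data and models:

```python
# Toy attrition "prediction": score current employees by the historical
# attrition rate of past employees in the same department.
# All records are hypothetical.
from collections import defaultdict

past = [  # (department, left_company)
    ("sales", True), ("sales", True), ("sales", False),
    ("engineering", False), ("engineering", False), ("engineering", True),
]

# Historical attrition rate per department.
counts = defaultdict(lambda: [0, 0])  # [number who left, total]
for dept, left in past:
    counts[dept][0] += int(left)
    counts[dept][1] += 1
rates = {dept: left / total for dept, (left, total) in counts.items()}

# Score current employees by their department's historical rate.
current = ["sales", "engineering"]
scores = {dept: rates[dept] for dept in current}
print(scores)  # sales ~ 0.67, engineering ~ 0.33
```

Real platforms replace this single-factor lookup with models built on many characteristics, which is exactly what makes both their power and their legal risk worth examining.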

Leveraging workplace analytics in this way may help companies streamline processes, resulting in saved time and money. But are there risks? Several reports from agencies such as the Federal Trade Commission and the White House have warned of the risk of making biased decisions based on analytics. Last Fall, the Equal Employment Opportunity Commission even held a public meeting regarding the use of big data in employment during which it examined the risks and benefits of big data analytics in the workplace.

Despite risks, properly designed analytics platforms can yield a host of benefits and may significantly lessen the likelihood of liability. Of course, algorithms used by employers to make decisions could be tainted by bias – for example, race and gender could be incorporated into an algorithm used by company officials to determine who should be hired or promoted. Even if race and gender are not explicitly included, an algorithm could result in the unintentional disproportionate exclusion of a particular race or gender group, that is, disparate impact. But these concerns also exist absent the use of algorithms. Humans, by their very nature, bring unintentional biases reflecting their life’s experiences and intuition to everyday decisions. Humans also may bring inconsistency to the decision-making process. Properly designed analytics platforms based on neutral data science are highly consistent and efficient.

Indeed, algorithms should not be designed to explicitly incorporate protected characteristics such as race or gender. And employers must monitor their analytics use for evidence of disparate impact. The most effective of these platforms provide guidance and should never be solely relied upon by employers when making decisions.

So What is Data Analytics Anyway?

(And while we’re at it…what are Big Data, Business Intelligence, Artificial Intelligence, Data Science and IoT?)

To the newly initiated, introducing oneself to the field of data analytics can be intimidating. Navigating through a dizzying array of terms can be a difficult and tedious task. In this post, we bring you a brief layman’s glossary of many of the new words and phrases that are sure to become a part of your everyday vocabulary.

Data Analytics – In its most basic form, Data Analytics refers to the practice of using data to draw conclusions that may help inform a decision or a future business practice. One type of data analytics, Predictive Analytics, refers to the practice of using data collected about past events to predict the likelihood of various possible future events. For example, employers may use predictive analytics to predict who is most likely to leave their organization in the future based on an analysis of the characteristics of those who have left their organization in the past. Still confused? Watch Moneyball® – it’s a fantastic movie.

Big Data – Perhaps the term thrown around with the most abandon, Big Data refers to massive collections of data that, due almost entirely to their volume, require special methods and technologies to manage and analyze them. The term is often used generically to describe large or complex data sets.

Business Intelligence – Generally refers to the tools and methods used by an organization to analyze data from various sources for the purposes of optimizing business decisions. For example, a company may analyze the nature and source of its revenue stream to better inform sales strategies.

Artificial Intelligence – A phrase often used to describe complex processes or systems that are capable of performing tasks typically thought of as requiring human intelligence. An example includes speech recognition. Don’t believe us? Ask Siri® or Alexa®.

Data Science – A broad and highly interdisciplinary field of scientific inquiry that relies heavily on quantitative tools and methodologies to better understand the natural world. Data scientists are practitioners of data science, and are typically employed by organizations and companies, like Jackson Lewis, wishing to leverage available data to help manage processes more efficiently, assist in decision-making, develop new products powered by complex statistical algorithms, or to develop entirely new algorithms and ideas.

IoT, or the Internet of Things – A shorthand way of referring to the interconnectivity of numerous devices over the internet. It may include computers, cell phones, or any other device that today may be connected to the internet, such as refrigerators, air conditioners, and other household appliances. Have you ever remotely set your house alarm from your smart phone?  Congratulations, you have experience with IoT!

Data Intelligence Reporter – An insightful new blog about workplace data analytics brought to you by Jackson Lewis’ Data Analytics Group.

The Future of Your Workforce Has Arrived: Welcome to the Data Intelligence Reporter

Since its founding more than 50 years ago, Jackson Lewis has prided itself on delivering first-class legal services, cutting-edge preventive strategies, and positive solutions to some of the most challenging workplace law issues. As the workplace has evolved, so too have Jackson Lewis’ award-winning services; it is in this tradition that Jackson Lewis has established a Data Analytics Group. The Group is composed of a multidisciplinary team of lawyers, data scientists, and statisticians who help clients effectively manage the workplace with data-driven solutions. In all matters, we combine our legal knowledge with powerful analytics insights to provide clients with critical information, while maximizing the protections afforded by the attorney-client privilege. We provide industry-leading data analytics services to help optimize recruitment practices, assess employee engagement, predict attrition and future headcount needs, evaluate potential liability, and provide novel advice and counsel services about the proper use and design of analytics platforms. Our clients benefit from the powerful combination of our attorneys’ collective decades of experience and our data team’s modern analytics skills.

From minimizing legal risk to improving planning and decision-making, it is crucial that employers recognize the value of leveraging data as a management tool. With the launch of the Data Intelligence Reporter, our Data Analytics Group will provide timely, practical insights into managing the workplace using data-driven solutions. It will change the way you think about managing the workplace. The future of your workplace has arrived. We hope you will join us.