The launch of “EEOC Explore” is the U.S. Equal Employment Opportunity Commission’s (EEOC) latest step toward its vision to “build a 21st century data and analytics organization.”

In December 2018, the EEOC created the Office of Enterprise Data and Analytics (OEDA). Since its inception, the OEDA has been working to modernize the availability of EEO data. With the December 2, 2020 launch of EEOC Explore, the agency demonstrated the features of a new tool that furthers that effort.

A data query and mapping tool, EEOC Explore aggregates publicly available EEO-1 data (currently limited to EEO-1 data sets from 2017 and 2018) into a series of interactive dashboards. The OEDA is expected to further develop and expand the size, scope, and functionality of EEOC Explore as its work continues.

EEOC Explore visualizes employment characteristics and related information without identifying any confidential employer or employee information, protecting the EEO-1 data previously submitted by employers. The underlying EEO-1 data is presented to users through five primary dashboards:

(i) job categories by gender;

(ii) race/ethnicity by gender;

(iii) industry;

(iv) state trends by year; and

(v) comparison by state.

Once a user selects a dashboard, they can filter the applicable data by year, geography, sex, race/ethnicity, job category, NAICS code (2-digit), and NAICS code (3-digit) to view more granular data. Within each dashboard, users can view the data’s aggregated totals by category and use filters to break down the information by race, gender, and job category, or focus on one specific race, gender, and job category.

During the demonstration, the OEDA used the example of comparing employment in the finance industry between Silicon Valley and the Atlanta metro area, starting with the Industry dashboard and then filtering by year (2018), the Silicon Valley and Atlanta metro geographic areas, and the finance and insurance NAICS codes.
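For readers who work with the public EEO-1 aggregates directly, a rough sketch of that same comparison in pandas might look like the following. The file name and column names here are hypothetical assumptions for illustration, not EEOC Explore’s actual schema.

```python
# Hypothetical sketch of an EEOC Explore-style filter on aggregated EEO-1 data.
# The file and column names ("year", "metro_area", "naics2", "job_category",
# "sex", "employee_count") are illustrative assumptions.
import pandas as pd

eeo1 = pd.read_csv("eeo1_aggregates.csv")

finance_2018 = eeo1[
    (eeo1["year"] == 2018)
    & (eeo1["metro_area"].isin(["Silicon Valley", "Atlanta Metro"]))
    & (eeo1["naics2"] == 52)  # Finance and Insurance
]

# Compare the two metro areas by job category and sex.
summary = (
    finance_2018
    .groupby(["metro_area", "job_category", "sex"])["employee_count"]
    .sum()
    .unstack("sex")
)
print(summary)
```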

EEOC Explore is a useful, ready-made tool, but employers understandably may require context and more in-depth analysis to meet the legal and operational needs of their organization.

The Jackson Lewis Data Analytics Group is available to answer any questions about OEDA and the use and application(s) of EEO-1 information, and help make sense of the data you already hold.

The COVID-19 pandemic has affected the workplace in ways we could not have imagined just a few months ago. Indeed, the economic impact of the COVID-19-related shutdowns has prompted many employers to reevaluate how to conduct business and how many employees they need. Consequently, many employers have been compelled to consider reductions in force. While these considerations are not limited to times of pandemic, the basic principles below apply broadly.

Commonly, employers approach the exercise with a plan in mind: Which business lines are affected? How much payroll do we need to shed? What are the criteria for selection? Which employees will be selected for reduction? Who will make the selections? And more. In addition, employers should consider the impact of their selections on particular demographic groups.

Considering Title VII of the Civil Rights Act of 1964, for example, employers should ask whether the rate at which females are selected for reduction is greater than the rate at which males are selected. How are minority employees faring compared with non-minority employees? And, considering the Age Discrimination in Employment Act, how are older employees faring compared with younger employees? Additionally, are the differences in rates statistically significant? A disparate impact analysis can help answer these questions and may substantially mitigate risk for employers.
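As a simplified illustration of what such an analysis involves, the sketch below compares selection rates for two groups and tests whether the difference is statistically significant. The headcounts are invented, and a real analysis would be tailored to the decisional unit and conducted with counsel.

```python
# A minimal sketch of one common disparate impact test: comparing selection
# rates for two groups and asking whether the difference is statistically
# significant. The counts below are illustrative only.
from scipy.stats import fisher_exact
from statsmodels.stats.proportion import proportions_ztest

selected = [30, 18]    # employees selected for reduction: [female, male]
at_risk = [100, 110]   # employees in the decisional unit: [female, male]

rates = [s / n for s, n in zip(selected, at_risk)]
print(f"Selection rates: female {rates[0]:.1%}, male {rates[1]:.1%}")

# Two-proportion z-test (often reported as a number of standard deviations).
z_stat, p_value = proportions_ztest(selected, at_risk)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")

# Fisher's exact test on the underlying 2x2 table is often used for small groups.
table = [[selected[0], at_risk[0] - selected[0]],
         [selected[1], at_risk[1] - selected[1]]]
odds_ratio, p_exact = fisher_exact(table)
print(f"Fisher exact p = {p_exact:.4f}")
```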

What if there is statistically significant disparate impact against a particular demographic? Confirm the criteria used to select the individuals for reduction are “consistent with business necessity” under Title VII. If a criterion does not meet the standard, revisit the selection(s).

Remember, intent need not come into the equation, as these analyses examine the impact facially neutral selection criteria may be having on certain demographic groups. Of course, protected characteristics generally should not be among the criteria used to select individuals for reduction. Be mindful of the importance and applicability of the attorney-client privilege when conducting these analyses, and consider retaining a law firm to help.

Artificial intelligence (AI) technology that uses algorithms to assist in decision-making offers tremendous opportunity to make predictions and evaluate “big data.” The Federal Trade Commission (FTC), on April 8, 2020, provided reminders in its Tips and Advice blog post, Using Artificial Intelligence and Algorithms.

This is not the first time the FTC has focused on data analytics. In 2016, the agency issued a report on “Big Data.”

AI technology may appear objective and unbiased, but the FTC warns of the potential for unfair or discriminatory outcomes or the perpetuation of existing socioeconomic disparities. For example, the FTC pointed out, a well-intentioned algorithm may be used for a positive decision, but the outcome may unintentionally disproportionately affect a particular minority group.

The FTC does not want consumers to be misled. It provided the following example: “If a company’s use of doppelgängers – whether a fake dating profile, phony follower, deepfakes, or an AI chatbot – misleads consumers, that company could face an FTC enforcement action.”

Businesses that obtain AI-derived data from a third-party consumer reporting agency (CRA) and make decisions based on it have particular obligations under the federal Fair Credit Reporting Act (FCRA) and analogous state laws. Under the FCRA, a vendor that “assembles” consumer information to automate decision-making about eligibility for credit, employment, insurance, housing, or similar benefits and transactions may be a “consumer reporting agency.” An employer relying on automated decisions based on information from a third-party vendor is the user of that information. As the user, the business must provide the consumer with an “adverse action notice” required by the FCRA if it takes an adverse action against the consumer. The content of the notice must be appropriate to the adverse action and may include a copy of the “consumer report” containing the AI-derived information, the federal summary of rights, and other information. The vendor that is the CRA has an obligation to implement reasonable procedures to ensure the maximum possible accuracy of consumer reports and to provide consumers with access to their own information, along with the ability to correct any errors. The FTC also expects businesses to be transparent and able to explain AI-driven decisions if the consumer asks.

Takeaways for Employers

  • Carefully review the use of AI to ensure it does not result in discrimination. According to the FTC, for credit purposes, use of a factor such as a zip code in an algorithm could result in a disparate impact on a particular protected group. (A minimal screening check is sketched after this list.)
  • Accuracy and integrity of data are key.
  • Validation of AI models is important to minimizing risk. Post-validation monitoring and periodic re-validation are important as well.
  • Review whether federal and state FCRA laws apply.
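The screening check referenced in the first bullet could, under simplified assumptions, look something like the sketch below, which compares favorable-outcome rates across groups using the familiar four-fifths benchmark. The data and column names are hypothetical.

```python
# A simplified sketch of one screening check often applied to model outcomes:
# the "four-fifths rule" comparison of selection rates across groups.
# The DataFrame columns ("group", "selected") are hypothetical.
import pandas as pd

outcomes = pd.DataFrame({
    "group":    ["A", "A", "A", "B", "B", "B", "B"],
    "selected": [1, 1, 0, 1, 0, 0, 0],   # 1 = favorable model outcome
})

rates = outcomes.groupby("group")["selected"].mean()
impact_ratio = rates.min() / rates.max()

print(rates)
print(f"Impact ratio: {impact_ratio:.2f}")
if impact_ratio < 0.80:
    print("Below the four-fifths threshold; further review is warranted.")
```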

Continue self-monitoring by asking:

  • How representative is your data set?
  • Does your data model account for biases?
  • How accurate are your predictions based on big data?
  • Does your reliance on big data raise ethical or fairness concerns?

The FTC’s message: use AI, but proceed with accountability and integrity. Jackson Lewis’ Data Analytics Group is available to discuss questions about validation and data integrity.

By: Samantha Rhoads & Michael Cortes[1]

The COVID-19 pandemic has put a spotlight on statistical terms often unfamiliar to anyone but professionals such as statisticians and data scientists. Terms such as “infection rate,” “flattening the curve,” and related statistical language are now being used as slogans and hashtags. This post offers a brief explanation of data analytics and the related math for a better understanding of statistics and data.

Being inundated with information and government-provided mathematical models and projections makes it difficult to know which statistics to trust. Statistics helps make sense of data, but the accuracy of the insights is limited by the quality and accuracy of the data itself. From one data source, the calculated COVID-19 fatality rate may be 4.28%; from another, it may be 0.66% (a simplified illustration follows the list below). The fundamental reasons for these different numbers are:

  1. The numbers are based on different datasets that were collected using different methods.
  2. The statistical models applied to the data are based on different assumptions.
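A simplified illustration of the first point: the same number of deaths divided by denominators collected under different assumptions produces very different rates. The figures below are invented purely to reproduce the two rates mentioned above.

```python
# Hypothetical numbers showing why two sources can report different fatality
# rates: the same deaths divided by denominators collected different ways.
deaths = 1_000

confirmed_cases = 23_364         # only people who tested positive
estimated_infections = 151_515   # modeled estimate including untested infections

print(f"Rate vs. confirmed cases:      {deaths / confirmed_cases:.2%}")      # ~4.28%
print(f"Rate vs. estimated infections: {deaths / estimated_infections:.2%}") # ~0.66%
```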

How do you know what to rely upon? This is where the human element comes into the picture. When given data and statistics, be skeptical. Apply common sense, trust your own thinking, and seek out research from other sources to get a more complete picture. Understand the assumptions being made and how the data was collected.

People can mentally process trends in the data in order to take precautions and understand how the world is changing. Looking directly at the numbers is vital to understanding COVID-19’s path, and applying good sense and judgment is critical.

While the data around COVID-19 is the critical issue now, the same statistical concepts apply to data sets in other areas. For example, the same critical thinking should be applied to workplace trends and data as is being applied to COVID-19 trends. Work life is more uncertain at this time, which means defining patterns and understanding the nuances in the data is critical.

Companies’ employee data should be harnessed and explained with the same care. If, for example, a company wants to understand the diversity of its workforce, it needs to be careful about how it computes those numbers. Pay special attention to the way the data is collected and what the variables mean. It is easy to be led astray if the company is computing the wrong numbers or interpreting the results incorrectly.
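As a small, hypothetical example of why variable definitions matter, the sketch below shows how the choice of denominator alone can change a diversity percentage.

```python
# A minimal sketch (hypothetical column names and data) of how the choice of
# denominator changes a "diversity" percentage. Whether employees with no
# reported race/ethnicity belong in the denominator is one detail worth
# pinning down before reporting a number.
import pandas as pd

hris = pd.DataFrame({
    "employee_id":    [1, 2, 3, 4, 5, 6],
    "race_ethnicity": ["Hispanic", "White", None, "Black", None, "Asian"],
})

minority = hris["race_ethnicity"].isin(["Hispanic", "Black", "Asian"])

pct_of_all_employees = minority.sum() / len(hris)
pct_of_reported = minority.sum() / hris["race_ethnicity"].notna().sum()

print(f"Minority share of all employees:              {pct_of_all_employees:.1%}")
print(f"Minority share of employees with reported EEO: {pct_of_reported:.1%}")
```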

While data can be used to make inferences, there are caveats. Numbers “don’t lie,” but they can be taken out of context, and the assumptions behind the calculations and the data are often not fully understood. Keep the reliability of the data in mind and consider the circumstances surrounding a pattern before presenting it as something meaningful.

 

[1] Samantha Rhoads and Michael Cortes are Data Scientists in the Firm’s Data Analytics Group.

As employers seek to make increasingly efficient and “better” hiring decisions, avoid biases, and increase workforce diversity, they are turning to, or considering, a growing range of technological tools. Essentially, these tools help employers efficiently identify qualified candidates, narrow the pool of job seekers, and predict who may be the “best” hire. As is often the case, however, technological advances may outpace our ability to keep up with them and to predict and understand their legal implications. In particular, such tools may be facially neutral yet have an adverse impact on particular groups in violation of Title VII of the Civil Rights Act, Executive Order 11246, and other federal and state anti-discrimination laws.

In a recent article, Upturn – a non-profit organization that “promotes equity and justice in the design, governance, and use of digital technology” – evaluates the full range of “predictive hiring tools” with a view towards exploring “how predictive tools affect equity throughout the entire hiring process.”[1] The Upturn article ominously concludes that, “without active measures to mitigate them, bias will arise in predictive hiring tools by default.”

As with any type of “test” or selection tool, we agree employers should proactively understand how such tests and tools work, and what unintended consequences they may produce.

Upturn points out that vendors have developed predictive tools for every stage of the hiring process: from sourcing to screening to interviewing to selection, and beyond hiring to performance evaluations. To appreciate the associated risks, however, Upturn notes that employers should first understand how biases might be inherent in these tools – even those tools that vendors tout as removing bias from the hiring process.

When we think of bias in the employment context, we naturally think of “interpersonal bias” – an explicit or implicit (cognitive) bias an individual may have for or against characteristics of other individuals, including race, gender, color, and other legally protected, immutable characteristics. However, even where predictive tools take steps to eliminate interpersonal bias, they may unknowingly create, perpetuate, or exacerbate existing institutional and systemic biases. As the Upturn authors suggest,

Without active measures to mitigate them, biases will arise in predictive hiring tools by default. But predictive tools could also be turned in the other direction, offering employers the opportunity to look inward and adjust their own past behavior and assumptions. This insight could also help inform data and design choices for digital hiring tools that ensure they promote diversity and equity goals, rather than detract from them. Armed with a deeper understanding of the forces that may have shaped prior hiring decisions, new technologies, coupled with affirmative techniques to break entrenched patterns, could make employers more effective allies in promoting equity at scale.

While the technology may be new, efforts to address adverse impact in employment selection decisions are not. Title VII deems any employment selection procedure that has an adverse impact on the hiring of any race, sex, or ethnic group to be illegally discriminatory, unless the procedure has been “validated” in accordance with the Uniform Guidelines on Employee Selection Procedures (UGESP) and no less-impactful alternative procedure exists. In general, a selection procedure or test is valid if it tests for job-seeker attributes that are highly correlated with success in a particular job. The term “selection procedure” includes any procedure used to narrow the pool of job seekers for hire – from eligibility questions to resume screens to interviews. Thus, most predictive hiring tools are subject to the UGESP.
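At its core, criterion-related validation asks whether the selection measure correlates with later job performance. The toy sketch below illustrates only that idea; actual validation studies under the UGESP are far more rigorous, and the numbers here are invented.

```python
# A highly simplified sketch of the idea behind criterion-related validation:
# does the selection score correlate with later job performance?
# The scores and ratings below are invented for illustration.
from scipy.stats import pearsonr

screening_scores = [62, 71, 55, 88, 90, 67, 74, 81, 59, 93]
performance_ratings = [2.9, 3.4, 2.5, 4.1, 4.4, 3.0, 3.6, 3.8, 2.7, 4.6]

r, p_value = pearsonr(screening_scores, performance_ratings)
print(f"Correlation between score and performance: r = {r:.2f} (p = {p_value:.4f})")
```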

However, the Upturn article points out that validation of predictive hiring tools may not be enough to identify tools that in fact fall prey to institutional and systemic biases. Employers and vendors may also need a more in-depth understanding and analysis of a tool, and how it works for any given employer.

While employees, plaintiffs’ attorneys, and the EEOC may not yet be fully active in this arena, they are aware of the issues. The EEOC began publicly addressing the implications of this issue in 2016. Moreover, as discussed in an earlier blog post, the EEOC recently launched an Office of Enterprise Data and Analytics with the capability of providing analytical support for systemic discrimination investigations.

Of course, regardless of the legal implications, a seemingly effective hiring tool that is, without your knowledge, creating or perpetuating unintentional biases in your company’s hiring process may prove counter-productive.

What steps should employers take now to protect themselves from legal risk? The Upturn article provides a set of specific, technical “Guiding Questions” the authors found themselves “needing to answer…before we could even begin to think about the equity implications of a given tool.”

Of course, the first question to ask is whether your company is using or considering the use of any predictive tools. If so, our Data Analytics Group may be able to provide assistance. The Data Analytics Group is a multi-disciplinary team of lawyers, statisticians, data scientists, and analysts with decades of experience managing the interplay of data analytics and the law. For more information, please contact your Jackson Lewis attorney or Eric Felsberg, the National Director of the JL Data Analytics Group.

[1] Miranda Bogen and Aaron Rieke, Help Wanted – An Examination of Hiring Algorithms, Equity, and Bias (Upturn, December 2018).

The Equal Employment Opportunity Commission (EEOC) has established the Office of Enterprise Data and Analytics (OEDA) to “provide [their] customers timely, accurate, and bias-free data and information to prevent and remedy unlawful employment discrimination, and improve organizational performance.” EEOC Director and Chief Data Officer Samuel C. “Chris” Haffer leads the OEDA.

This is an exciting and forward-looking development. Jackson Lewis’ Data Analytics Group was invited to EEOC’s headquarters in Washington, D.C. to attend a listening session about the new Office.

Employers have access to a wealth of data that, when paired with powerful analytical tools, can help them more effectively manage the workplace. In order to remain competitive, reduce workplace management costs, and mitigate workplace-related legal risk, among other things, it is critical for employers to leverage their own data in combination with external sources.

The launch of the OEDA should prompt employers that have been waiting to leverage data and analytics in managing their workplace to reconsider. Using data and analytics in the workplace is not a passing fad.

With a vision to “build a 21st century data and analytics organization,” the EEOC said OEDA’s principal goal is to “use state of the art data and information science tools and techniques to collect, utilize and share data and information, efficiently leveraging data to reduce burden and costs while protecting individual and employer privacy, and promoting program transparency.” This initiative builds on the EEOC’s October 2016 public meeting, “Big Data in the Workplace: Examining Implications for Equal Employment Opportunity Law,” by initiating a new and transformative force at the EEOC designed in part to serve valuable data and data products to the public.

OEDA comprises four divisions:

Business Operations and Organizational Performance Division – This Division will oversee business operations and, among other tasks, enhance the Office’s transparency and effectiveness. The Division will consist of a Business Operations Team and an Organizational Performance Team.

Data Development and Information Products Division – This Division’s tasks will include “develop[ing] information products” and being involved with data collection and survey methodology. It also will support EEOC charge-handling by linking EEOC charges with EEO-1 reports and providing analyses of EEOC charge data. The Division will consist of an Employer Data Team and an Information Products Team.

Information and Data Access Division – This Division is tasked with overseeing data governance and policy. It will provide research and information services in support of enforcement litigation efforts. The Division will consist of a Library and Information Services Team and a Data Policy and Access Team.

Data Analytics Division – This Division will provide systemic investigations analytical support and analytics on various data “to identify geographic, industry and other drivers of discrimination charges and emerging trends.” The Division will consist of an Investigative Analytics Team and an Enterprise Analytics Team.

The Office will take steps to assist employers by making valuable data and data-related products available to them. It also will equip agency investigators and enforcement officials with new data and analytical resources.

While access to new collections of data and data products from the EEOC will be of great value, employers should take note that these and additional resources will also be at the agency’s disposal and may bolster its enforcement efforts. Therefore, it is critical for employers to embrace the use of data to help analyze and manage the workplace and to better identify positive and negative trends.

The Jackson Lewis Data Analytics Group is available to answer any questions you may have about OEDA and help make sense of the data you already hold.

As if you don’t have enough on your plate already. You just heard in the lunchroom that your Chief Diversity Officer is making a presentation to a trade group on the company’s workforce demographics. Should you care?

An often-overlooked area of data analytics is in a company’s diversity office. I say “overlooked” because some diversity offices do not feel the need to seek the legal department’s advice when they embark on gathering data about, for example, the demographics of the company. Whether you have been asked for advice, or have decided to seek out the diversity office to see what data it is compiling, you should be aware of the types of analytics typically prepared — and the legal implications. Let’s look at a few examples.

Workforce Demographics: Most diversity offices track workforce demographics. These analyses typically include all job groups, like EEO-1 job categories, and may be as granular as job titles. Most include race/ethnicity and gender, some include age, and others may include such factors as veteran status, sexual orientation, and disability, in part depending on the availability and reliability of the data in the HRIS. The analyses may be snapshots for a specific date, or time-trends for comparison purposes: “Are we improving year-to-year in our representation of Hispanic females at the mid-level manager position?” Either way, they typically result in pie charts, graphs, and other visual aids clearly displaying percentages of representation by race/ethnicity, gender, etc., sometimes with red-shaded warning signs for “high-risk” (or “high-opportunity”) areas. Are you shuddering yet?

[Chart omitted. Source: Walmart Road to Inclusion 2017 Report]
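As a minimal, hypothetical illustration of the year-over-year question quoted above, the representation trend boils down to a simple ratio tracked over time:

```python
# A minimal illustration (hypothetical headcounts) of a year-over-year
# representation trend for one group in one job level.
import pandas as pd

snapshots = pd.DataFrame({
    "year": [2015, 2016, 2017],
    "hispanic_female_mid_managers": [14, 17, 21],
    "all_mid_managers": [230, 238, 245],
})

snapshots["representation"] = (
    snapshots["hispanic_female_mid_managers"] / snapshots["all_mid_managers"]
)
print(snapshots[["year", "representation"]])
```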

Benchmarking: Diversity offices want to be able to answer their CEO’s question, “So how are we doing compared with our competitors?” Benchmarking is a relatively easy task so long as comparative data is available. That, however, is the rub. Although some do, most companies do not publish their workforce demographic data. Some periodicals, like DiversityInc, publish limited data on the companies recognized annually in their “Top 50 Companies for Diversity.” The list may include companies in your industry, and even if it does not, it can serve as a “best in class” benchmark. There are other sources as well, such as industry groups and the EEOC.


Adverse Impacts: Government contractors are accustomed to preparing annual Affirmative Action Plans that include statistical analyses of potential “adverse impacts” of various employment processes, such as recruiting and hiring, performance evaluations, terminations, compensation, and promotions. These analyses can be critical in proactively uncovering problems and addressing them. Even among non-government contractors, diversity and human resources offices often conduct such analyses to assess the fairness and equity of corporate policies and practices. Like other self-critical analyses, they present risks if disclosed.

So what should you do? First, be aware of the types of analyses conducted by your diversity office. If the office is not asking for your oversight or assistance, offer it.

Second, identify potential legal and reputational implications of these analyses, and address them. You know that such data may be useful to enforcement agencies (EEOC, OFCCP), plaintiffs’ attorneys, disgruntled employees, activist shareholders, and media critics. You should take steps to establish and preserve the attorney-client privilege for such analyses as you would any other area of legal advice. You can also probe the accuracy and reliability of the methods used and results. To the extent your diversity office wants to publish the results of analyses beyond a “control group,” ensure that this business decision considers the legal and reputational risks.

Of course, if the analytical results are all good news, that problem goes away. Feel free to let your diversity office brag about it, in the C-suite and beyond!

Pay equity between men and women – and among different races – has long been a concern for employers who want to ensure they are paying people according to job-related reasons, in compliance with anti-discrimination laws, and in a way that aligns with the organization’s practices and philosophies. In the midst of the #MeToo and Time’s Up movements, pay equity has taken a front seat in the public dialogue.

More than ever, employers want to take a close look at their compensation. For example, employers may seek to identify and prevent Title VII liability; to proactively analyze the location-specific data that OFCCP would review in an audit; or to assess potential liability under a particular state’s law in the current climate of patchwork state legislation.

But employers are not always sure where to begin.

The good news is the information needed for a robust pay analysis is usually already available, in the form of data collected for other business reasons. Examples of data that may be used include:

  • Hire date
  • Time in current position
  • Tenure
  • Geographic location
  • Exempt status
  • Job title
  • Job family
  • Job level
  • Cost center
  • Department
  • Number of direct reports
  • Whether acquired from another company
  • Base salary
  • Incentive pay
  • Bonus pay
  • Other pay

The precise types of pay analyses that might be conducted using some or all of these variables depend upon the employer’s unique issues and goals. Multiple linear regressions – the gold standard of pay analyses – are possible where groupings are sufficiently large, but other statistical tests can be effective as well.

Regressions can help employers identify unexplained pay differences, and identify monetary adjustments to address any gaps.
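For those curious about the mechanics, a bare-bones regression sketch appears below. The data file and column names are hypothetical, and an actual pay equity study would involve careful grouping, variable selection, and attorney-client privileged review.

```python
# A simplified sketch of a multiple linear regression on pay, assuming a
# hypothetical extract "pay_data.csv" with one row per employee and columns
# like those listed above. Illustrative only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("pay_data.csv")
df["log_base_salary"] = np.log(df["base_salary"])

model = smf.ols(
    "log_base_salary ~ tenure + time_in_position + C(job_level) + C(department) + C(sex)",
    data=df,
).fit()

# The sex coefficient estimates the pay difference that remains after
# accounting for the other, job-related variables in the model.
print(model.summary())
```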

As the Eagles readied to celebrate the franchise’s first Vince Lombardi Trophy, an unlikely candidate basked in the glow of being declared the game’s Most Valuable Player. Surely Nick Foles, who on his way to upsetting one of the NFL’s elite franchises threw and caught a touchdown in the same big game, was the true MVP. But was he?

In the days leading up to the Super Bowl, the New York Times published an article about how the Eagles leveraged analytics to secure a Super Bowl berth. The team relied, in part, on probabilistic models that leveraged years of play data to calculate likely outcomes, given a specific set of circumstances. They found that while enumerating outcomes and optimizing for success, the models would, in many cases, recommend plays that bucked the common wisdom. Indeed, we saw the Eagles run plays and make decisions throughout the season that, to the outside observer, may have seemed mind-boggling, overly-aggressive, or risky. Of course, the outside observer did not have access to the play-by-play analytics. Yet, in many instances, these data-driven decisions produced favorable results. So it seems that analytics were the real MVP, right? Well, not entirely.

As we have written in the past, the most effective analytics platforms provide guidance and should never be relied upon exclusively by employers when making decisions. This concept rings as true in football as it does in business. The New York Times article discusses how mathematical models can serve to defend a play-calling decision that defies traditional football logic. For example, why would any team go for it on fourth and one, deep in its own territory, during its first possession in overtime? What if the analytics suggested going for it was more likely to result in success? If it fails, well, the football pundits will have a lot to talk about.
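The underlying math is just an expected-value comparison. The probabilities and point values below are invented purely to illustrate how a model could favor going for it.

```python
# A toy expected-value comparison of the kind a fourth-down model weighs.
# All probabilities and point values are invented for illustration.
p_convert = 0.60        # chance of converting fourth-and-one
ev_if_convert = 2.4     # expected points if the drive continues
ev_if_fail = -1.8       # expected points after turning the ball over on downs
ev_after_punt = -0.5    # expected points after punting from deep in your own territory

ev_go_for_it = p_convert * ev_if_convert + (1 - p_convert) * ev_if_fail
print(f"Go for it: {ev_go_for_it:+.2f} expected points")
print(f"Punt:      {ev_after_punt:+.2f} expected points")
```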

Coaches and players weigh the analytics, examine the play conditions, and gauge the on-field personnel’s ability to perform. In other words, the team uses analytics as a guide and, taking into account other “soft” variables and experience, makes the decision that is right for the team at that time. This same strategy leads to success in the business world. Modern companies hold a wealth of data that can be used to inform decisions with cutting-edge analytics, but data-driven insights must be balanced with current business conditions in order to contribute to success. If this balancing act works on the grand stage of professional football, it can work for your organization.

Indeed, we may soon see a day when football stars raise the Super Bowl MVP trophy locked arm-in-arm with their data science team. Until then, congratulations, Mr. Foles.