New Freedom of Information Act (FOIA) Software Launched By EEOC

On February 1, 2021, the Equal Employment Opportunity Commission (EEOC) launched a new software system to receive and process Freedom of Information Act (FOIA) requests and appeals.  FOIA gives individuals the right to request access to federal government records.

Individuals looking to make FOIA requests can now initiate them through the new software system.  Individuals who submitted requests prior to February 1 will be able to check the status of those requests via the old Public Access Link until March 12, 2021, but after that date will need to use the new system for all requests.

While the new software system will make it easier to submit FOIA requests and track their status, the EEOC will still accept FOIA requests and appeals by mail, email, and fax.  Requests received by mail, email, or fax will be entered into the new system for processing and monitoring.

The EEOC FOIA software system can be accessed here.

Biden Appoints Jenny Yang as Director of Office of Federal Contract Compliance Programs

On January 21, 2021, the Office of Federal Contract Compliance Programs (OFCCP) announced via its leadership team webpage that Jenny Yang, former EEOC Chair during the Obama administration, was selected as OFCCP Director.

During her time with the EEOC, Yang supported Component 2 of the EEO-1 report, which required covered employers to submit pay and hours-worked data on their employees for 2017 and 2018. In addition to advocating for government collection of pay data, Yang worked toward expanding protections for the LGBTQ+ community under federal anti-discrimination law and established a Select Task Force on the Study of Harassment in the Workplace.

Based on her work at the EEOC, we can expect Director Yang to prioritize pay equity and sexual orientation and gender identity rights. Yang takes over as OFCCP Director from Craig Leen, who implemented the four “pillars” of the OFCCP (Transparency, Certainty, Efficiency, and Recognition) and allowed federal contractors to rely on them in their interactions with the agency. It will be interesting to see whether Director Yang continues using the four pillars during her time at the OFCCP.

New Deputy Chief Data Officer at EEOC Office of Enterprise Data and Analytics

On January 19, 2021, the U.S. Equal Employment Opportunity Commission (EEOC) announced that Kimberly S.L. Essary has been appointed as Deputy Chief Data Officer within the agency’s Office of Enterprise Data and Analytics (OEDA).

Essary’s appointment expands OEDA’s existing executive leadership and furthers the EEOC’s and OEDA’s mission to build a 21st century data and analytics organization. In this senior executive role, Essary is expected to manage the full range of program activities within OEDA.

Essary first joined the EEOC as a career attorney-advisor in 2009, and her appointment is a testament to her years of experience in high-level roles at the EEOC and related government offices. She has served as OEDA’s Deputy Director and Senior Counsel since 2018 and played an integral part in the office’s creation and development, as well as in the creation of the EEOC’s Data Governance Board. During her tenure, Essary also served as a federal data fellow on the U.S. government’s Federal Data Strategy Team, a cross-disciplinary group of data professionals tasked with developing the government-wide Federal Data Strategy.

Speaking on her appointment, Essary said, “I am honored to have the opportunity to serve in this new role. I look forward to building upon the agency’s work in recent years to modernize our data and analytics capabilities and to better serve the American people through data-driven decision-making.”

Agency Takes Step Toward Becoming ’21st Century Data and Analytics Organization’ with EEOC Explore

The launch of “EEOC Explore” is the U.S. Equal Employment Opportunity Commission’s (EEOC) latest step toward its vision to “build a 21st century data and analytics organization.”

In December 2018, the EEOC created the Office of Enterprise Data and Analytics (OEDA). Since its inception, OEDA has been working to modernize the availability of EEO data. At the December 2, 2020 launch of EEOC Explore, the agency demonstrated the features of the new tool, which advances that effort.

A data query and mapping tool, EEOC Explore aggregates publicly available EEO-1 data (currently limited to EEO-1 data sets from 2017 and 2018) into a series of interactive dashboards. The OEDA is expected to further develop and expand the size, scope, and functionality of EEOC Explore as its work continues.

EEOC Explore visualizes employment characteristics and information without identifying any confidential employer or employee information, protecting the EEO-1 data previously submitted by employers. The underlying EEO-1 data is presented to users through five primary dashboards:

(i) job categories by gender;

(ii) race/ethnicity by gender;

(iii) industry;

(iv) state trends by year; and

(v) comparison by state.

Once a user selects a dashboard, they can filter the applicable data by year, geography, sex, race/ethnicity, job category, NAICS code (2-digit), and NAICS code (3-digit) to view more granular data. Within each dashboard, users can view the data’s aggregated totals by category and use filters to break the information down by race, gender, and job category, or focus on one specific race, gender, and job category.

During the demonstration, OEDA compared employment in the finance industry between Silicon Valley and the Atlanta Metro area: starting with the Industry dashboard, it filtered by year (2018), the Silicon Valley and Atlanta Metro geographic areas, and the finance and insurance NAICS codes.
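
For readers who work with the public EEO-1 aggregates directly, a similar query could be reproduced in a few lines of pandas. The table layout and column names below are illustrative assumptions, not EEOC Explore’s actual schema; NAICS code 52 covers finance and insurance:

```python
import pandas as pd

# Illustrative EEO-1-style aggregate records. The column names are
# assumptions for this sketch, not EEOC Explore's actual schema.
eeo1 = pd.DataFrame([
    {"year": 2018, "region": "Silicon Valley", "naics2": "52", "job_category": "Professionals", "count": 1200},
    {"year": 2018, "region": "Atlanta Metro",  "naics2": "52", "job_category": "Professionals", "count": 950},
    {"year": 2018, "region": "Silicon Valley", "naics2": "52", "job_category": "Sales Workers", "count": 400},
    {"year": 2018, "region": "Atlanta Metro",  "naics2": "52", "job_category": "Sales Workers", "count": 630},
    {"year": 2017, "region": "Silicon Valley", "naics2": "52", "job_category": "Professionals", "count": 1100},
])

# Mirror the demonstration: 2018, two metro areas, NAICS 52 (finance and insurance).
subset = eeo1[
    (eeo1["year"] == 2018)
    & (eeo1["region"].isin(["Silicon Valley", "Atlanta Metro"]))
    & (eeo1["naics2"] == "52")
]

# Aggregate employment totals by job category and region, as a dashboard would.
print(subset.pivot_table(index="job_category", columns="region",
                         values="count", aggfunc="sum"))
```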

EEOC Explore is a useful, ready-made tool, but employers understandably may require context and more in-depth analysis to meet the legal and operational needs of their organization.

The Jackson Lewis Data Analytics Group is available to answer any questions about OEDA and the use and application(s) of EEO-1 information, and help make sense of the data you already hold.

Don’t Forget to Evaluate Selections in Reductions in Force During the COVID-19 Pandemic

The COVID-19 pandemic has affected the workplace in ways we could not have imagined just a few months ago. Indeed, the economic impact of COVID-19-related shutdowns has prompted many employers to reevaluate how they conduct business and how many employees they need. Consequently, many employers have been compelled to consider reductions in force. While the principles below are not limited to times of pandemic, they apply with particular force now.

Commonly, employers approach the exercise with a plan in mind: Which business lines are affected? How much payroll do we need to shed? What are the criteria for selection? Which employees will be selected for reduction? Who will make the selections? And more …. In addition, employers should consider the impact of their selections on particular demographic groups.

Considering Title VII of the Civil Rights Act of 1964, for example, employers should ask whether the rate at which females are selected for reduction is greater than the rate at which males are selected. How are minority employees faring compared to non-minority employees? And, considering the Age Discrimination in Employment Act, how do older employees compare to younger employees? Additionally, are the differences in rates statistically significant? A disparate impact analysis can help answer these questions and substantially mitigate risk for employers.
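
As a minimal sketch of what such an analysis looks like, the following compares two selection rates with a two-proportion z-test. The counts are hypothetical, and a real analysis should be designed and run under privilege with counsel:

```python
from math import sqrt
from statistics import NormalDist

# Hypothetical RIF selections; all counts are illustrative only.
selected_f, total_f = 30, 100   # 30% of female employees selected
selected_m, total_m = 45, 300   # 15% of male employees selected

rate_f = selected_f / total_f
rate_m = selected_m / total_m

# Two-proportion z-test for the difference in selection rates.
pooled = (selected_f + selected_m) / (total_f + total_m)
se = sqrt(pooled * (1 - pooled) * (1 / total_f + 1 / total_m))
z = (rate_f - rate_m) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-tailed

print(f"female rate {rate_f:.1%}, male rate {rate_m:.1%}")
print(f"z = {z:.2f}, p = {p_value:.4f}")
# A common rule of thumb treats roughly two standard deviations
# (|z| >= 1.96, p <= 0.05) as statistically significant.
```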

What if there is statistically significant disparate impact against a particular demographic? Confirm the criteria used to select the individuals for reduction are “consistent with business necessity” under Title VII. If a criterion does not meet the standard, revisit the selection(s).

Remember, intent need not come into the equation: these analyses examine the impact facially neutral selection criteria may have on certain demographic groups. Of course, protected characteristics generally should not be among the criteria used to select individuals for reduction. Be mindful of the importance and applicability of the attorney-client privilege when conducting these analyses, and consider retaining a law firm to help.

FTC’s Tips on Using Artificial Intelligence and Algorithms

Artificial intelligence (AI) technology that uses algorithms to assist in decision-making offers tremendous opportunity to make predictions and evaluate “big data.” On April 8, 2020, the Federal Trade Commission (FTC) provided reminders in its Tips and Advice blog post, Using Artificial Intelligence and Algorithms.

This is not the first time the FTC has focused on data analytics. In 2016, it issued a “Big Data” Report. See here.

AI technology may appear objective and unbiased, but the FTC warns of the potential for unfair or discriminatory outcomes and the perpetuation of existing socioeconomic disparities. For example, the FTC pointed out, a well-intentioned algorithm designed to support a beneficial decision may nonetheless disproportionately affect a particular minority group.

The FTC does not want consumers to be misled. It provided the following example: “If a company’s use of doppelgängers – whether a fake dating profile, phony follower, deepfakes, or an AI chatbot – misleads consumers, that company could face an FTC enforcement action.”

Businesses that obtain AI data from a third-party consumer reporting agency (CRA) and make decisions based on it have particular obligations under the federal Fair Credit Reporting Act (FCRA) and its state counterparts. Under the FCRA, a vendor that “assembles” consumer information to automate decision-making about eligibility for credit, employment, insurance, housing, or similar benefits and transactions may be a “consumer reporting agency.” An employer relying on automated decisions based on information from a third-party vendor is the user of that information. As the user, the business must provide consumers the “adverse action notice” required by the FCRA if it takes an adverse action against the consumer. The content of the notice must be appropriate to the adverse action and may consist of a copy of the “consumer report” containing AI information, the federal summary of rights, and other information.

The vendor that is the CRA must implement reasonable procedures to ensure the maximum possible accuracy of consumer reports and must provide consumers with access to their own information, along with the ability to correct any errors. The FTC expects transparency, including AI decision-making that can be well explained if the consumer asks.

Takeaways for Employers

  • Carefully review use of AI to ensure it does not result in discrimination. According to the FTC, for credit purposes, the use of a factor such as zip code in an algorithm could result in a disparate impact on a particular protected group.
  • Accuracy and integrity of data are key.
  • Validation of AI models is important to minimizing risk. Post-validation monitoring and periodic re-validation are important as well (a brief sketch of one such check follows the questions below).
  • Review whether federal and state FCRA laws apply.

Continue self-monitoring by asking:

  • How representative is your data set?
  • Does your data model account for biases?
  • How accurate are your predictions based on big data?
  • Does your reliance on big data raise ethical or fairness concerns?
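
To make these questions concrete, here is a minimal sketch of re-validating a model’s outcomes by demographic group; the data, column names, and thresholds are assumptions for illustration only:

```python
import pandas as pd

# Hypothetical validation set: the model's predictions vs. actual outcomes,
# broken out by demographic group. All values are illustrative.
audit = pd.DataFrame({
    "group":     ["A", "A", "A", "A", "B", "B", "B", "B"],
    "predicted": [1, 0, 1, 0, 1, 0, 0, 0],
    "actual":    [1, 0, 1, 1, 0, 0, 1, 0],
})

# Overall accuracy can hide poor accuracy for one group, so break it out.
audit["correct"] = audit["predicted"] == audit["actual"]
by_group = audit.groupby("group").agg(
    n=("correct", "size"),
    accuracy=("correct", "mean"),
    predicted_positive_rate=("predicted", "mean"),
)
print(by_group)
# Large gaps in accuracy or predicted-positive rates across groups are a
# signal to revisit how representative the underlying data set is.
```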

The FTC’s message: use AI, but proceed with accountability and integrity. Jackson Lewis’ Data Analytics Group is available to discuss questions about validation and data integrity.

Analytics at Large: COVID-19 Data vs. HR Data — What’s the Difference?

By: Samantha Rhoads & Michael Cortes[1]

The COVID-19 pandemic has put a spotlight on statistical terms often unfamiliar to anyone but professionals such as statisticians and data scientists. Terminology such as infection rates, “flattening the curve,” and related statistical concepts are now being used as slogans and hashtags. This post offers a brief explanation of data analytics and the related math for a better understanding of statistics and data.

Being inundated with information and government-provided mathematical models and projections makes it difficult for citizens to know which statistics to trust. Statistics helps make sense of data, but the accuracy of the insights is limited by the quality and accuracy of the data itself. From one data source, the calculated COVID-19 fatality rate may be 4.28%; from another, it may be 0.66%. The fundamental reasons for these different numbers (illustrated in the sketch below) are:

  1. The numbers are based on different datasets that were collected using different methods.
  2. The statistical models applied to the data are based on different assumptions.
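
As a worked sketch of the first reason, the same death count produces very different rates depending on which denominator a data set supports. The counts below are hypothetical, chosen only so the two rates match the figures quoted above:

```python
# Hypothetical counts; chosen to reproduce the 4.28% and 0.66% figures above.
deaths = 1_000
confirmed_cases = 23_364         # only people who were tested and confirmed
estimated_infections = 151_515   # model-based estimate including untested cases

# Same numerator, two defensible denominators, very different "fatality rates."
case_fatality_rate = deaths / confirmed_cases
infection_fatality_rate = deaths / estimated_infections

print(f"case fatality rate:      {case_fatality_rate:.2%}")      # ~4.28%
print(f"infection fatality rate: {infection_fatality_rate:.2%}")  # ~0.66%
```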

How do you know what to rely upon? This is where the human element comes into the picture. When given data and statistics, be skeptical. Apply common sense, trust your own thinking, and seek out research from other sources to get a more complete picture. Understand the assumptions being made and how the data was collected.

People can mentally process trends in the data to take precautions and understand how the world is changing. Looking directly at the numbers is vital to understanding COVID-19’s path, and applying good sense and judgment is critical.

While the data around COVID-19 is the critical issue now, the same data and statistical concepts apply to data sets in other areas. For example, the same critical thinking should be applied to workplace trends and data as is being applied to COVID-19 trends. Work life is more uncertain at this time, which makes defining patterns and understanding the nuances in the data critical.

Companies’ employee data should be harnessed and explained. If, for example, a company wants to understand the diversity of its workforce, it needs to be careful about how it computes those numbers. Pay special attention to how the data was collected and what the variables mean. It is easy to be led astray by computing the wrong numbers or interpreting the results incorrectly.
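
As a minimal sketch of how such a computation can go astray, consider two ways of measuring the same hypothetical workforce; the data and field names are invented for illustration:

```python
import pandas as pd

# Hypothetical HR extract: it mixes active employees, terminated employees,
# and contractors, and some self-identification fields are missing.
workforce = pd.DataFrame({
    "status": ["active", "active", "active", "terminated", "contractor", "active"],
    "race":   ["Black", "White", None, "Asian", "White", "Hispanic"],
})

# Naive computation: sweeps in terminated workers, contractors, and
# missing self-IDs (which pandas counts as != "White" here).
naive = (workforce["race"] != "White").mean()

# Careful computation: restrict to active employees with a known self-ID.
active = workforce[(workforce["status"] == "active") & workforce["race"].notna()]
careful = (active["race"] != "White").mean()

print(f"naive:   {naive:.1%}")
print(f"careful: {careful:.1%}")
```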

While data can be used to make inferences, there are caveats. Numbers “don’t lie,” but they may be taken out of context, and the assumptions behind the calculations and the data often are not fully understood. Keep the reliability of the data in mind and ensure the circumstances of a pattern are considered before presenting it as something meaningful.


[1] Samantha Rhoads and Michael Cortes are Data Scientists in the Firm’s Data Analytics Group.

“Predictive Hiring Tools” – Buyer Beware

As employers seek to make increasingly efficient and “better” hiring decisions, avoid biases, and increase workforce diversity, they are turning to, or considering, a growing range of technological tools. Essentially, these tools help employers efficiently identify qualified candidates, narrow the pool of job seekers, and predict who may be the “best” hire. As is often the case, however, technological advances may outpace our ability to understand and predict their legal implications. In particular, such tools may be facially neutral yet have an adverse impact on particular groups in violation of Title VII of the Civil Rights Act, Executive Order 11246, and other federal and state anti-discrimination laws.

In a recent article, Upturn – a non-profit organization that “promotes equity and justice in the design, governance, and use of digital technology” – evaluates the full range of “predictive hiring tools” with a view towards exploring “how predictive tools affect equity throughout the entire hiring process.”[1] The Upturn article ominously concludes that, “without active measures to mitigate them, bias will arise in predictive hiring tools by default.”

As with any type of “test” or selection tool, we agree employers should proactively understand how such tests and tools work, and what unintended consequences they may produce.

Upturn points out that vendors have developed predictive tools for every stage of the hiring process: from sourcing to screening to interviewing to selection, and beyond hiring to performance evaluations. To appreciate the associated risks, however, Upturn notes that employers should first understand how biases might be inherent in these tools – even those that vendors tout as removing bias from the hiring process.

When we think of bias in the employment context, we naturally think of “interpersonal bias” – the explicit or implicit (cognitive) bias an individual may have for or against characteristics of other individuals, including race, gender, color, and other legally protected, immutable characteristics. Importantly, however, even where predictive tools take steps to eliminate interpersonal bias, they may unknowingly create, perpetuate, or exacerbate existing institutional and/or systemic biases. As the Upturn authors suggest,

Without active measures to mitigate them, biases will arise in predictive hiring tools by default. But predictive tools could also be turned in the other direction, offering employers the opportunity to look inward and adjust their own past behavior and assumptions. This insight could also help inform data and design choices for digital hiring tools that ensure they promote diversity and equity goals, rather than detract from them. Armed with a deeper understanding of the forces that may have shaped prior hiring decisions, new technologies, coupled with affirmative techniques to break entrenched patterns, could make employers more effective allies in promoting equity at scale.

While the technology may be new, efforts to address adverse impact in employment selection decisions are not. Title VII deems any employment selection procedure that has an adverse impact on the hiring of any race, sex, or ethnic group to be illegally discriminatory, unless the procedure has been “validated” in accordance with the Uniform Guidelines on Employee Selection Procedures (UGESP) and no less-impactful alternative procedure exists. In general, a selection procedure or test is valid if it tests for job-seeker attributes that are highly correlated with success in a particular job. The term “selection procedure” includes any procedure used to narrow the pool of job seekers for hire – from eligibility questions to resume screens to interviews. Thus, most predictive hiring tools are subject to the UGESP.
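
The UGESP also supply the familiar “four-fifths” rule of thumb for flagging adverse impact. Below is a minimal sketch using hypothetical applicant-flow numbers; an actual validation study under the UGESP is far more involved:

```python
# Hypothetical applicant flow through a screening tool.
applied = {"Group A": 80, "Group B": 40}
hired = {"Group A": 48, "Group B": 12}

# Selection rate per group.
rates = {g: hired[g] / applied[g] for g in applied}
best = max(rates.values())

# Four-fifths rule of thumb: a selection rate below 80% of the highest
# group's rate is generally regarded as evidence of adverse impact.
for group, rate in rates.items():
    ratio = rate / best
    flag = "possible adverse impact" if ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rate:.0%}, impact ratio {ratio:.2f} -> {flag}")
```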

However, the Upturn article points out that validation of predictive hiring tools may not be enough to identify tools that in fact fall prey to institutional and systemic biases. Employers and vendors may also need a more in-depth understanding and analysis of a tool, and how it works for any given employer.

While employees, plaintiff attorneys, and the EEOC may not yet be fully active in this arena, they are aware of the issues. The EEOC began publicly addressing the implications of this issue in 2016. Moreover, as discussed in an earlier blog post, the EEOC recently launched an Office of Enterprise Data and Analytics with the capability of providing analytical support for systemic discrimination investigations.

Of course, regardless of the legal implications, a seemingly effective hiring tool that is, unbeknownst to the employer, creating or perpetuating unintentional biases in the company’s hiring process may prove counter-productive.

What steps should employers take now to protect themselves from legal risk? The Upturn article provides a set of specific, technical “Guiding Questions” the authors found themselves “needing to answer…before we could even begin to think about the equity implications of a given tool.”

Of course, the first question to ask is whether your company is using or considering the use of any predictive tools. If so, our Data Analytics Group may be able to provide assistance. The Data Analytics Group is a multi-disciplinary team of lawyers, statisticians, data scientists, and analysts with decades of experience managing the interplay of data analytics and the law. For more information, please contact your Jackson Lewis attorney or Eric Felsberg, the National Director of the JL Data Analytics Group.

[1] Miranda Bogen and Aaron Rieke, Help Wanted – An Examination of Hiring Algorithms, Equity and Bias (Upturn, December 2018).

EEOC Launches Office of Enterprise Data and Analytics

The Equal Employment Opportunity Commission (EEOC) has established the Office of Enterprise Data and Analytics (OEDA) to “provide [their] customers timely, accurate, and bias-free data and information to prevent and remedy unlawful employment discrimination, and improve organizational performance.” The EEOC’s Chief Data Officer, Samuel C. “Chris” Haffer, leads the OEDA.

This is an exciting and forward-looking development. Jackson Lewis’ Data Analytics Group was invited to EEOC’s headquarters in Washington, D.C. to attend a listening session about the new Office.

Employers have access to a wealth of data that, when paired with powerful analytical tools, can help them more effectively manage the workplace. In order to remain competitive, reduce workplace management costs, and mitigate workplace-related legal risk, among other things, it is critical for employers to leverage their own data in combination with external sources.

The launch of the OEDA should prompt employers that have been waiting to leverage data and analytics in managing their workplace to reconsider. Using data and analytics in the workplace is not a passing fad.

With a vision to “build a 21st century data and analytics organization,” the EEOC said OEDA’s principal goal is to “use state of the art data and information science tools and techniques to collect, utilize and share data and information, efficiently leveraging data to reduce burden and costs while protecting individual and employer privacy, and promoting program transparency.” This initiative builds on the EEOC’s October 2016 public meeting, “Big Data in the Workplace: Examining Implications for Equal Employment Opportunity Law,” by initiating a new and transformative force at the EEOC designed in part to serve valuable data and data products to the public.

OEDA comprises four divisions:

Business Operations and Organizational Performance Division – This Division will oversee business operations and, among other tasks, enhance the Office’s transparency and effectiveness. The Division will consist of a Business Operations Team and an Organizational Performance Team.

Data Development and Information Products Division – This Division’s tasks will include “develop[ing] information products” and being involved with data collection and survey methodology. It also will support EEOC charge-handling by linking EEOC charges with EEO-1 reports and providing analyses of EEOC charge data. The Division will consist of an Employer Data Team and an Information Products Team.

Information and Data Access Division – This Division is tasked with overseeing data governance and policy. It will provide research and information services in support of enforcement litigation efforts. The Division will consist of a Library and Information Services Team and a Data Policy and Access Team.

Data Analytics Division – This Division will provide analytical support for systemic investigations and analytics on various data “to identify geographic, industry and other drivers of discrimination charges and emerging trends.” The Division will consist of an Investigative Analytics Team and an Enterprise Analytics Team.

The Office will take steps to assist employers by making valuable data and data-related products available to them. It also will equip agency investigators and enforcement officials with new data and analytical resources.

While access to new collections of data and data products from the EEOC will be of great value, employers should note that these and additional resources will also be at the agency’s disposal and may bolster its enforcement efforts. Therefore, it is critical for employers to embrace the use of data to help analyze and manage the workplace and to better identify positive and negative trends.

The Jackson Lewis Data Analytics Group is available to answer any questions you may have about OEDA and help make sense of the data you already hold.