As discussed in our previous articles, New York City’s Local Law 144 was set to go into effect on January 1, 2023. However, agency enforcement is now postponed until April 15, 2023.

Responding to increased public and governmental scrutiny of the use of artificial intelligence, machine learning, and other technologies, New York City enacted Local Law 144.

  • Local Law 144 regulates NYC employers’ use of automated employment decision tools (AEDT) in hiring and employment-related processes, and is intended to protect employees and job candidates from the potential biases and discriminatory effects that may result from the use of such tools.

Given the broad scope of the law’s terms and the lack of clarity as to its various requirements, preparing to comply with the law by its original effective date of January 1, 2023, has been a leading concern of employers since the law was enacted in 2021.

After several months of waiting for regulatory guidance on this issue, on September 23, 2022, the New York City Department of Consumer and Worker Protection (DCWP) published its proposed rules for the interpretation and application of Local Law 144. A public hearing for comments on the proposed rules was initially scheduled for October 24, 2022.

Public interest in the hearing was far greater than the DCWP anticipated, and it was rescheduled for November 4, 2022. The virtual hearing drew over 270 attendees and included live oral comments from more than a dozen participants.

Due to the high volume of comments requesting clarification and guidance regarding the proposed regulations, especially those concerning key compliance issues, on December 12, 2022, the DCWP announced that it would hold a second public hearing on January 23, 2023, and that it was postponing enforcement of the law until April 15, 2023.

This is welcome news for employers and employment agencies that hire or promote candidates within New York City with the assistance of AEDT, which, based on the law’s definitions, arguably extends well beyond what many would consider an “automated” technology, underscoring the importance of clearer regulatory guidance.

Jackson Lewis attorneys are closely monitoring updates and changes to the effective date and guidance and are actively assisting employers with navigating the use of AEDT and how to comply with the forthcoming law.

On April 6, 2023, the New York City Department of Consumer and Worker Protection (“Department”) issued its Final Rules regarding automated employment decision tools (“AEDT”). As previously reported, New York City’s AEDT law, Local Law 144 of 2021, prohibits employers and employment agencies from using AEDT unless:

  • The tool has been subjected to a bias audit within a year of the tool being used or implemented;
  • Information about the bias audit is made publicly available; and,
  • Certain written notices have been provided to employees or job candidates.

Following the multiple public hearings and comments on the Department’s proposed rules from September and December of 2022, the newly issued Final Rules now govern the Department’s interpretation and enforcement of the AEDT law. Changes from the December 2022 proposed rules that are now present in the Final Rules include:

  • Expansion of the scope of the AEDT law’s definition of “machine learning, statistical modeling, data analytics, or artificial intelligence”;
  • A requirement that the bias audit indicate the number of individuals assessed by the AEDT who were not included in the audit’s calculations because they fall within an “unknown category” (and that this number be listed in the bias audit results summary);
  • Permitting independent auditors to exclude a category that comprises less than 2% of the data being used for the bias audit’s impact ratio calculations;
  • Examples of a bias audit;
  • Guidance on when an employer or employment agency may rely on a bias audit conducted using historical data or test data, including data from other employers or employment agencies; and,
  • Additional language regarding the need to include, if applicable, the number of applicants in a category and the scoring rate of a category within the bias audit summary results (a simplified sketch of these calculations follows this list).
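
Below is a minimal sketch of how the selection rates and impact ratios referenced above might be computed, assuming a simple tabular dataset; the category labels, sample data, and column names are hypothetical, and an actual bias audit must be performed by an independent auditor in accordance with the Final Rules.

    # Illustrative impact ratio calculation for a selection-based bias audit.
    # The data and category labels are hypothetical; this is not a substitute
    # for an independent auditor's analysis under the Final Rules.
    import pandas as pd

    # One row per assessed individual: demographic category and whether the
    # AEDT selected the individual to advance.
    data = pd.DataFrame({
        "category": ["A", "A", "B", "B", "B", "C", "unknown"],
        "selected": [1, 0, 1, 1, 0, 0, 1],
    })

    # Individuals in an "unknown category" are excluded from the calculations,
    # but their number must be listed in the bias audit results summary.
    unknown_count = (data["category"] == "unknown").sum()
    known = data[data["category"] != "unknown"]

    counts = known.groupby("category")["selected"].agg(["count", "sum"])

    # An independent auditor may exclude a category comprising less than 2%
    # of the data used for the impact ratio calculations.
    counts = counts[counts["count"] / counts["count"].sum() >= 0.02]

    # Selection rate per category, and each category's impact ratio relative
    # to the most-selected category.
    selection_rate = counts["sum"] / counts["count"]
    impact_ratio = selection_rate / selection_rate.max()

    print(f"Individuals in unknown category: {unknown_count}")
    print(impact_ratio.round(2))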

Now that the Final Rules have been issued, employers and employment agencies have the regulatory roadmap necessary to begin complying with the AEDT law when enforcement begins on July 5, 2023.

Our Data Analytics and Privacy, Data, and Cybersecurity practices will continue to monitor regulatory and enforcement updates and are actively assisting employers and employment agencies with their compliance with the AEDT law and its Final Rules. If you or your organization have any questions or would like assistance navigating the AEDT law or these Final Rules, please contact any member of our Data Analytics or Privacy, Data, and Cybersecurity practice groups.

In 2021, New York City enacted a measure banning the use of Automated Employment Decision-Making Tools (“AEDT”) to (1) screen job candidates for employment or (2) evaluate current employees for promotion, unless the tool has been subject to a “bias audit, conducted not more than one year prior to the use of the tool.” The law also requires certain notifications regarding the use of AEDTs to be made to job seekers. The measure, known as Local Law 144 of 2021, was set to take effect on January 1, 2023.

In September 2022, the NYC Department of Consumer and Worker Protection (DCWP) issued guidance about the new ordinance and announced it was hosting an initial public hearing. Following the hearing, the DCWP announced the law would not be enforced until April 15, 2023, due to the large number of public comments it received.

Read the full article at Jackson Lewis’ Workplace Privacy, Data Management & Security Report

Continuing its initiative regarding the use of data, automated processes, and artificial intelligence (“AI”), the U.S. Equal Employment Opportunity Commission (“EEOC”) is holding a hearing on January 31, 2023, to examine the use of automated systems and AI in employment decisions.

The in-person hearing will begin at 10:00 a.m. EST on January 31 at the EEOC’s headquarters in Washington, D.C., and will be livestreamed. A telephone listening option is also available.

The hearing includes a panel discussion on the civil rights implications of AI and other automated systems for U.S. employees and job candidates. Additionally, the hearing will explore ways in which these technologies might further the interests of diversity, equity, inclusion, and accessibility.

People interested in learning more or wishing to register for the hearing can do so via this link.

Continuing its efforts toward becoming a 21st-century data analytics agency, during the last week of October 2021, U.S. Equal Employment Opportunity Commission (EEOC) Chair Charlotte A. Burrows announced a new EEOC initiative on artificial intelligence and algorithmic fairness.

This new initiative is aimed at ensuring that artificial intelligence and other emerging tools and technologies used by employers in hiring and a multitude of other employment-related decisions comply with federal civil rights laws.

By closely examining how these technologies are developed and utilized by employers, the initiative may lead to agency guidance for applicants, employees, employers, and technology developers and vendors alike, in furtherance of fairness, consistency with federal equal employment opportunity laws and regulations, and the prevention and elimination of biases arising from the use of algorithms and artificial intelligence.

Specifically, with this new initiative the EEOC anticipates:

  • Creating a new internal working group responsible for coordinating the agency’s work on the initiative;
  • Gathering information regarding the development, adaptation, and impact of these technologies;
  • Conducting a series of informational listening sessions with key stakeholders about these emerging technologies and their actual/potential implications on matters of employment;
  • Identifying practices and methodologies that seem likely to satisfy the EEOC’s objectives and the requirements of federal employment laws within its enforcement jurisdiction; and ultimately,
  • Providing technical assistance and guidance on using algorithms and artificial intelligence.

As a firm, Jackson Lewis and its Data Analytics Group strive to remain at the forefront of these technologies and their regulation and enforcement by state and federal agencies, and will be sure to follow this new initiative and the EEOC’s ongoing work regarding the use and implementation of data-driven applications in employment.

In contemporary litigation, “machine learning” and “predictive analytics” are phrases that are typically used in the context of e-discovery. However, as these technologies grow and evolve, so too will their application and utility in employment decisions and legal proceedings. At Jackson Lewis, we are committed to remaining at the forefront of these technologies and their potential uses both inside and outside of employment litigation.

While the majority of law firms and lawyers may use predictive coding or machine learning features in some e-discovery platforms, our attorneys and skilled team of data scientists and coders are consistently expanding our already robust data and predictive analytics resources. We routinely use these technologies to assist clients with analyzing potential damages, valuing potential liability or jury verdicts by analyzing similar claims, developing litigation and negotiation strategies, and evaluating probabilities of success throughout all stages of litigation, arbitration, and other administrative investigations and proceedings. We also use these and other technologies, like artificial intelligence, to develop proprietary applications and algorithms.

Predictive analytics, machine learning, and AI raise a number of privacy and ethics concerns in society, but when utilized properly, they can prove an invaluable asset to our clients both inside and outside the context of litigation. Indeed, employers are rapidly deploying these technologies across the employment spectrum: identifying potential job candidates, conducting initial applicant screenings, tracking working time and attendance, identifying potential promotion candidates, and restructuring workforces.

Employers should embrace, not fear, these technologies, especially given their trajectory toward becoming essential to business in the modern era. However, as with any tool or asset, employers should be cognizant of how AI, machine learning, and predictive analytics are designed, deployed, and monitored, to avoid unintended biases and ensure compliance with applicable law. Taking a proactive and preventive approach to machine learning and predictive analytics is particularly important for developing defensible positions should their use be subjected to scrutiny in litigation or otherwise.

Our attorneys and data scientists are experienced in helping clients navigate the legal and ethical waters of these technologies; they regularly aid in identifying and remedying potential compliance issues and unintended biases and, when necessary, in defending the use of these technologies. We also utilize predictive analytics, in an ethical and legally compliant manner, in defending against a variety of employment claims.

Notwithstanding federal, state, and local privacy and cybersecurity laws that may apply, employers may generally use artificial intelligence, data analytics, and other software and technologies to track remote workers.

The COVID-19 pandemic has led, if not forced, the vast majority of businesses to adopt remote work and virtual workplaces out of operational necessity and to promote the health and safety of their employees and clientele. With these shifts, employers have expeditiously looked to new and alternative means of tracking employee performance, monitoring productivity, and ensuring the security of their networks and computer systems. Artificial intelligence, machine learning, and data analytics are useful tools that employers can, and should, utilize to accomplish these and other employment-related objectives.

To illustrate, a standard timekeeping application can serve as a virtual substitute for a physical timeclock. However, other methods for tracking employee performance, activity, and working time can and should be considered, if for no other reason than to verify a remote employee’s time report. Employers that use a virtual private network (“VPN”) or remote/cloud computing can cross-reference a remote worker’s network connection and activity data to confirm the start and stop times reported on a separate timeclock application. Artificial intelligence and machine learning can be deployed to passively acquire login and other activity data that helps employers track employee work activity and identify patterns, abnormalities, and the like on an enterprise-wide basis.
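
As a rough illustration of the cross-referencing described above, the following sketch compares each remote worker’s reported clock-in and clock-out times against that day’s first VPN login and last logout; the file names, column names, and 15-minute tolerance are assumptions for this example, not features of any particular product.

    # Cross-reference reported timeclock entries against VPN session logs.
    # File names, column names, and the tolerance are hypothetical.
    import pandas as pd

    TOLERANCE = pd.Timedelta(minutes=15)

    timeclock = pd.read_csv("timeclock.csv", parse_dates=["clock_in", "clock_out"])
    vpn = pd.read_csv("vpn_sessions.csv", parse_dates=["login", "logout"])

    # Earliest login and latest logout per employee per day.
    vpn["date"] = vpn["login"].dt.date
    sessions = (
        vpn.groupby(["employee_id", "date"])
           .agg(first_login=("login", "min"), last_logout=("logout", "max"))
           .reset_index()
    )

    # Pair each timeclock entry with that day's VPN activity.
    timeclock["date"] = timeclock["clock_in"].dt.date
    merged = timeclock.merge(sessions, on=["employee_id", "date"], how="left")

    # Flag days where reported times diverge from network activity by more
    # than the tolerance; flagged entries warrant human review, not
    # automatic action.
    merged["flag"] = (
        (merged["clock_in"] - merged["first_login"]).abs() > TOLERANCE
    ) | ((merged["clock_out"] - merged["last_logout"]).abs() > TOLERANCE)

    print(merged.loc[merged["flag"], ["employee_id", "date", "clock_in",
                                      "first_login", "clock_out", "last_logout"]])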

Employers and their IT professionals should take a holistic approach when evaluating available technologies and data sources to identify the systems, data, and metrics that are most appropriate, and effective, for accomplishing their needs. By properly designing and monitoring these technologies, employers can minimize the potential liability stemming from a myriad of issues associated with remote work and ensure compliance with company policies, for example, liability arising from irregular work schedules, overtime issues, and violations of work- and leave-related restrictions, including prohibitions on non-exempt employees sending or reading work-related emails after clocking out or outside working hours.
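
On the last point, a similarly hedged sketch, again assuming hypothetical file and column names, shows how after-hours email activity by non-exempt employees might be surfaced for review:

    # Flag work email events occurring outside an employee's recorded shift.
    # Data sources and column names are hypothetical.
    import pandas as pd

    emails = pd.read_csv("email_log.csv", parse_dates=["timestamp"])
    shifts = pd.read_csv("timeclock.csv", parse_dates=["clock_in", "clock_out"])

    # Join each email event to the employee's recorded shift for that day.
    emails["date"] = emails["timestamp"].dt.date
    shifts["date"] = shifts["clock_in"].dt.date
    merged = emails.merge(shifts, on=["employee_id", "date"], how="left")

    # An event before clock-in or after clock-out may indicate uncompensated
    # work time; days with no recorded shift are not flagged here.
    merged["after_hours"] = (merged["timestamp"] < merged["clock_in"]) | (
        merged["timestamp"] > merged["clock_out"]
    )

    print(merged.loc[merged["after_hours"], ["employee_id", "timestamp"]])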

Of course, businesses should also consult with legal counsel to ensure that their employee monitoring and tracking practices comply with applicable law and do not result in unintended biases that could cause a disparate impact on protected classifications.

On February 1, 2021, the Equal Employment Opportunity Commission (EEOC) launched a new software system to receive and process Freedom of Information Act (FOIA) requests and objections. FOIA gives individuals the right to request access to federal government records.

Individuals looking to make FOIA requests can now initiate them through the new software system. Those who submitted requests prior to February 1 will be able to check the status of their requests via the old Public Access Link until March 12, 2021, but will then need to access all requests using the new software system.

While the new software system will make it easier to submit FOIA requests and track the status of such requests, the EEOC will still accept FOIA requests and objections via mail, email, and fax.  Once requests are received via mail, email, or fax, the EEOC will enter the requests into the new software system for processing and monitoring.

The EEOC FOIA software system can be accessed here.

On January 21, 2021, the Office of Federal Contract Compliance Programs (OFCCP) announced via its leadership team webpage that Jenny Yang, former EEOC Chair during the Obama administration, had been selected as OFCCP Director.

During her time with the EEOC, Yang was a supporter of the Component 2 section of the EEO-1 reports, which required covered employers to submit pay and hours-worked data on their employees for 2017 and 2018. In addition to advocating for the government collection of pay data, Yang also worked toward expanding protections under federal anti-discrimination law for the LGBTQ+ community and established a Select Task Force on the Study of Harassment in the Workplace.

Based on her work at the EEOC, we can expect Director Yang to prioritize pay equity and sexual orientation and gender identity rights. Yang takes over as OFCCP Director from Craig Leen, who implemented the OFCCP’s four “pillars” of transparency, certainty, efficiency, and recognition, allowing federal contractors to rely on them in interactions with the agency. It will be interesting to see whether Director Yang continues using the four pillars during her time at the OFCCP.

On January 19, 2021, the U.S. Equal Employment Opportunity Commission (EEOC) announced that Kimberly S.L. Essary has been appointed as Deputy Chief Data Officer within the agency’s Office of Enterprise Data and Analytics (OEDA).

Essary’s appointment expands OEDA’s existing executive leadership, further illustrating the EEOC and OEDA’s mission to build a 21st century data and analytics organization. In this senior executive role, Essary is expected to manage the full range of program activities within OEDA.

Essary first joined the EEOC as a career attorney-advisor in 2009, and her appointment is a testament to her years of experience serving in high-level roles at the EEOC and other related governmental departments. She has served as OEDA’s Deputy Director and Senior Counsel since 2018 and played an integral part in the office’s creation and development, as well as the creation of the EEOC’s Data Governance Board. During her tenure, Essary also has served as a federal data fellow on the U.S. government’s Federal Data Strategy Team, a group of cross-disciplinary data professionals tasked with developing the government-wide Federal Data Strategy.

Speaking on her appointment, Essary said, “I am honored to have the opportunity to serve in this new role. I look forward to building upon the agency’s work in recent years to modernize our data and analytics capabilities and to better serve the American people through data-driven decision-making.”