AI and leaves administration: Risks and rewards

Published: September 24, 2024 | by Melissa Stein, Legal Editor at Brightmine

Given the challenges of juggling federal, state and local leave laws, leaves administration is ripe for AI support. While employers know that AI is changing the way we work, not everyone has gotten on board with using AI to improve efficiency in HR processes like leaves administration. Gartner reports that “38% of HR leaders have explored or implemented AI solutions” to improve organizational efficiency.

Whether your HR team is begging for AI support for leaves administration or you already have an AI system in place, it is essential to understand the risks and rewards involved.

Leverage AI to save time and money

AI can support employers with leaves administration in many ways, including reducing bias, inconsistencies and mistakes. AI doesn’t hold a grudge against a particular employee, follow one rule for one employee’s request but not another’s, or make a fluke mistake because it has had a long day, any of which could cost the company significant civil penalties. Instead, every decision it makes is based on the same set of rules embedded via machine learning.

Plus, relying on AI in the leaves administration process gives HR teams their time back to focus on tasks that require human analysis and decision-making. As HR teams juggle an increasingly complex patchwork of state and local leave laws, smart companies are leaping at the chance to lighten the load of leaves administration with AI.

Know AI can be risky

At the same time, relying too heavily on AI for leaves administration puts an organization at risk.

If an organization either (1) uses AI for leave process steps that call for subjective judgment or (2) does not appropriately oversee the AI function, it risks civil penalties and harm to its hard-earned reputation.

Government agencies are keeping an eye out for organizations that make these mistakes. The Department of Labor (DOL) issued a Field Assistance Bulletin warning employers of the risks of using AI to administer leave under the federal Family and Medical Leave Act (FMLA). If an employer does not implement appropriate guardrails and oversight when administering the FMLA, the DOL says, the employer will be held responsible for any issues caused by the automated system.

HR professionals know that the FMLA process can be grueling. The lengthy federal law has nuanced requirements at every step (employee request, leave period, employee’s return). Surely AI can make this process more efficient for HR teams. But steps that require subjective decision-making or instruct an employer to act “reasonably” must be based on human analysis, not black-and-white rules.

For example, it may seem simple enough to use AI to determine employee eligibility for FMLA leave, as the law specifically states the eligibility requirements. But the law’s requirement that employers act “reasonably” in gathering this information is a gray area that does not translate into black-and-white rules. An employer may reasonably ask the employee to clarify or supplement the information initially provided, but only to the extent needed to determine eligibility.

Without oversight, AI could request information beyond what is “reasonable,” violating a critical aspect of the employer’s legal obligations. Because the FMLA allows an employer to deny leave if an employee does not cooperate with the employer’s reasonable eligibility inquiries, AI could unlawfully deny an employee’s right to FMLA leave if it mistakenly concludes the information is insufficient or finds an employee is uncooperative, without considering the surrounding circumstances.
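To make that guardrail concrete, here is a minimal Python sketch, assuming a hypothetical rules-based screen (all class, function and status names are illustrative, not part of any real product). It applies only the FMLA’s bright-line eligibility criteria (12 months of service, 1,250 hours worked in the preceding 12 months, and 50 employees within 75 miles of the worksite) and routes anything incomplete or borderline to a human reviewer instead of issuing a denial on its own.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class LeaveRequest:
    """Hypothetical leave request record; field names are illustrative only."""
    months_of_service: Optional[float]          # None = information missing
    hours_worked_last_12_months: Optional[int]
    employees_within_75_miles: Optional[int]


def screen_fmla_eligibility(request: LeaveRequest) -> str:
    """Apply only the bright-line FMLA eligibility criteria.

    Anything incomplete or borderline is routed to a person; the
    automated step never denies leave on its own.
    """
    criteria = (
        request.months_of_service,
        request.hours_worked_last_12_months,
        request.employees_within_75_miles,
    )
    # Missing data is not treated as non-cooperation. A person decides
    # whether, and how, to reasonably request more information.
    if any(value is None for value in criteria):
        return "NEEDS_HUMAN_REVIEW"

    if (
        request.months_of_service >= 12
        and request.hours_worked_last_12_months >= 1250
        and request.employees_within_75_miles >= 50
    ):
        return "LIKELY_ELIGIBLE"

    # A shortfall is flagged, not auto-denied, so a person can weigh the
    # surrounding circumstances before any adverse decision is made.
    return "NEEDS_HUMAN_REVIEW"
```

The design choice matters more than the details: the automated step can only answer “likely eligible” or “a person needs to look at this.” Denying leave is never delegated to the system.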

Similar concerns arise when it comes to medical certifications for FMLA leave based on a serious health condition. An AI system could unlawfully request detailed medical information beyond what is necessary or hold an employee to unreasonable time limits for returning the certification. Plus, an employer must maintain privacy guardrails that keep FMLA documentation confidential, whether or not AI is involved.
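As a sketch of a timing guardrail in the same hypothetical system, the snippet below encodes the FMLA regulations’ 15-calendar-day floor for returning a medical certification and refuses to treat a late or missing certification as an automatic denial, since whether the employee made diligent, good-faith efforts is a judgment call for a person. The function and flag names are, again, illustrative.

```python
from datetime import date, timedelta
from typing import Optional

# The FMLA regulations generally give employees at least 15 calendar days
# to return a requested medical certification; this constant is that floor.
MIN_CERTIFICATION_WINDOW_DAYS = 15


def certification_due_date(request_date: date, configured_window_days: int) -> date:
    """Return a due date that never undercuts the 15-day regulatory minimum.

    `configured_window_days` is a hypothetical system setting; if an
    administrator configures a shorter window, the regulatory floor wins.
    """
    window = max(configured_window_days, MIN_CERTIFICATION_WINDOW_DAYS)
    return request_date + timedelta(days=window)


def handle_missed_certification_deadline(explanation: Optional[str]) -> str:
    """Never auto-deny because a certification came back late.

    The regulations excuse delays when the employee made diligent,
    good-faith efforts, which only a person can evaluate.
    """
    if explanation:
        return "NEEDS_HUMAN_REVIEW"       # a person weighs the explanation
    return "SEND_REMINDER_THEN_ESCALATE"  # still no automated denial
```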

Follow the rules to reap the rewards

HR professionals know that leave processes, particularly under the FMLA and state and local family and medical leave laws, carry risk at every step. Handing AI responsibility for decisions that require reasonable judgment may create more risk than the time-saving tool is worth.

Simply put, employers that use AI for any part of leaves administration must still follow the law. With carefully implemented guardrails and human oversight, AI can be the timesaver leaves administrators dream about.

Still unsure how to use AI in leaves administration to your advantage, not your detriment?
