As AI mandates spread, employment attorneys warn a new wave of religious accommodation requests is coming.
An employee tells HR they refuse to use any AI software as part of their job. They’re a devout Christian, they say, and believe that using generative AI for creative or analytical work violates their faith; humans, they argue, are made in God’s image to be the sole creators and delegating that to a machine is a form of idolatry.
Some HR professionals might smirk. A few might push back. But employment attorneys say that reaction, however instinctive, could cost an employer hundreds of thousands of dollars.
James Paul, a shareholder at Ogletree Deakins, has been advising employers on employment law for nearly 20 years. For most of that time, a religious accommodation request was a rarity.
“Usually, it’d be maybe one issue every month, or maybe one every two months,” Paul said. That changed after the pandemic, and now AI is accelerating the trend further.
“Now I’m seeing three or four a week. And I’ve had three situations in the last two months where employees have articulated objections to the use of certain technologies in the workplace.”
David Miklas, a Florida-based employment attorney with 27 years of experience advising HR leaders, sees the same shift coming.
“There are a lot of religious people out there and sometimes their beliefs aren’t traditional, and that doesn’t give them any less protection,” he says.
For employers pressing ahead with AI mandates, that combination of rising volume and broad legal protection is one they can no longer afford to ignore.
The belief doesn't have to be mainstream
HR leaders may be tempted to dismiss religious objections to AI as theologically far-fetched, but neither attorney recommends it. While no major religion has issued a blanket prohibition on AI use, what any organized faith says about the technology is largely irrelevant under the law.
Under Title VII of the Civil Rights Act of 1964, an employee’s religious belief does not need to be endorsed by any organized faith tradition to be legally protected. It simply has to be sincerely held.
“Any individual can take a religious foundation and add their personal spin to it and have a sincerely held religious belief that slightly modifies whatever the official organized religion says,” Miklas says. Courts and regulators take that seriously, and employers who don’t are taking on real risk.
Paul points to the same legal reality. "There is no organized religion that has banned AI or even said that it's bad in and of itself. But that doesn't stop any individual employee from saying, yes, but I take it to a higher level,” he said.
Beyond the idolatry argument, Miklas has identified several other grounds on which workers might legitimately object to using AI. An employee might refuse to work on projects that incorporate AI-generated content on the basis that it violates their faith’s standards around truth and integrity. Others may object to AI taking over tasks they believe require human judgment and soul, viewing automation as a devaluation of human life. Each of these, if sincerely held, could trigger an employer’s legal obligations.
The first mistake employers make
According to Miklas, a Title VII religious accommodation claim rests on three elements: the employee holds a sincerely held religious belief that conflicts with a workplace requirement; they notify their employer of that conflict; and the employer fails to accommodate them.
“The first element is never where the business wants to spend its time and energy fighting,” he said. “Whether an employee has a sincerely held religious belief is almost always a losing argument for the business. It’s safer to just assume that what they’re saying is adequate and then focus on what the business can do to try to accommodate that request.”
He points to the landmark EEOC v. Consol Energy case as a cautionary tale about misplaced effort. In that case, an evangelical Christian coal miner was awarded nearly $600,000 after his employer refused to offer an alternative to a biometric hand scanner he believed was linked to the “Mark of the Beast.” The employer’s mistake, Miklas argues, wasn’t the technology decision; it was the response.
“They wasted all their time and energy trying to quote scripture,” he said. “The energy should have been focused on: before we implemented the scanner, how did people clock in and out? This guy asked, ‘Can I just do that again?’ And that would have been a very easy accommodation. They did it for 20, 30, 40 years beforehand, just fine.”
That logic translates directly to AI. For HR leaders navigating accommodation requests tied to new technology, Miklas recommends a straightforward starting point: look back.
“More than likely, any job that exists now was being done two or three years ago without AI. So, the reasonable accommodation would be to go back and figure out what we did back then,” he said.
How to handle it, and how not to
Once a religious objection to AI is raised, employers are legally obligated to engage in an interactive process; the same back-and-forth framework that applies under the ADA and the Pregnant Workers Fairness Act applies here too.
That process may involve exempting the employee from AI-related tasks, reassigning minor functions to a colleague, or transferring them to a role where the conflict doesn’t arise.
Paul recommends that HR professionals take the request seriously, have the employee articulate the specific nature of their belief, consult managers to understand whether AI use is truly essential to the role, then evaluate what accommodation is feasible.
“HR oftentimes doesn’t understand exactly what is being required and why,” Paul said. “That step is critical. HR needs to educate themselves, talk to the managers and supervisors involved, and figure out what exactly is required and how much of a problem it would actually be.”
Adding urgency to the analysis is the Supreme Court’s 2023 ruling in Groff v. DeJoy, which significantly raised the bar for employers seeking to deny an accommodation. Before that decision, a de minimis cost, even as little as $30 of overtime, could justify a refusal. Now, an employer must show that granting the accommodation would result in substantial increased costs in relation to the conduct of its particular business.
“You usually have to pull out your calculator,” Miklas said. The size of the business matters too; what constitutes an undue hardship for a 16-person business may look very different for a company with 500 employees.
There is, however, a twist that HR professionals managing AI tools in their organizations may not see coming. Miklas warns that using AI to brainstorm accommodation options could itself create legal exposure.
“I would be leery to tell an HR person to use AI unless they’re going to adopt the answers, because it could be discoverable,” he said. “Imagine going in front of a jury and that AI prompt is put up on the screen, with AI coming up with five possible accommodations, and the employer offered none of them. That would be devastating.”
The first big case is coming
The first significant court ruling on AI-specific religious accommodation hasn’t arrived yet, but both attorneys agree it’s a matter of when, not if.
“We’re still waiting for an actual case or decision,” Paul said, “but there will be one, because these objections and requests are coming in at a fairly consistent and increasing pace.”
The time to prepare is now, both attorneys say, well before the first request lands on your desk. That means auditing accommodation policies, ensuring intake forms exist, training managers not to dismiss unusual beliefs out of hand, and documenting every step of the process consistently.
The good news, Miklas says, is that solutions are usually there if you look for them.
“My experience in 27 years doing this is that usually, if you brainstorm, you can come up with multiple possible reasonable accommodations. The focus should always be on how can we accommodate them, not how can we try not to,” he said.