EU AI Act

Many U.S. companies assume the EU AI Act doesn't apply to them. That assumption creates risk: under the Act, where your company is located doesn't matter. What matters is which markets your products and services reach and whether your organization uses AI systems.

Selling Into the EU?  

If you use AI tools in the production of services or products that enter the European Economic Area (EEA) marketplace, the EU AI Act may apply, even if you're based in the U.S. It all depends on how you use those tools and which risk category the Act assigns to them.

AI systems are now embedded in everyday HR workflows, from screening resumes and supporting hiring decisions to shaping performance evaluations and workforce analytics. Even general-purpose AI tools used to draft feedback or employment-related content can fall into scope. The Act may classify these uses as limited-risk or high-risk, each of which carries specific compliance obligations.

The Deadline to Know 

Most obligations under the Act are already in force. 

The next major enforcement milestone arrives on August 2, 2026, when compliance obligations for high-risk AI systems, including those involved in certain HR decision-making, enter into force.

There are many compliance obligations under the Act, but for HR, two stand out:

  • Employees who use AI tools must be “AI literate” 
  • High-risk AI systems must have meaningful human oversight 

For HR, that translates into something more familiar: training employees to ensure they understand expectations, applying clear guardrails, and being able to demonstrate that both are in place. 

Many organizations underestimate how long it takes to identify AI use, train employees, and document compliance until they try to do it. 

Why HR Is on the Hook 

When regulators evaluate AI use, they look at whether your people are prepared to use AI tools responsibly. 

That shows up in very practical ways: what employees were trained on, whether expectations were clear, how decisions involving AI were reviewed, and what proof exists if questions arise later. 

The law specifically emphasizes that employees must understand how AI should and should not be used, and that higher-risk decisions require appropriate human oversight.

What “AI Literacy” Actually Means 

Most organizations don’t lack awareness of AI risk. What they lack is clarity on execution. 

Who needs training?  
What should it include?  
How do you tailor it across roles?  
How do you document it in a way that holds up under scrutiny? 

These are operational questions, and they sit squarely with HR and compliance. 

AI literacy doesn’t mean turning employees into technical experts. It means helping them recognize when AI is influencing a decision, understanding where risk or bias can enter the process, and knowing when human judgment needs to take over. 

At its core, it’s about giving people the confidence to make better decisions, not just follow a system. 

Why Waiting Creates Risk 

August 2026 may feel far away, but preparing for it isn’t a quick exercise. 

Organizations need time to identify where AI is being used, define what qualifies as high risk, train different audiences, establish oversight practices, and create documentation that demonstrates compliance. 

And the stakes are significant. The EU AI Act allows for fines of up to €15 million or 3% of global annual revenue, whichever is higher, for violations of high-risk obligations, along with the broader risks of regulatory scrutiny and reputational harm.

For HR and compliance leaders, the responsibility is familiar. Organizations that act now to ensure employees can make sound AI decisions will be far better positioned when those expectations become enforceable. 

About the Author

John Brushwood serves as Compliance Counsel at Traliant, where he oversees regulations, solutions, and topics related to data privacy, cybersecurity, and AI governance. He is a graduate of St. Petersburg College and George Washington University Law School and has worked at various law firms, including Griffin & Griffin in Washington, DC.
