On 12 February 2026, the NSW Parliament passed Australia’s first AI-specific workplace safety law. The Digital Work Systems Act amends the Work Health and Safety Act 2011 (NSW) to require employers to consider how digital systems, including AI monitoring, automated scheduling, and algorithmic management, may create psychosocial hazards for workers.
Moore Australia published an analysis within days. Gadens included it in their year-in-review. Ius Laboris provided an international perspective. The consensus: this is a significant shift, and most SMEs have no idea it exists.
If you employ people in NSW, even remotely, this law applies to you. Here is what it requires and what you need to do.
- 12 February 2026: date the Act was passed by NSW Parliament
- First AI-specific workplace safety law in Australia
- NSW employers using digital work systems are covered
A “digital work system” under the Act includes any technology-based system used to assign, monitor, or evaluate work. That covers AI scheduling tools (like Deputy or Humanforce with AI features), automated performance tracking systems, algorithmic task allocation platforms, digital surveillance tools, and any automated decision-making system that affects how workers do their jobs.
Standard office software like email, word processing, and basic spreadsheets is generally not covered unless it is being used primarily for monitoring or evaluation purposes. But the moment you add an AI layer to any of these tools, the Act likely applies.
The Act’s central requirement is that employers must consider how digital work systems may create or contribute to psychosocial hazards. This is the same concept already embedded in WHS regulations, but now explicitly extended to AI and digital systems.
Psychosocial hazards from digital work systems include:
- excessive monitoring that creates anxiety and stress
- unreasonable pace driven by algorithmic targets
- lack of control when AI makes decisions about scheduling and task allocation
- poor role clarity when automated systems change expectations without clear communication
- low recognition when performance is reduced to algorithm-generated metrics
The research backs this up. Studies consistently show that algorithmic management increases worker stress when implemented without transparency. Delivery drivers managed by AI routing apps report higher burnout. Warehouse workers tracked by productivity algorithms experience more anxiety. The Act recognises that the technology itself can be a workplace hazard.
The Act integrates with your existing WHS obligations. If you already have a work health and safety management system, you need to extend it to cover digital work systems. If you do not, this is a good time to start.
Step 1: Identify your digital work systems. List every digital tool that assigns, monitors, or evaluates work. Include AI scheduling, time tracking, productivity monitoring, automated communications, and algorithmic decision-making tools. Most businesses are surprised by how many they use.
Step 2: Assess psychosocial risks. For each system, consider whether it creates excessive monitoring pressure, sets unreasonable work pace, reduces worker control over their tasks, obscures decision-making processes, or generates stress through constant evaluation.
Step 3: Consult with workers. The Act requires genuine consultation. Ask your employees how they experience these systems. You may discover that the scheduling algorithm that saves you time is creating unpredictable shifts that cause staff significant stress. Worker input is not optional.
Step 4: Implement controls. Where risks are identified, implement controls following the standard hierarchy: eliminate the hazard where possible, substitute with a less harmful system, implement engineering or administrative controls, or provide support to workers. This might mean adjusting monitoring levels, providing more transparency about how algorithms make decisions, or giving workers more control over AI-generated schedules.
Step 5: Review regularly. Digital work systems change frequently. AI tools are updated. New features are added. What was low-risk last quarter may be higher-risk now. Build digital work system reviews into your regular WHS review cycle.
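The five steps above amount to keeping a living register of your digital work systems and their psychosocial risks. As a rough illustration only, here is a minimal Python sketch of such a register. All names, hazard factors, ratings, and the scoring threshold are hypothetical placeholders, not terms or thresholds from the Act; a real assessment would use your own risk methodology and worker consultation records.

```python
from dataclasses import dataclass, field

# Illustrative hazard factors drawn from the categories discussed above.
HAZARD_FACTORS = [
    "excessive_monitoring",
    "unreasonable_pace",
    "low_worker_control",
    "poor_role_clarity",
    "low_recognition",
]

@dataclass
class DigitalWorkSystem:
    name: str                      # hypothetical tool name
    function: str                  # how it assigns, monitors, or evaluates work
    hazards: dict = field(default_factory=dict)   # factor -> rating 1 (low) to 5 (high)
    worker_feedback: list = field(default_factory=list)
    controls: list = field(default_factory=list)

    def risk_score(self) -> int:
        """Crude aggregate: sum of the hazard ratings identified in Step 2."""
        return sum(self.hazards.values())

    def needs_review(self, threshold: int = 8) -> bool:
        """Flag systems whose aggregate risk warrants controls (threshold is arbitrary)."""
        return self.risk_score() >= threshold

# Step 1: identify your digital work systems (names are invented examples)
register = [
    DigitalWorkSystem("RosterBot", "assigns shifts via AI scheduling"),
    DigitalWorkSystem("TrackTime", "monitors productivity metrics"),
]

# Step 2: assess psychosocial risks (ratings would come from your own assessment)
register[0].hazards = {"low_worker_control": 4, "poor_role_clarity": 3}
register[1].hazards = {"excessive_monitoring": 5, "unreasonable_pace": 4}

# Step 3: record what workers told you during consultation
register[0].worker_feedback.append("Shifts change with little notice; causes stress")

# Step 4: note controls for flagged systems, following the hierarchy of controls
for system in register:
    if system.needs_review():
        system.controls.append("Apply hierarchy of controls; increase transparency")

# Step 5: re-run this review each WHS cycle, since tools and features change
flagged = [s.name for s in register if s.needs_review()]
```

Even if you never automate this, the structure is the point: one entry per system, an explicit hazard assessment, a record of worker consultation, and a list of controls you can show a regulator.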
For a broader compliance framework that covers the Privacy Act, anti-discrimination law, and industry-specific requirements alongside WHS, see our AI compliance checklist.
NSW is the first mover, but other states are paying attention. Victoria and Queensland have both signalled interest in digital work system regulation. The WHS framework is harmonised across most of Australia, which means NSW’s approach could be adopted by other jurisdictions relatively quickly.
Even if your business operates entirely outside NSW, treating this Act as a preview of national regulation is prudent. Implementing good practices now means you will not be scrambling when your state catches up. The core requirements (transparency about digital systems, consultation with workers, and consideration of psychosocial impacts) are simply good management practice regardless of legislation.
Combined with the Privacy Act reforms taking effect in December 2026 and the existing workplace surveillance laws, Australian employers now face a multi-layered compliance environment for AI in the workplace. The businesses that address all three together will be in a far stronger position than those that react to each one separately.
Our Free AI Audit covers governance and compliance alongside opportunity, so you can identify gaps before regulators do.