In Germany, your works council can stop an AI agent deployment dead in its tracks. Not by complaining, not by filing a grievance, but by exercising a legally binding veto under Section 87(1) No. 6 of the Works Constitution Act (BetrVG). Any technical system capable of monitoring employee behavior or performance requires the works council’s consent before it goes live. No consent, no deployment.
This is not a theoretical risk. According to AI project management surveys, roughly 60% of AI projects in German enterprises stall or fail because of unresolved works council objections. Companies that involve the Betriebsrat early, on the other hand, report three times higher adoption rates. The difference between a successful AI rollout and a blocked one often comes down to how seriously you take co-determination (Mitbestimmung).
For international companies operating in Germany, this system has no equivalent at home: neither US nor UK labor law gives employee representatives binding co-determination rights over workplace technology. Understanding how it works is not optional if you plan to deploy AI agents in a German workplace.
The Legal Framework: Where AI Meets BetrVG
The Works Constitution Act (Betriebsverfassungsgesetz, BetrVG) is Germany’s foundational law for workplace co-determination. It was last updated in 2021 through the Works Council Modernization Act (Betriebsrätemodernisierungsgesetz), which explicitly addressed AI for the first time. Several provisions create direct obligations when deploying AI agents.
Section 87(1) No. 6: The Core Veto Right
This is the provision that stops most AI projects. The works council has a mandatory co-determination right over “the introduction and use of technical devices designed to monitor the behavior or performance of employees.”
The bar is low. Although the statute says “designed to” monitor, the Federal Labour Court (Bundesarbeitsgericht) has long read this as “objectively capable of” monitoring; the system does not need to be built for surveillance, it only needs to permit it. Almost every AI agent deployed in a corporate environment meets this threshold. A customer service agent that logs interactions? Covered. An AI scheduling tool that tracks task completion? Covered. A coding assistant that records prompts and outputs? Likely covered, because the employer could theoretically access usage data.
If the works council does not agree, the employer cannot deploy the system. The only path forward is the Einigungsstelle (conciliation board), which functions like mandatory arbitration. That process takes months and the outcome is uncertain.
Section 90: Early Information and Consultation
Before any deployment decision, Section 90 BetrVG requires the employer to inform the works council at the planning stage about changes to work processes, including changes driven by AI. The 2021 amendment explicitly mentions AI in this context. This means informing the works council before you sign the contract with your AI vendor, not after.
Section 80(3): The Right to an External Expert
Here is where it gets expensive. The 2021 reform added a presumption that engaging an external expert is necessary when the works council must evaluate AI. Under the old law, the employer could argue that the works council’s existing knowledge was sufficient. Under the new Section 80(3) sentences 2 and 3, the need for an expert is legally presumed when AI is involved. The employer pays for it.
In practice, this means your works council will hire an external IT consultant or labor law specialist to evaluate your AI system. Budget EUR 5,000 to EUR 30,000 per engagement, depending on the complexity of the system.
Section 95: Selection Guidelines and AI
If your AI agent influences hiring decisions, internal transfers, layoffs, or performance evaluations, Section 95 BetrVG gives the works council co-determination rights over the selection criteria. The 2021 amendment explicitly extends this to situations where AI is used to establish those criteria. This directly intersects with the EU AI Act’s classification of employment AI as high-risk.
The Hamburg Ruling: What It Means and What It Does Not
On January 16, 2024, the Hamburg Labour Court issued the first German court decision on works council rights and AI (case 24 BVGa 1/24). The works council had sought an injunction to prevent employees from using ChatGPT at work.
The court ruled against the works council, but for narrow reasons that actually reinforce co-determination for most AI deployments.
What the court said: The employer had merely permitted employees to use ChatGPT through their own personal, browser-based accounts. Since the employer had no access to the AI provider’s data, it could not monitor which employees used ChatGPT, when, or for what purpose. Without access to usage data, there was no monitoring capability, and therefore no co-determination right under Section 87(1) No. 6.
What the court did not say: The ruling explicitly does not cover situations where the employer provides company ChatGPT accounts, deploys AI agents through company systems, or has any form of access to usage logs. In those cases, the monitoring capability exists, and the works council’s co-determination right applies.
For AI agent deployments, this distinction is critical. AI agents that operate within company infrastructure, access company data, and produce audit logs are squarely within Section 87(1) No. 6 territory. The Hamburg exception is narrow: it applies only to voluntary use of personal accounts with no employer data access.
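The data-access test above can be sketched as a rough triage rule. This is a simplified illustration, not legal advice: the attribute names and the decision logic are assumptions distilled from the Hamburg fact pattern, and a real assessment belongs with counsel.

```python
from dataclasses import dataclass

@dataclass
class AIDeployment:
    """Simplified attributes relevant to a Section 87(1) No. 6 triage."""
    company_accounts: bool         # employer provisions the accounts
    runs_on_company_systems: bool  # agent operates inside company infrastructure
    employer_log_access: bool      # employer can read employee-specific usage data

def codetermination_likely(d: AIDeployment) -> bool:
    """If the employer has any path to employee-specific usage data,
    assume co-determination applies. Only voluntary use of personal
    accounts with no employer data access falls outside it."""
    return d.company_accounts or d.runs_on_company_systems or d.employer_log_access

# The Hamburg fact pattern: personal browser-based accounts,
# no employer access to usage data.
hamburg_case = AIDeployment(False, False, False)

# An AI agent inside company infrastructure with audit logs.
company_agent = AIDeployment(True, True, True)
```

Note the asymmetry the rule encodes: any single “yes” is enough to pull the deployment back into Section 87(1) No. 6 territory, which is why the Hamburg exception is so narrow in practice.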
What a Betriebsvereinbarung for AI Must Cover
The practical solution is a works agreement (Betriebsvereinbarung) that covers AI deployment. The Hans Böckler Foundation and the HUMAINE project both publish model agreements. Based on these templates and current legal requirements, a robust AI Betriebsvereinbarung should address:
Scope and definitions. Specify which AI systems are covered. A framework agreement (Rahmenbetriebsvereinbarung) covers all AI tools; individual agreements cover specific deployments. Define what counts as “AI” in your context to avoid disputes later. The EU AI Act’s definition in Article 3 is a reasonable starting point.
Positive and negative lists. Explicitly list which AI applications are permitted (e.g., code completion, meeting transcription) and which are prohibited (e.g., emotion recognition, performance scoring without human review). This is where most negotiations happen.
Data protection. Specify what data the AI system collects, who has access, how long it is stored, and when it is deleted. This must align with GDPR requirements and, for employee data specifically, the German Federal Data Protection Act (BDSG Section 26).
Performance monitoring exclusions. Explicitly state that AI system data will not be used for individual performance evaluation, disciplinary action, or termination decisions unless both sides agree otherwise.
Transparency and explainability. Define what information employees receive about AI systems used in their work. Include the right to request explanations for AI-driven decisions that affect them.
Review and audit rights. Give the works council the right to audit AI systems periodically, request algorithmic impact assessments, and bring in external experts. Specify review intervals (quarterly or semi-annually is common).
Change management. Define the process for introducing new AI tools or significantly changing existing ones. Include a notification period (30 days is typical) before any new AI system goes live.
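The checklist above lends itself to a machine-readable register that both IT and the works council can audit. The sketch below is a hypothetical, minimal layout: the tool names, retention periods, and field names are illustrative assumptions, not anything prescribed by the BetrVG or the model agreements.

```python
# Hypothetical policy register for a framework agreement
# (Rahmenbetriebsvereinbarung). All entries are illustrative.
AI_POLICY = {
    "permitted": {
        "code_completion":       {"retention_days": 30, "individual_logs": False},
        "meeting_transcription": {"retention_days": 90, "individual_logs": False},
    },
    "prohibited": [
        "emotion_recognition",
        "performance_scoring_without_human_review",
    ],
    "notification_period_days": 30,  # lead time before a new tool goes live
    "review_interval_months": 6,     # semi-annual works council review
}

def tool_status(name: str) -> str:
    """Classify a tool against the positive/negative lists. Anything
    unlisted defaults to requiring a works council decision, mirroring
    the change-management process in the agreement."""
    if name in AI_POLICY["permitted"]:
        return "permitted"
    if name in AI_POLICY["prohibited"]:
        return "prohibited"
    return "requires_works_council_review"
```

The default-to-review branch is the important design choice: a new AI tool should never be deployable simply because nobody thought to list it.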
Where the EU AI Act Intersects with Co-Determination
Starting August 2, 2026, the EU AI Act’s high-risk provisions become enforceable. For German companies, this creates a dual compliance layer that many enterprises have not yet mapped.
The EU AI Act requires deployers of high-risk AI (which includes most employment-related AI under Annex III) to implement human oversight, conduct fundamental rights impact assessments, and inform employees. BetrVG requires works council involvement for the same systems.
These are not parallel tracks. They reinforce each other. Article 26(7) of the EU AI Act explicitly requires deployers to inform workers’ representatives about high-risk AI deployment. In Germany, that means the Betriebsrat. And Article 26(7) does not replace BetrVG obligations; it adds to them. You need both the EU AI Act’s fundamental rights impact assessment and the works council’s consent under Section 87.
Companies that treat the EU AI Act as a standalone compliance exercise and forget BetrVG will find themselves technically in compliance with EU law but in violation of German labor law. The works council can still block the deployment regardless of your EU AI Act documentation.
Practical Playbook: How to Navigate Works Council AI Negotiations
Start before you buy. Section 90 requires information at the planning stage. Involve the works council before vendor selection, not after. This also prevents the perception that the decision has already been made.
Budget for the expert. Under Section 80(3), the works council will hire one. Treat this as a standard project cost, not an adversarial expense. A good external expert can actually accelerate negotiations by translating technical capabilities into language both sides understand.
Propose a framework agreement. Rather than negotiating each AI tool individually, offer a Rahmenbetriebsvereinbarung that sets general principles and a fast-track process for low-risk AI tools. Reserve detailed individual agreements for high-risk systems (anything touching performance data, hiring, or customer-facing decisions).
Demonstrate the Hamburg distinction. If your AI tool genuinely does not give the employer access to individual usage data, the Hamburg ruling may apply. But document this carefully. If there is any path for the employer to access employee-specific data from the AI system, assume co-determination applies.
Align with your DPO. The data protection impact assessment (DPIA) required under GDPR and the works council’s information rights overlap significantly. Prepare one comprehensive assessment that serves both audiences. Your Data Protection Officer should be in the room during works council negotiations.
Plan for the 2026 works council elections. Regular works council elections take place every four years, in the window from March 1 to May 31; the next round falls in spring 2026. Newly elected works councils may revisit existing AI agreements. Build sunset and review clauses into your Betriebsvereinbarung to avoid having to start from scratch.
Frequently Asked Questions
Can a German works council block AI deployment?
Yes. Under Section 87(1) No. 6 BetrVG, the works council has a binding co-determination right over any technical system capable of monitoring employee behavior or performance. If the works council does not consent, the employer cannot deploy the system. The only recourse is the Einigungsstelle (conciliation board), which functions as mandatory arbitration.
Does using ChatGPT at work require works council approval in Germany?
It depends. The Hamburg Labour Court ruled in January 2024 that permitting employees to use ChatGPT via personal accounts does not trigger co-determination rights, because the employer has no access to usage data. However, if the employer provides company accounts or deploys ChatGPT through company systems where usage can be tracked, co-determination under Section 87(1) No. 6 BetrVG applies.
What changed for works councils and AI in the 2021 Betriebsrätemodernisierungsgesetz?
The Works Council Modernization Act of 2021 explicitly addressed AI in three key areas: Section 90 now requires employers to inform and consult works councils at the planning stage of AI deployment. Section 80(3) creates a legal presumption that an external expert is necessary when the works council evaluates AI systems, with costs borne by the employer. Section 95 extends co-determination over selection guidelines to situations where AI establishes those criteria.
How does the EU AI Act interact with German works council rights?
The EU AI Act and German co-determination law create overlapping obligations. Article 26(7) of the EU AI Act requires deployers to inform workers’ representatives about high-risk AI, which in Germany means the Betriebsrat. But this does not replace BetrVG obligations. Companies need both EU AI Act compliance and works council consent under Section 87 BetrVG. Satisfying one does not satisfy the other.
What should an AI Betriebsvereinbarung (works agreement) contain?
A robust AI Betriebsvereinbarung should cover: scope and AI definitions, positive/negative lists of permitted and prohibited AI applications, data protection and retention rules, performance monitoring exclusions, transparency requirements for employees, works council audit and review rights, and a change management process for introducing new AI tools. Model agreements are available from the Hans Böckler Foundation and the HUMAINE project.
