TECHNOMATON Docs

L1 Output Documents

L1 — Align generates 9 documents divided into two categories: 5 policy documents (P1-P5) and 4 training materials (T1-T4). Each document addresses a specific regulatory requirement or operational need.

Summary table

Code  Name                            Verdict                 Primary legal basis
P1    Acceptable Use Policy for AI    Mandatory               AI Act Art. 4, GDPR, Labour Law
P2    AI Reference Card               Nice-to-have            —
P3    Data Classification Directive   Mandatory               GDPR Art. 5, 9, 32
P4    Decision Tree for New AI        Nice-to-have            —
P5    Incident Reporting Protocol     Mandatory               GDPR Art. 33/34, NIS2 Art. 23, AI Act Art. 62
T1    Training Presentation           Strongly recommended    AI Act Art. 4, GDPR Art. 39
T2    Knowledge Quiz                  Nice-to-have            — (audit evidence)
T3    Employee FAQ                    Nice-to-have            —
T4    Data Act Awareness              Conditionally relevant  Data Act (depending on scope)

Verdicts

  • Mandatory — Directly required by legislation. Without this document, sanctions may apply.
  • Strongly recommended — The law does not explicitly require this exact format, but without it, demonstrating compliance is difficult.
  • Nice-to-have — Not required by regulation, but significantly facilitates adoption and serves as supporting evidence.
  • Conditionally relevant — Depends on whether your organization falls within the scope of the respective regulation.

P1 — Acceptable Use Policy for AI

Problem: Employees use AI tools, but no formal rules exist.

What it contains:

  • Permitted and prohibited uses of AI
  • Rules for working with data (what may/may not be entered into AI)
  • Employee and management responsibilities
  • Conditions for introducing new AI tools
  • Control and review mechanisms

Regulatory validation:

  • AI Act Art. 4 — Organizations must ensure AI literacy; the policy is the formal framework
  • GDPR Art. 5 — Data minimization principles when working with AI
  • Labour Law — Employer responsibility for tool usage rules

Verdict: Mandatory — Without a documented policy, you have no way to demonstrate compliance with Art. 4 of the AI Act.


P2 — AI Reference Card

Problem: Employees need a quick overview of the rules, not a 20-page document.

What it contains:

  • One-page summary of key rules
  • DOs and DON’Ts for working with AI
  • Contacts for reporting issues
  • QR code linking to the full policy

Verdict: Nice-to-have — Facilitates practical implementation.


P3 — Data Classification Directive

Problem: Employees don’t know which data may be entered into AI tools.

What it contains:

  • 4 data classification levels (Public / Internal / Confidential / Strictly Confidential)
  • Examples of data in each category
  • Rules for AI processing by classification
  • Procedure for unintentional input of sensitive data

Regulatory validation:

  • GDPR Art. 5 — Data minimization principle
  • GDPR Art. 9 — Special categories of personal data
  • GDPR Art. 32 — Security measures proportionate to risk

Verdict: Mandatory — GDPR requires proportionate security measures, which presuppose data classification.
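The rules-by-classification approach can be encoded directly in tooling. A minimal Python sketch: the four levels come from the directive, but the AI tool categories and the allowed/blocked mapping shown here are illustrative assumptions, not the directive's actual rules.

```python
from enum import Enum

class DataClass(Enum):
    PUBLIC = 1
    INTERNAL = 2
    CONFIDENTIAL = 3
    STRICTLY_CONFIDENTIAL = 4

# Illustrative policy: which classification levels may be entered into
# which category of AI tool. The tool categories and this mapping are
# assumptions for the sketch, not part of the directive itself.
AI_PROCESSING_ALLOWED = {
    DataClass.PUBLIC: {"public_ai", "approved_ai"},
    DataClass.INTERNAL: {"approved_ai"},
    DataClass.CONFIDENTIAL: set(),           # e.g. only with explicit approval
    DataClass.STRICTLY_CONFIDENTIAL: set(),  # never entered into AI tools
}

def may_process(data_class: DataClass, tool_category: str) -> bool:
    """Return True if data of this class may be entered into the tool."""
    return tool_category in AI_PROCESSING_ALLOWED[data_class]
```

A lookup like this can back a pre-submission check in a browser plugin or proxy, so the directive is enforced rather than merely published.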


P4 — Decision Tree for New AI

Problem: Who decides on introducing a new AI tool, and how?

What it contains:

  • Flowchart for approving new AI tools
  • Evaluation criteria (security, GDPR, risk level)
  • Escalation matrix (who approves what)
  • Pre-deployment checklist

Verdict: Nice-to-have — Important for larger organizations where multiple people are involved in decisions.
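The flowchart logic lends itself to a small decision function. A sketch, assuming hypothetical criteria names and escalation outcomes; the real decision tree defines its own criteria and roles.

```python
def approval_route(processes_personal_data: bool,
                   high_risk_under_ai_act: bool,
                   vendor_security_reviewed: bool) -> str:
    """Illustrative escalation logic for a new AI tool request.
    The criteria and routes are assumptions, not the product's canvas."""
    if high_risk_under_ai_act:
        return "escalate: management + DPO + legal review"
    if processes_personal_data and not vendor_security_reviewed:
        return "reject: security review required first"
    if processes_personal_data:
        return "escalate: DPO approval"
    return "approve: line manager sign-off"
```

Encoding the tree as code keeps the escalation matrix testable and makes it obvious when a new criterion (say, NIS2 scope) has no defined route.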


P5 — Incident Reporting Protocol

Problem: What to do when an AI incident occurs (data leak, incorrect decision, bias)?

What it contains:

  • Definition of an AI incident
  • Reporting procedure (who, to whom, by when)
  • Escalation matrix by severity
  • Mandatory notifications to regulators
  • Incident record template

Regulatory validation:

  • GDPR Art. 33 — Notification to the supervisory authority within 72 hours
  • GDPR Art. 34 — Notification to the data subject in case of high risk
  • NIS2 Art. 23 — Reporting of significant incidents (if within NIS2 scope)
  • AI Act Art. 62 — Reporting of serious incidents with AI systems

Verdict: Mandatory — Incident reporting is required unconditionally by the GDPR, and conditionally by NIS2 and the AI Act.
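The 72-hour clock of GDPR Art. 33 starts when the organization becomes aware of the breach, so an incident record template can compute the notification deadline mechanically. A minimal sketch:

```python
from datetime import datetime, timedelta, timezone

def art33_deadline(became_aware: datetime) -> datetime:
    """GDPR Art. 33: notify the supervisory authority without undue
    delay and, where feasible, within 72 hours of becoming aware."""
    return became_aware + timedelta(hours=72)

# Example: breach discovered on 10 March 2025 at 14:30 UTC
aware = datetime(2025, 3, 10, 14, 30, tzinfo=timezone.utc)
print(art33_deadline(aware))  # 2025-03-13 14:30:00+00:00
```

Note the deadline runs in calendar hours, not business days, which is one reason the protocol's escalation matrix needs an out-of-hours path.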


T1 — Training Presentation

Problem: Employees need to be trained, but no standardized material exists.

What it contains: 24 slides covering complete AI literacy — from basics through the regulatory framework to company rules.

Regulatory validation:

  • AI Act Art. 4 — Requirement for AI literacy of personnel
  • GDPR Art. 39 — The DPO's tasks include awareness-raising and training of staff involved in processing

Verdict: Strongly recommended — The most direct way to demonstrate AI literacy.


T2 — Knowledge Quiz

Problem: How to prove that employees actually understood the rules?

What it contains: 18 questions, 80% pass threshold, answer explanations, unique result code with HMAC integrity.

Verdict: Nice-to-have — Strong audit evidence. T1 + T2 = the strongest argument to the regulator.
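One way a result code with HMAC integrity can work: sign the result payload with a server-side secret so a code pasted into an audit trail can later be verified as authentic. A sketch; the payload layout and key handling here are assumptions, not T2's actual format.

```python
import hmac
import hashlib

# Assumption: a server-side secret; real key management is out of scope here.
SECRET_KEY = b"replace-with-org-secret"

def result_code(employee_id: str, score: int, total: int = 18) -> str:
    """Encode a quiz result with a truncated HMAC-SHA256 tag so the
    result cannot be altered after issuance (80% pass threshold)."""
    verdict = "PASS" if score / total >= 0.8 else "FAIL"
    payload = f"{employee_id}:{score}/{total}:{verdict}"
    tag = hmac.new(SECRET_KEY, payload.encode(), hashlib.sha256).hexdigest()[:12]
    return f"{payload}:{tag}"

def verify(code: str) -> bool:
    """Recompute the tag over the payload and compare in constant time."""
    payload, _, tag = code.rpartition(":")
    expected = hmac.new(SECRET_KEY, payload.encode(), hashlib.sha256).hexdigest()[:12]
    return hmac.compare_digest(tag, expected)
```

Anyone holding the secret can confirm a code is genuine without a database lookup, which is what makes the quiz results usable as audit evidence.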


T3 — Employee FAQ

Problem: After training, employees have recurring questions.

What it contains: 8 thematic sections with practical answers.

Verdict: Nice-to-have — Reduces the burden on IT/compliance departments.


T4 — Data Act Awareness

Problem: The Data Act introduces new obligations for companies with IoT and cloud data.

What it contains: Overview of the Data Act, user rights, obligations when switching cloud services.

Verdict: Conditionally relevant — Important if you fall within the scope of the Data Act.


Matrix: Regulation × Document

Regulation        P1  P2  P3  P4  P5  T1  T2  T3  T4
AI Act Art. 4     x                   x
AI Act Art. 5     x                   x
AI Act Art. 6                 x
AI Act Art. 62                    x
GDPR Art. 5       x       x
GDPR Art. 9               x
GDPR Art. 32              x
GDPR Art. 33/34                   x
GDPR Art. 39                          x
NIS2 Art. 23                      x
Data Act                                          x
Labour Law        x