Compliance Deadline
High-risk AI system provisions become applicable on August 2, 2026
Table of Contents
- Introduction: The Regulatory Demand for Provable Trust
- Article 12: The Core Logging Obligation
- Article 19: Retention Periods and Financial Sector Provisions
- Article 73: Evidence Preservation and Legal Risk
- Article 15: Cybersecurity and Tamper Resistance
- Gap Analysis: Minimum vs. Defensible Compliance
- Technical Architecture of Cryptographic Audit Trails
- GDPR Compatibility: The Crypto-Shredding Solution
- Leveraging eIDAS Qualified Timestamps
- Implementation Roadmap
- Conclusion: The Era of "Verify, Don't Trust"
Introduction: The Regulatory Demand for Provable Trust
On July 12, 2024, the EU AI Act (Regulation (EU) 2024/1689) was published in the Official Journal of the European Union, entering into force on August 1, 2024. The obligations for high-risk AI systems become applicable on August 2, 2026.
The impact of this regulation on the financial industry—particularly algorithmic trading and AI-driven decision-making systems—is profound. However, there's a critical point many operators are missing: the EU AI Act implicitly requires not just the existence of logs, but their evidentiary reliability.
This article provides an in-depth analysis of the EU AI Act's logging provisions, explains why conventional "mutable database logs" are becoming insufficient for regulatory compliance, and demonstrates the technical advantages that cryptographic audit trails provide.
Article 12: The Core Logging Obligation
Article 12 constitutes the core logging obligation under the EU AI Act.
Article 12(1) - Basic Obligation
"High-risk AI systems shall technically allow for the automatic recording of events (logs) over the lifetime of the system."
This is a design-time requirement. It demands not merely "keeping logs" but "having the technical capability to automatically record logs." Systems lacking logging functionality cannot be placed on the EU market at all.
What the Article Does NOT Specify
Critically important is what Article 12 does not specify:
| Unspecified Element | Implication |
|---|---|
| Log format | JSON, binary, structured/unstructured—provider's choice |
| Storage architecture | Local, cloud, distributed—no mandate |
| Integrity protection mechanisms | No requirement for checksums, signatures, or hash chains |
| Third-party verifiability | No explicit requirement for external auditability |
Cryptographic Approach Compatibility
Article 12 does not prohibit cryptographic logging. Read together with other provisions, the cryptographic approach becomes advantageous in practice (a minimal hash-chain sketch follows the list below):
- Article 12(1) "Recording over the lifetime" → Hash chains prove temporal continuity
- Article 12(3)(d) "Identification of verifiers" → Digital signatures provide non-repudiation
- Article 12(2) "Appropriate traceability" → Merkle trees enable efficient partial verification
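To make the first of these mappings concrete, the sketch below shows how SHA-256 hash chaining ties each log record to its predecessor, so the system's whole lifetime forms one verifiable sequence. It is a minimal illustration only: the record fields, genesis value, and helper names are assumptions made for this article, not terms defined by the Act or by any particular protocol.

```python
import hashlib
import json
import time

def chain_event(prev_hash: str, event: dict) -> dict:
    """Link an event to its predecessor by hashing the previous record's hash
    together with the canonicalised event payload (field names illustrative)."""
    payload = json.dumps(event, sort_keys=True, separators=(",", ":"))
    record_hash = hashlib.sha256((prev_hash + payload).encode("utf-8")).hexdigest()
    return {"prev_hash": prev_hash, "event": event, "hash": record_hash}

def verify_chain(records: list[dict]) -> bool:
    """Recompute every link; any alteration or deletion breaks the chain."""
    prev = records[0]["prev_hash"]
    for rec in records:
        expected = chain_event(prev, rec["event"])["hash"]
        if rec["prev_hash"] != prev or rec["hash"] != expected:
            return False
        prev = rec["hash"]
    return True

# Build a short chain; the genesis hash anchors the start of the system's lifetime.
log = []
prev = hashlib.sha256(b"genesis").hexdigest()
for action in ("SYSTEM_START", "INFERENCE_REQUEST", "INFERENCE_RESPONSE"):
    rec = chain_event(prev, {"ts": time.time(), "action": action})
    log.append(rec)
    prev = rec["hash"]

assert verify_chain(log)          # continuous, unaltered lifetime record
log[1]["event"]["action"] = "X"   # simulate tampering
assert not verify_chain(log)      # the break is mathematically detectable
```

Because each hash depends on everything before it, editing or deleting any record after the fact is detectable simply by recomputing the chain.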
Article 19: Retention Periods and Financial Sector Provisions
Article 19 establishes retention requirements with important provisions for financial institutions:
"...the logs shall be kept for a period appropriate to the intended purpose of the high-risk AI system, of at least six months..."
Financial Sector Provisions
Article 19(2) establishes that existing record-keeping obligations under financial regulations apply directly:
| Regulation | Retention Period |
|---|---|
| MiFID II Article 16(6) | 5 years |
| MAR Article 16(1) | 5 years |
| EMIR | 5+ years |
| National regulations | 5-10 years |
The Integrity Problem
Over retention periods spanning six months to ten years, the practical challenge becomes how to prove log integrity:
The Retention Period Challenge
Traditional approach: "Trust us"
Cryptographic approach: Mathematical proof
Article 73: Evidence Preservation and Legal Risk
Article 73 imposes the most stringent time constraints under the EU AI Act.
Reporting Deadlines
| Incident Type | Deadline |
|---|---|
| Standard serious incidents | 15 days |
| Suspected death or serious injury | 10 days |
| Widespread infrastructure disruption | 2 days |
Evidence Preservation Prohibition
Article 73(6) contains a decisive provision:
"The provider shall cooperate with the competent authorities and shall not perform any investigation which involves altering the AI system concerned in a way which may affect any subsequent evaluation of the causes of the incident..."
This clearly prohibits evidence tampering. The problem is that with "mutable logs," you risk violating this provision unintentionally: routine operations such as log rotation, schema migrations, or manual corrections can alter records after an incident and thereby affect the subsequent evaluation of its causes.
Sanctions under Article 99
Violations in this area carry significant exposure under Article 99. Supplying "incorrect, incomplete or misleading information" to notified bodies or national competent authorities is subject under Article 99(5) to:
Up to €7,500,000 or 1% of worldwide annual turnover
whichever is higher, while non-compliance with the broader provider obligations listed in Article 99(4) can reach €15,000,000 or 3% of worldwide annual turnover.
Article 15: Cybersecurity and Tamper Resistance
Article 15 specifies accuracy, robustness, and cybersecurity requirements:
"High-risk AI systems shall be resilient against attempts by unauthorised third parties to alter their use, outputs or performance by exploiting system vulnerabilities."
Mapping to Cryptographic Countermeasures
| Article 15(5) Threat | Cryptographic Countermeasure |
|---|---|
| Unauthorized output alteration | Cryptographic signing, hash chains |
| Data poisoning | Cryptographic provenance of training data |
| Model poisoning | Hash commitments for model weights |
| Adversarial attacks | Timestamped input-output logging |
| Unauthorized changes | Digital signatures for attribution |
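As one illustration of the table above, the model poisoning row can be addressed by committing to the exact model artefact that is in production. The sketch below streams a weights file through SHA-256 and compares it against the commitment recorded at deployment time; the file path and verification schedule are hypothetical examples, not requirements of the Act.

```python
import hashlib

def file_sha256(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream-hash a potentially large artifact, e.g. a model weights file."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# At deployment: record a commitment to the exact weights placed in production.
# (Path is hypothetical; in practice the digest is written into the audit trail.)
# deployed_commitment = file_sha256("models/credit-scoring-v3.safetensors")
#
# On a schedule, or before any Article 73 investigation, re-hash and compare:
# assert file_sha256("models/credit-scoring-v3.safetensors") == deployed_commitment
```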
Gap Analysis: Minimum Compliance vs. Defensible Compliance
The EU AI Act specifies minimum obligations, but minimum compliance is not necessarily the optimal strategy.
| Aspect | Minimum Compliance | Defensible Compliance |
|---|---|---|
| Article 12 | Mutable DB logs | Hash-chained logs |
| Article 19 | 6-month retention | Cryptographically timestamped long-term retention |
| Article 73 | Reactive response | Tamper-evident instant evidence |
| Annex IV | Handwritten signatures | Ed25519 digital signatures |
| Conformity Assessment | Self-declaration | Third-party verifiable evidence |
Defensible Compliance Advantages
- Mathematically verifiable integrity
- Independent third-party verification
- Trust-building with regulators
- Enhanced evidentiary value in litigation
- Competitive advantage ("cryptographically verifiable")
- Insurance premium optimization
Technical Architecture of Cryptographic Audit Trails
EU AI Act-compliant cryptographic audit trails consist of the following elements:
Core Components
| Layer | Function |
|---|---|
| Event Layer | Structured events with timestamps, actor ID, action type |
| Chain Layer | SHA-256 hash linking for tamper detection |
| Signature Layer | Ed25519 signatures for non-repudiation |
| Anchor Layer | Merkle roots with external timestamps |
| Verification Layer | Independent integrity verification tools |
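The sketch below shows how the event, chain, and signature layers combine in a single record, using SHA-256 and an Ed25519 key via the Python `cryptography` package. Field names, the genesis value, and the key handling are simplifications assumed for illustration; a production system would keep signing keys in an HSM and add the Merkle-root anchor layer on top.

```python
import hashlib
import json
from datetime import datetime, timezone

from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

signing_key = Ed25519PrivateKey.generate()   # in practice: an HSM-held provider key
verify_key = signing_key.public_key()

def append_event(prev_hash: bytes, actor: str, action: str, detail: dict) -> dict:
    """Event layer + chain layer + signature layer for a single record."""
    event = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "actor": actor,
        "action": action,
        "detail": detail,
    }
    payload = json.dumps(event, sort_keys=True, separators=(",", ":")).encode()
    record_hash = hashlib.sha256(prev_hash + payload).digest()
    signature = signing_key.sign(record_hash)   # non-repudiation over the chained hash
    return {"event": event, "prev_hash": prev_hash.hex(),
            "hash": record_hash.hex(), "signature": signature.hex()}

rec = append_event(hashlib.sha256(b"genesis").digest(),
                   actor="operator-17", action="HUMAN_OVERRIDE",
                   detail={"reason": "manual stop", "article": "14"})

# Verification layer: any party holding the public key can check the record.
verify_key.verify(bytes.fromhex(rec["signature"]), bytes.fromhex(rec["hash"]))
```

Signing the chained hash rather than the raw event means the signature attests both to the record's content and to its position in the sequence.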
Key Event Types for EU AI Act
- System operations: SYSTEM_START, SYSTEM_STOP, CONFIG_CHANGE
- AI operations: INFERENCE_REQUEST, INFERENCE_RESPONSE, MODEL_UPDATE
- Human oversight (Article 14): HUMAN_OVERRIDE, HUMAN_VERIFICATION, STOP_BUTTON_ACTIVATED
- PMM (Article 72): PERFORMANCE_METRIC, ANOMALY_DETECTED, CORRECTIVE_ACTION
- Incidents (Article 73): INCIDENT_DETECTED, INCIDENT_REPORTED
GDPR Compatibility: The Crypto-Shredding Solution
There exists a tension between EU AI Act Article 19 retention obligations and GDPR Article 17 right to erasure.
The Regulatory Tension
EU AI Act: "Retain logs for 6 months to 10 years"
GDPR: "Delete personal data upon request"
The Solution: Architectural Separation
- Audit Integrity Layer (Immutable)
  - Hash chains, Merkle roots, timestamps
  - Contains NO personal data
- Personal Data Layer (Deletable)
  - Encrypted personal data with per-subject keys
  - Crypto-shredding enabled (key destruction = data destruction)
When an erasure request is received, the encryption key is destroyed. The encrypted data becomes permanently unrecoverable, but the audit trail (containing only hashes) remains intact.
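A minimal sketch of this separation, using per-subject Fernet keys from the Python `cryptography` package, is shown below. The in-memory dictionaries stand in for what would be a KMS-backed key store, an encrypted data store, and the hash-chained audit layer; all names and the data format are illustrative assumptions.

```python
import hashlib
from cryptography.fernet import Fernet

subject_keys: dict[str, bytes] = {}      # per-data-subject keys (in practice: a KMS/HSM)
encrypted_store: dict[str, bytes] = {}   # deletable personal-data layer
audit_chain: list[str] = []              # immutable integrity layer (hashes only)

def record(subject_id: str, personal_data: bytes) -> None:
    key = subject_keys.setdefault(subject_id, Fernet.generate_key())
    ciphertext = Fernet(key).encrypt(personal_data)
    encrypted_store[subject_id] = ciphertext
    # Only a hash of the ciphertext enters the audit trail: no personal data.
    audit_chain.append(hashlib.sha256(ciphertext).hexdigest())

def erase(subject_id: str) -> None:
    """GDPR Article 17 request: destroying the key renders the ciphertext
    unrecoverable, while the hash-based audit trail stays intact and verifiable."""
    subject_keys.pop(subject_id, None)

record("subject-42", b'{"client_id": "ACME", "decision": "loan_denied"}')
erase("subject-42")
# encrypted_store["subject-42"] still exists but can no longer be decrypted;
# audit_chain still proves a record existed and was never altered.
```

After `erase()` runs, the ciphertext is cryptographically inert, yet every hash in the audit layer still verifies, which is exactly the property the two regulations jointly require.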
Leveraging eIDAS Qualified Timestamps
Qualified electronic time stamps under the eIDAS Regulation ((EU) No 910/2014) have legal effect throughout the EU:
"A qualified electronic time stamp shall enjoy the presumption of the accuracy of the date and the time it indicates and the integrity of the data to which the date and time are bound."
— eIDAS Article 41(2)
Integration Benefits
- Satisfies Annex IV "dated and signed" requirement
- Proves integrity throughout 10-year retention
- Evidentiary value in Article 73 investigations
- Cross-border legal effect (EU27)
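One practical pattern, sketched below under assumptions made for this article rather than any mandated scheme, is to aggregate a batch of record hashes into a single Merkle root and have only that root covered by a qualified timestamp. The TSA interaction itself (an RFC 3161 request/response) is deliberately left out; the sketch only produces the digest that would be submitted.

```python
import hashlib

def merkle_root(leaves: list[bytes]) -> bytes:
    """Pairwise SHA-256 reduction; the root commits to every record in the batch."""
    if not leaves:
        return hashlib.sha256(b"").digest()
    level = [hashlib.sha256(leaf).digest() for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])            # duplicate the last node on odd levels
        level = [hashlib.sha256(level[i] + level[i + 1]).digest()
                 for i in range(0, len(level), 2)]
    return level[0]

# One qualified timestamp per batch (e.g. per day) is enough to anchor every record:
daily_records = [b"record-1", b"record-2", b"record-3"]
root = merkle_root(daily_records)
# `root` is the single digest that would be sent to a qualified Trust Service
# Provider; the returned timestamp token then covers the whole batch.
print(root.hex())
```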
Implementation Roadmap: Preparing for August 2026
Phase 1: Foundation (Now → Q2 2025)
- Implement hash-chained event logging
- Build Ed25519 signature infrastructure
- Document architecture in technical documentation
- Design GDPR compatibility (crypto-shredding layer)
Phase 2: Verification Enhancement (Q2 2025 → Q4 2025)
- Implement Merkle tree aggregation
- Integrate eIDAS qualified timestamps
- Prepare authority reporting templates
- PMM (Post-Market Monitoring) integration
Phase 3: Hardening (Q4 2025 → August 2026)
- Conformity assessment dry run
- Third-party audit of cryptographic controls
- Incident response procedure validation
- CEN-CENELEC harmonised standards preparation
Conclusion: The Era of "Verify, Don't Trust"
The EU AI Act does not explicitly mandate cryptographic audit trails. However, the combination of its provisions creates pressure that makes the cryptographic approach the de facto standard:
| Article | Requirement | Cryptographic Solution |
|---|---|---|
| Article 12(1) | Lifetime logging | Hash chains prove continuity |
| Article 15(5) | Tamper resistance | Cryptographic integrity verification |
| Article 73(6) | Evidence preservation | Automatic preservation via immutable logs |
| Article 18(1) | 10-year retention | Timestamps prove long-term integrity |
| Annex IV(2)(g) | Dated and signed | Ed25519 + eIDAS qualified timestamps |
"Trust me" is no longer acceptable.
Regulators, auditors, and the market demand verifiable evidence.
Verify, Don't Trust.
August 2026 is closer than you think.
Reference Links
- EU AI Act Official Text (EUR-Lex)
- CEN-CENELEC JTC 21 AI Standardisation
- VeritasChain Protocol Specifications
- IETF Draft: VCP (draft-kamimura-scitt-vcp)
- GitHub: VeritasChain
This article is provided by the VeritasChain Standards Organization (VSO) for technical educational purposes and does not constitute legal advice. Please consult with qualified professionals for specific regulatory compliance matters.
Contact: technical@veritaschain.org