ALAI

Security and privacy

Privacy, GDPR and the EU AI Act: our standard before public launch

ALAI handles highly sensitive material: personal texts, voice, images, family relationships, memorial content and digital twin assets. Privacy and compliance are therefore not a side topic, but part of the product architecture itself.

Current position

ALAI is still in pre-launch. That means we do not present the service as “already fully compliant” before a complete review. Instead, we make an explicit commitment: the product should go live only with a concrete GDPR compliance path and a documented alignment process with the EU AI Act.

GDPR commitment

  • Explicit donor consent for collection and use of submitted content.
  • Clear role separation between donor, authorized heirs and administration.
  • Data minimization and access separation.
  • Protection of personal data during usage, storage and administrative access.
  • Deletion, review and control rights to be implemented in a verifiable way before public launch.
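The role separation and consent requirements above can be pictured as a deny-by-default permission check. This is a minimal illustrative sketch, not ALAI's actual implementation; the roles, action names and the `is_allowed` helper are all hypothetical.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Role(Enum):
    DONOR = auto()
    HEIR = auto()
    ADMIN = auto()

# Hypothetical permission matrix: which roles may perform which actions
# on a donor's submitted content.
PERMISSIONS = {
    "submit_content": {Role.DONOR},
    "view_memorial": {Role.DONOR, Role.HEIR},
    "request_deletion": {Role.DONOR},
    "manage_access": {Role.ADMIN},
}

@dataclass
class User:
    name: str
    role: Role
    consent_given: bool = False  # explicit donor consent, recorded up front

def is_allowed(user: User, action: str) -> bool:
    """Deny by default; donors must also have recorded consent to submit."""
    allowed_roles = PERMISSIONS.get(action, set())
    if user.role not in allowed_roles:
        return False
    if action == "submit_content" and not user.consent_given:
        return False
    return True
```

The design choice worth noting is the default: an unknown action or a missing consent record results in denial, which is the bias data minimization and access separation both point toward.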

EU AI Act commitment

  • Transparency about what the twin is and what it is not.
  • No misleading promises or deceptive anthropomorphic framing.
  • Traceability of the AI systems used for twin, voice and content generation.
  • Controls over access, data usage and risk management.
  • Internal governance to document limits, responsibilities and compliance measures before public release.
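Traceability of the AI systems used for generation can be pictured as a provenance record attached to every generated artifact, carrying both the producing model and a transparency label. A minimal sketch, assuming a JSON audit log; all field and function names are hypothetical, not ALAI's API.

```python
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class GenerationRecord:
    artifact_id: str
    artifact_type: str       # e.g. "voice", "text", "image"
    model_name: str          # which AI system produced the artifact
    model_version: str
    consent_ref: str         # link back to the donor's consent record
    created_at: str
    # Transparency label: the output is a reconstruction, not the person.
    ai_disclosure: str = "AI-generated reconstruction, not the person"

def trace(artifact_id: str, artifact_type: str,
          model_name: str, model_version: str, consent_ref: str) -> str:
    """Serialize one provenance record as a JSON audit-log line."""
    rec = GenerationRecord(
        artifact_id=artifact_id,
        artifact_type=artifact_type,
        model_name=model_name,
        model_version=model_version,
        consent_ref=consent_ref,
        created_at=datetime.now(timezone.utc).isoformat(),
    )
    return json.dumps(asdict(rec))
```

Keeping the disclosure string inside the record means every artifact carries its own transparency statement, rather than relying on the surrounding UI.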

What this means in practice

Before the app is publicly available, ALAI should be able to demonstrate not only intentions but actual controls: updated privacy notices, valid legal bases, defined retention periods, processes for handling data subject rights, technical and organizational safeguards, AI transparency, and a review of applicable risks.

A twin within real limits

ALAI should remain consistent with the material provided by the donor. The goal is a controlled conversational memory, not an unrestricted simulation. This principle matters both for user trust and for regulatory compliance.
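One way to keep the twin within the donor's material is to answer only from donor-provided content and refuse when nothing matches. A minimal sketch under that assumption; the sample corpus and the `grounded_reply` function are hypothetical, and a real system would use retrieval far more sophisticated than substring matching.

```python
# Illustrative donor-provided corpus; real content would be the
# donor's submitted texts, voice transcripts and memorial material.
DONOR_CORPUS = [
    "I grew up in Lisbon and loved sailing on weekends.",
    "Our family tradition was Sunday lunch at grandma's house.",
]

def grounded_reply(topic: str, corpus: list[str] = DONOR_CORPUS) -> str:
    """Reply only from donor material; refuse when nothing supports it."""
    matches = [doc for doc in corpus if topic.lower() in doc.lower()]
    if not matches:
        return "The donor did not leave anything about this topic."
    return matches[0]
```

The refusal branch is the point: a controlled conversational memory declines to invent, which is what separates it from an unrestricted simulation.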