Regulation without an AI winter

GDPR, EU AI Act, and medical confidentiality: regulate with proportionality, preserve trust, and protect sovereignty.

TL;DR
  • Broad data capture and opaque reuse are incompatible with GDPR principles.
  • Preventive, non-diagnostic AI can be lawful with human oversight and minimal data.
  • Extraterritorial surveillance risk must not override European medical secrecy.

Responsible health AI must remain governed by European law and values. Regulation should frame risk without freezing innovation.

Regulate without freezing. Frame without confiscating.

GDPR and medical confidentiality

Health data is protected by strict confidentiality obligations. Models that rely on broad data capture and opaque reuse are structurally incompatible with the GDPR principles of data minimization, purpose limitation, and proportionality.

EU AI Act

Under the EU AI Act, preventive, non-diagnostic AI need not be classified as clinical high-risk AI when:

  • no diagnosis is produced
  • no medical decision is automated
  • human oversight is preserved
  • data processing is minimal and purpose-bound

CLOUD Act risk

Dependence on providers subject to extraterritorial legal regimes, such as the US CLOUD Act, which can compel disclosure of data regardless of where it is stored, creates unacceptable legal uncertainty for European citizens and threatens to override medical secrecy. Responsible health AI must remain governed by European law.