Medical Device SaMD Governance

Your algorithms diagnose. Your governance doesn't.

Software as a Medical Device executes clinical decisions across millions of patients. Regulators require post-market surveillance, predetermined change control plans, and continuous algorithm monitoring. Enforcement of the EU AI Act begins in August 2026. The EU MDR compliance deadline is 2027. The FDA issued its first AI-specific warning letter in April 2026. The governance gap between what algorithms decide and what compliance covers is a patient safety risk.

10 Detection patterns
5 Governance modules
21 Behavioral monitor methods
10 Compliance frameworks
8ms Algorithm halt response

Software as a Medical Device without governance

A CPAP recall affected 15 million devices, was linked to 561 reported deaths, and cost more than $1.1B. A widely deployed sepsis prediction model achieved only 14.7% sensitivity in independent validation -- missing the large majority of sepsis patients across hundreds of hospitals. An oncology AI recommended unsafe treatments after being trained on synthetic data, resulting in a $4B+ write-off. In every case, the algorithm executed as designed. What failed was governance -- no behavioral monitoring, no drift detection, no population validation, no kill switch.

Every medical device platform governs the algorithm's output: sensitivity, specificity, AUC. No platform governs the algorithm's behavior: diagnostic drift, population shift, dosage boundary violations, unauthorized model updates. Agentomy closes that gap.

10 detection patterns mapped to real incidents and real regulations

Every pattern references a documented incident, a specific regulatory requirement, and a concrete detection method. The 21-method behavioral monitor runs continuously across the device software lifecycle. No theoretical threats. No generic compliance language.

Algorithm Drift Detection
Clinical algorithm deviating from validated behavioral baseline. Cosine distance from 30-day rolling baseline. Flag drift above 0.3, halt above 0.6. Detects the slow degradation that turns a validated model into an unsafe one.
Critical
Population Shift Detection
Algorithm operating on a patient population outside its validated demographic scope. Monitors age, sex, ethnicity, comorbidity distributions against the validation cohort. Catches the gap between clinical trial populations and real-world patients.
Critical
Dosage Boundary Violation
Algorithm recommending or controlling dosages outside validated therapeutic ranges. Hard limits per drug class with no soft override. Catches insulin over-delivery, infusion rate errors, and contraindicated dosage combinations before they reach the patient.
Critical
Unauthorized Model Update
Clinical model replaced or modified without governance approval. Model version changes require explicit authorization in the governance trail. Detects mid-deployment algorithm swaps that bypass predetermined change control plans.
Critical
Audit Gap Detection
Device software operating without producing required evidence records. Gaps in the audit trail mean gaps in regulatory defensibility. 21 CFR Part 11 requires complete, tamper-evident records for every electronic action in a regulated process.
High
Alert Fatigue Detection
Clinical alerts being systematically dismissed or ignored. Monitors override rates per alert type. When override rates exceed clinical norms, the alert system has become noise -- not safety. Detects the degradation that led to missed sepsis cases across hospital networks.
High
Consent Boundary Violation
Algorithm making clinical decisions beyond the scope of patient consent or regulatory authorization. Detects when a screening tool starts making diagnostic decisions, or when a monitoring device exceeds its cleared indications for use.
Critical
Interoperability Failure
Device software failing to correctly exchange data with connected clinical systems. Monitors FHIR message integrity, HL7 translation accuracy, and cross-system data consistency. Catches the Bluetooth failures and app-device communication breakdowns that caused insulin pump recalls.
Critical
Recall Response Failure
Device continuing to operate after a recall has been issued. Monitors recall databases and enforces immediate behavioral changes when a recall affects the governed device or its components. Catches devices that operate for years after safety notices.
Critical
Validation Gap Detection
Algorithm deployed without adequate clinical validation evidence. Detects missing demographic reporting, absence of randomized controlled trial data, and gaps between claimed performance and measured outcomes. 95.5% of recalled AI devices failed to report demographics.
High
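The Algorithm Drift rule above (cosine distance from a 30-day rolling baseline, flag above 0.3, halt above 0.6) can be sketched in a few lines of Python. This is a minimal illustration, assuming the algorithm's recent output profile is summarized as a numeric vector; the function names and vector representation are illustrative assumptions, not Agentomy's API:

```python
import math

FLAG_THRESHOLD = 0.3   # flag drift above this cosine distance
HALT_THRESHOLD = 0.6   # halt the algorithm above this

def cosine_distance(a, b):
    """1 minus cosine similarity between two output-profile vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return 1.0 - dot / (norm_a * norm_b)

def drift_action(baseline, current):
    """Compare today's output profile to the 30-day rolling baseline."""
    d = cosine_distance(baseline, current)
    if d > HALT_THRESHOLD:
        return "halt", d
    if d > FLAG_THRESHOLD:
        return "flag", d
    return "ok", d
```

The two-threshold design gives a graduated response: a flagged drift routes to human review while a halt-level drift stops the algorithm outright.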
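The Dosage Boundary Violation pattern is the simplest to illustrate: hard per-drug-class limits with no soft override, and unknown drug classes blocked by default. The limits and keys below are purely hypothetical examples; real bounds come from validated labeling, not from this sketch:

```python
# Hypothetical per-drug-class therapeutic ranges (min, max).
# Real limits come from the validated product labeling.
DOSAGE_LIMITS = {
    "insulin_u_per_hr": (0.0, 10.0),
    "heparin_units_per_kg_hr": (10.0, 25.0),
}

def authorize_dosage(drug_key, dose):
    """Hard limit: out-of-range doses are blocked with no soft override."""
    if drug_key not in DOSAGE_LIMITS:
        return False, "unknown drug class: blocked by default"
    lo, hi = DOSAGE_LIMITS[drug_key]
    if lo <= dose <= hi:
        return True, "within validated therapeutic range"
    return False, f"dose {dose} outside [{lo}, {hi}]"
```

Fail-closed behavior (deny anything unrecognized) is what distinguishes a governance gate from an advisory alert.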
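The Alert Fatigue pattern (monitoring override rates per alert type) can be approximated with a rolling window per alert type. The window size, sample minimum, and 90% threshold below are illustrative assumptions, not clinical norms:

```python
from collections import deque

class OverrideRateMonitor:
    """Rolling per-alert-type override rate; flags an alert type as noise
    when its override rate exceeds a configured ceiling."""

    def __init__(self, window=100, max_override_rate=0.9):
        self.window = window
        self.max_rate = max_override_rate
        self.events = {}  # alert_type -> deque of bools (True = overridden)

    def record(self, alert_type, overridden):
        q = self.events.setdefault(alert_type, deque(maxlen=self.window))
        q.append(overridden)

    def is_noise(self, alert_type):
        q = self.events.get(alert_type)
        if not q or len(q) < 20:  # require a minimum sample before judging
            return False
        return sum(q) / len(q) > self.max_rate
```

An alert type flagged as noise is a signal to retune the alert, not to silence it: systematically ignored alerts are how missed-sepsis failures propagate.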

5 governance layers for the medical device software lifecycle

Each layer enforces one aspect of SaMD governance -- from individual clinical decision validation to fleet-wide emergency halt across all connected devices.

Clinical Decision Gate
Every clinical algorithm output validated against approved therapeutic boundaries before reaching the patient. Blocks out-of-range dosages, flags contraindicated recommendations, enforces indication-for-use scope.
Algorithm Version Tracker
Tracks every model version with hash verification. Maps changes to PCCP-authorized modification boundaries. Unauthorized changes trigger immediate quarantine. Generates the validation evidence that predetermined change control plans require.
Population Scope Guard
Enforces demographic boundaries from the validation cohort. Monitors patient population characteristics in real time. Flags when real-world usage drifts outside the clinically validated population.
Recall Response Controller
Monitors FDA MAUDE, EUDAMED, and manufacturer recall databases. Enforces immediate behavioral changes when a recall affects governed devices. No manual intervention required. No delay between recall issuance and device response.
Compliance Evidence Generator
Produces audit-ready evidence packages for FDA 21 CFR Part 11, EU MDR post-market surveillance, and EU AI Act Article 72 monitoring obligations. Hash-chain tamper-evident records. Export in regulatory submission formats.
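The Algorithm Version Tracker's hash verification can be illustrated with a content fingerprint checked against an authorized set. The function names below are illustrative, not Agentomy's SDK; the idea is that any model artifact whose hash is not PCCP-authorized is quarantined:

```python
import hashlib

def model_fingerprint(model_bytes):
    """SHA-256 fingerprint of a deployed model artifact."""
    return hashlib.sha256(model_bytes).hexdigest()

def check_version(deployed_bytes, authorized_hashes):
    """Quarantine any model whose hash is not in the authorized set."""
    fp = model_fingerprint(deployed_bytes)
    if fp in authorized_hashes:
        return "authorized", fp
    return "quarantine", fp
```

Hashing the artifact rather than trusting a version string is what catches mid-deployment swaps that keep the version label but change the weights.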
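The Population Scope Guard's check can be sketched as a comparison of category proportions between the validation cohort and real-world usage. The 10% tolerance and the single-deviation metric are illustrative simplifications; a production monitor would use proper distribution tests:

```python
def population_shift(validated, observed, tolerance=0.10):
    """Flag demographic categories whose observed share deviates from the
    validation cohort by more than `tolerance` (an illustrative value)."""
    drifted = {}
    for category in set(validated) | set(observed):
        delta = abs(observed.get(category, 0.0) - validated.get(category, 0.0))
        if delta > tolerance:
            drifted[category] = round(delta, 3)
    return drifted  # empty dict means usage is within validated scope
```

Returning the per-category deltas, rather than a bare pass/fail, gives reviewers the evidence trail that post-market surveillance reporting expects.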
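The hash-chain tamper evidence behind the Compliance Evidence Generator can be shown in miniature: each record stores the previous record's hash, so editing any earlier entry breaks every hash downstream. This is a minimal sketch of the technique, not the product's record format:

```python
import hashlib
import json

def append_record(chain, event):
    """Append a tamper-evident record: each entry hashes its predecessor."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = json.dumps({"event": event, "prev": prev_hash}, sort_keys=True)
    chain.append({"event": event, "prev": prev_hash,
                  "hash": hashlib.sha256(body.encode()).hexdigest()})
    return chain

def verify_chain(chain):
    """Recompute every hash; any edit to an earlier record fails the check."""
    prev_hash = "0" * 64
    for rec in chain:
        body = json.dumps({"event": rec["event"], "prev": prev_hash},
                          sort_keys=True)
        if rec["prev"] != prev_hash or \
           rec["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
        prev_hash = rec["hash"]
    return True
```

The same property is what makes exported evidence defensible under 21 CFR Part 11: the records prove not only what happened but that nothing was altered afterward.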

10 frameworks, real enforcement deadlines, real penalties

Every control mapping references the actual regulatory document. No generic compliance language. All mappings are self-assessed, pending external validation. Penalty exposure ranges from FDA warning letters to EUR 35M under the EU AI Act.

Framework | Deadline | Scope
FDA PCCP Guidance | Final Dec 2024 | Predetermined change control plans for all AI-enabled device software. Allows post-market model changes without a new 510(k) -- if validated within the plan.
EU MDR 2017/745 | 2027 | Full application for SaMD. Article 17 (software reliability), Rule 11 (SaMD classification), Articles 83-86 (post-market surveillance). Legacy MDD certificates expiring.
EU AI Act (2024/1689) | Aug 2026 / Aug 2027 | SaMD classified as high-risk under Article 6(1). Risk management, transparency, and human oversight required. Penalties up to EUR 35M or 7% of global turnover.
FDA 21 CFR Part 11 | Active | Electronic records and signatures. Requires audit trails, access controls, and system validation. Every governance action must be signed and immutable.
HIPAA Security Rule | Active (updates 2026) | Protects ePHI. Proposed 2024 updates eliminate the addressable/required distinction -- all safeguards mandatory. $100-$50K per violation; criminal penalties up to 10 years.
FDA QMSR (21 CFR 820) | Feb 2026 | Quality Management System Regulation incorporating ISO 13485 by reference. Design controls, CAPA, complaint handling. 47 FDA warning letters in FY2024 (a 96% year-over-year increase).
ISO 13485:2016 | Active | Medical device QMS. Required for CE marking (EU), FDA compliance (US via QMSR), and most international markets. Non-compliance means you cannot sell.
IEC 62304 | Active | Medical device software lifecycle processes. Software safety classification (Class A/B/C). Required for EU MDR compliance; an FDA-recognized consensus standard.
IEC 82304-1 | Active | Health software product requirements for standalone software. Product-level safety, quality, and security across the lifecycle. Required for EU market access.
FDA AI/ML Action Plan | Ongoing | Five-part framework. Part 5 (real-world performance monitoring) is Agentomy's direct market. PCCP guidance, GMLP principles, transparency requirements.

561 deaths, $5B+ in losses, and a systemic governance gap

CPAP Device Recall, 2021-ongoing
15M devices recalled, 561 reported deaths
Foam degradation in sleep therapy devices. Class I recall. 116,000+ adverse events. $1.1B+ in costs. 6,000 layoffs. FDA consent decree. Devices continued operating for years after safety issues were identified. No automated recall response. No continuous post-market behavioral monitoring.

Detected by: Recall Response Failure, Audit Gap, Algorithm Drift

Sepsis Prediction Model, 2021-ongoing
14.7% sensitivity -- most sepsis patients missed
Proprietary black-box algorithm deployed in hundreds of hospitals. Independent validation showed it missed the large majority of sepsis cases within the 6-hour prediction window. Hospitals had no way to detect the performance gap because the algorithm was a closed system with no external behavioral monitoring.

Detected by: Algorithm Drift, Population Shift, Alert Fatigue, Validation Gap

Oncology AI, 2013-2022
$4B+ failure -- unsafe treatment recommendations
AI recommended contraindicated cancer treatments. Trained on synthetic data rather than real patient outcomes. 88% rejection rate by oncologists. $62M project killed. Entire health division sold for parts. No behavioral validation against real clinical outcomes. No population scope enforcement.

Detected by: Dosage Boundary Violation, Validation Gap, Population Shift

Four entry paths to governed medical device software

Connect any clinical platform through the protocol that fits your infrastructure. Gate mode for pre-decision authorization. Observer mode for post-market monitoring. Both modes produce the same audit trail.

MCP
Model Context Protocol
Native MCP integration for AI-driven clinical systems. Governance decisions flow through the same context window as clinical signals.
SDK
TypeScript / Python / Go
First-class SDK adapters for clinical infrastructure. FHIR-native integration. Go edge binary for on-premise deployments in regulated environments.
CLI
Command Line Interface
Governance operations from the terminal. Algorithm halt, fleet status, compliance evidence export, benchmark execution. Scriptable for CI/CD pipelines.
REST
REST API
Standard HTTP endpoints for any clinical platform. Pre-decision authorization, post-market reporting, fleet halt, audit trail queries. Platform-agnostic by design.

20 medical device governance scenarios. Run it yourself.

Suite 8: Medical Device SaMD Governance. 20 self-contained, idempotent scenarios across 4 coverage areas: authorization (5), audit trail (5), clinical boundary enforcement (5), and behavioral monitoring (5). Every scenario runs against the live governance layer. No mocks. No stubs.

# Run the medical device governance benchmark
$ npx agentomy-bench --suite medical-device

# Run a specific coverage area
$ npx agentomy-bench --suite medical-device --area clinical-boundary

# Export results for compliance evidence
$ npx agentomy-bench --suite medical-device --export json


Three commands to governed medical device software

# Install the governance adapter
$ npm install @agentomy/governance

# Authorize a clinical algorithm decision (pre-decision gate)
$ curl -X POST http://localhost:3000/api/claw/authorize \
  -H "Content-Type: application/json" \
  -H "X-API-Key: YOUR_API_KEY" \
  -d '{"agentId": "samd-diagnostic-radiology-01", "action": "write", "scope": "clinical_decision", "metadata": {"deviceClass": "II", "indication": "radiology_screening", "algorithmVersion": "v2.3.1"}}'

# Emergency halt -- all governed medical device algorithms
$ curl -X POST http://localhost:3000/api/claw/halt \
  -H "Content-Type: application/json" \
  -H "X-API-Key: YOUR_API_KEY" \
  -d '{"reason": "algorithm drift detected", "operatorId": "regulatory-affairs-01"}'

Govern your algorithms before regulators do it for you.

EU AI Act enforcement begins in August 2026. The EU MDR compliance deadline is 2027. The FDA issued its first AI-specific warning letter in April 2026. The compliance gap is closing.

Request Access