Abstract
In 2023, hackers breached 23andMe and extracted the biometric and genealogical data of nearly seven million people. By 2025, that data – originally offered up in the spirit of medical discovery and consumer empowerment – was being auctioned off in bankruptcy court as a corporate asset. The breach exposed more than technical lapses or governance failures. It revealed a structural gap in how the law understands, regulates, and protects biometric identity. Biometric data is intimate, immutable, and implicates human dignity and autonomy. And yet, the legal frameworks most often tasked with protecting it – tort, contract, and even modern privacy statutes – have consistently failed to impose meaningful obligations on the entities that collect, retain, and monetize it.

This Article joins a growing body of scholarship urging courts to take bailment seriously as a tool for digital governance, and pushes that conversation forward by making the case for its specific application to biometric data. Bailment, a longstanding common law doctrine, imposes duties not through contract but through custody. It recognizes obligations where property is entrusted, even implicitly, and it treats the breach of those obligations as actionable harm – sidestepping procedural roadblocks that frustrate most privacy litigation. Courts have already extended bailment to encrypted records, digital identifiers, and biological materials such as cryogenically stored embryos. Biometric data fits naturally within that trajectory.

Critically, this Article addresses the most entrenched obstacle to biometric bailment: the widespread use of contractual waivers, disclaimers, and arbitration clauses to shield data holders from liability. It argues that these contractual shields are not absolute – and that existing common law doctrines, including unconscionability, public policy limits, and judicial flexibility in recharacterizing relationships, give courts the tools to restore baseline obligations even in the face of formally disclaimed duties. Despite its doctrinal fit and practical advantages, bailment remains dramatically underused in the data privacy context. This Article argues that courts should reclaim it – not as a novel intervention, but as a functional, flexible, and doctrinally grounded response to a rapidly expanding category of legal exposure. In doing so, it offers a path toward real accountability for those who profit from data that cannot be revoked, replaced, or forgotten.
© 2025 Walter de Gruyter GmbH, Berlin/Boston
Articles in this issue
- Frontmatter
- Articles
- AI Liability Along the Value Chain: Lessons from the Liability of Suppliers of Components in Product Liability Law
- Tort Liability for Failure to Age Gate: A Promising Regulatory Response to Digital Public Health Hazards
- Swords and Shields: Impact of Private Standards for Liability Determinations of Autonomous Vehicles
- Digital Malpractice: The Role of Professional Negligence and Public Policy in the Regulation of Digital Platforms
- Algorithms and the Privacy Torts
- Bailing Out Biometrics
- A Novel Tort Duty for Platforms That Intermediately Produce Real World User Interactions
- Essay
- Virtual Dignitary Torts