Policy Snapshot



Who It Affects: Decision Maker / Regulation & Market Design / Competition & Corporate Governance

Fiduciary Mandates

Legal duties of loyalty, care, and confidentiality imposed on AI developers and deployers, requiring them to act in users' and society's interests rather than solely maximizing profit.

What it is:

Fiduciary mandates would require AI developers to ensure their systems act in users' and society's genuine interests rather than optimizing solely for profit or engagement metrics that can accelerate labor displacement and exacerbate inequality.

Companies face an inherent tension between maximizing efficiency gains and protecting public welfare. Fiduciary mandates would convert vague ethical commitments into legally enforceable obligations: the duty of loyalty, for example, would prohibit knowingly deploying systems that displace workers faster than retraining can occur.

Recommended Reading:
Claire Boine (September 2023)

Boine argues that while information fiduciaries focus on data collection, "AI fiduciaries" must address broader harms arising from system behavior. She contends that AI companies meet the classic fiduciary criteria: users entrust power to systems they cannot monitor, creating a vulnerability that requires legal protection. Under her framework, the duty of loyalty becomes "goal and value alignment": the capacity of AI systems to act in the same interests as their beneficiaries.

Real-world precedents:

Fiduciary duties have governed professional relationships for centuries—physicians' Hippocratic obligations, attorneys' duties to clients, and financial advisors' obligations under the Investment Advisers Act of 1940.

Securing humanity's AI future

© 2026 Windfall Trust. All rights reserved.