HF3980 (Legislative Session 94 (2025-2026))

Online platforms required to provide information pertaining to algorithm use, design transparency, and user choice; civil penalties provided; and rulemaking authorized.

Related bill: SF4380

AI Generated Summary

Purpose

  • To protect Minnesota users by requiring online platforms to be transparent about how their recommendation algorithms work, provide clear design information, and give users real options and control over what is recommended to them. The bill also creates rules, reporting requirements, and penalties to ensure compliance.

Key concepts and definitions

  • Covered online platform: a platform that operates in Minnesota, uses one or more algorithmic recommender systems to rank or highlight items for users, and relies on user data (with certain exceptions for settings, device data, or independent search queries).
  • Algorithmic recommender system: the computer process that decides the order, ranking, or prominence of items shown to a user (posts, accounts, channels, products, ads, etc.).
  • Covered business and user: a business that operates a covered online platform and a user located in Minnesota.
  • Personal data: information linked or linkable to a person or device that can identify someone.
  • Holdout group: a group of users exempt from certain algorithm changes to test effects on performance and user experience.
  • Long-term holdout assessment: a process that runs changes to algorithm design for at least 12 months and measures long-term effects on users.
  • Long-term user value metrics: measurements that reflect outcomes aligned with a user’s longer-term preferences and goals.
  • High-value data: user-provided data or predictions drawn from user survey data.
  • Minor: a user who is considered a minor under objective standards and is known or should be known to be a minor by the platform.
  • Engagement data: data about how users interact with items (likes, comments, shares, dwell time, etc.).
  • User-provided data: preferences, settings, search queries, prompts, and other explicitly given information (not including engagement data).
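Conceptually, the "algorithmic recommender system" defined above is a process that orders items for a user from weighted inputs. A minimal, hypothetical sketch (the feature names, weights, and scoring rule are invented for illustration, not taken from the bill):

```python
# Illustrative sketch of an algorithmic recommender system: rank items by a
# weighted sum of engagement features. All names and values are hypothetical.
def rank_items(items: list[dict], weights: dict[str, float]) -> list[dict]:
    """Order items by descending weighted score of their features."""
    def score(item: dict) -> float:
        return sum(weights.get(k, 0.0) * v for k, v in item["features"].items())
    return sorted(items, key=score, reverse=True)

feed = [
    {"id": "post_a", "features": {"likes": 10, "dwell_time": 2.0}},
    {"id": "post_b", "features": {"likes": 3, "dwell_time": 9.0}},
]
weights = {"likes": 0.2, "dwell_time": 1.0}
print([item["id"] for item in rank_items(feed, weights)])  # → ['post_b', 'post_a']
```

Under the bill, the features ("inputs"), their data sources, and the weights in such a system are what a covered platform would have to disclose.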

Main provisions

  • Design transparency requirements

    • Platforms must prominently disclose on their site:
      • Each algorithmic recommender system in use.
      • The inputs to each algorithm and the data sources for those inputs.
      • The weights used in each algorithm, grouped into four categories by their relative importance to the output.
    • The state’s commerce commissioner will issue rules to clarify these disclosures.
    • Platforms must annually disclose their high-level objectives, key results, and the performance metrics used to evaluate the product teams responsible for designing the algorithms.
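One plausible reading of the "four categories by relative importance" requirement (the bill's term list mentions "quartile groups") is a rank-based split of the disclosed weights. A hypothetical sketch, with invented tier labels and weight values:

```python
# Hypothetical sketch: split a recommender's feature weights into four
# disclosure tiers by rank of absolute weight ("quartile groups"). The tier
# names and example weights are illustrative assumptions, not from the bill.
def group_weights_into_quartiles(weights: dict[str, float]) -> dict[str, list[str]]:
    """Sort features by absolute weight, then divide into four rank-based tiers."""
    ranked = sorted(weights, key=lambda k: abs(weights[k]), reverse=True)
    n = len(ranked)
    tiers = {"highest": [], "high": [], "moderate": [], "lowest": []}
    labels = list(tiers)
    for i, feature in enumerate(ranked):
        # Map rank position i to one of the four tiers.
        tiers[labels[min(i * 4 // n, 3)]].append(feature)
    return tiers

example = {
    "dwell_time": 0.9, "likes": 0.7, "shares": 0.5, "comments": 0.4,
    "recency": 0.3, "follows": 0.2, "device_type": 0.1, "locale": 0.05,
}
print(group_weights_into_quartiles(example))
```

A disclosure built this way reveals each input's relative importance without publishing exact proprietary weight values.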
  • User choice and default settings

    • For any service, product, or feature that uses personal data in an algorithm, the default setting must be configured to maximize one or more long-term user value metrics.
    • Platforms must provide an accessible user interface that lets users clearly and unambiguously tell the platform their preferences about the types of items shown or blocked.
    • Platforms must make reasonable efforts to ensure the algorithm outputs align with the user’s stated preferences.
    • Generally, a platform may not withhold, degrade, or raise prices of a product or feature as a punishment or consequence for a user exercising rights under these provisions (such as choosing a preferred item type).
  • Special rules for minors

    • For algorithmic recommendations that use personal data and are shown to a covered minor, the default must be set to maximize long-term user value metrics suitable for minors.
  • Long-term holdout assessments and disclosures

    • Platforms must maintain at least one holdout group and subject all design changes to a long-term holdout assessment.
    • Annually, platforms must publish a long-term holdout disclosure in an accessible location that includes:
      • The platform’s long-term user value metrics.
      • The aggregated, anonymized results for each metric for the holdout groups and for the rest of the user base.
    • The commerce commissioner will create rules governing how these holdout assessments work, including how to form holdout groups and what must be disclosed; a change may be exempt if it reduces or prevents direct and immediate harm to users without increasing engagement or revenue.
    • Platforms must obtain an independent audit of their long-term holdout assessments at least once per year. Audits must be performed by a qualified independent auditor with full cooperation from the platform, including access to necessary information and operations.
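The holdout mechanism described above resembles a long-running A/B test: a stable fraction of users is exempted from algorithm changes, and a long-term user value metric is compared in aggregate between that group and everyone else. A hypothetical sketch (the hashing scheme, bucket count, and function names are assumptions, not from the bill):

```python
# Hypothetical sketch of a long-term holdout assessment. Assignment is
# deterministic so the same users stay held out across the 12+ month window;
# the disclosure reports only aggregated, anonymized group averages.
import hashlib
from statistics import mean

def in_holdout(user_id: str, holdout_pct: float = 1.0) -> bool:
    """Deterministically bucket a user; buckets below the cutoff are held out."""
    bucket = int(hashlib.sha256(user_id.encode()).hexdigest(), 16) % 10_000
    return bucket < holdout_pct * 100  # e.g. 1.0% of 10,000 buckets -> 0..99

def holdout_disclosure(metric_by_user: dict[str, float]) -> dict[str, float]:
    """Aggregate a long-term user value metric separately for each group."""
    holdout = [v for uid, v in metric_by_user.items() if in_holdout(uid)]
    rest = [v for uid, v in metric_by_user.items() if not in_holdout(uid)]
    # Only anonymized aggregates leave this function, per the disclosure rules.
    return {
        "holdout_mean": mean(holdout) if holdout else 0.0,
        "rest_mean": mean(rest) if rest else 0.0,
    }
```

Publishing both group averages lets outsiders see whether the platform's design changes helped or hurt users relative to the unchanged baseline.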
  • Enforcement and remedies

    • Violations are considered unfair and deceptive acts under Minnesota law.
    • The attorney general may enforce these provisions.
    • A user injured by a violation may seek remedies, including:
      • Monetary damages: $5,000 per user per violation (adjusted annually for inflation) or actual damages, whichever is greater.
      • Punitive damages: $7,000 per violation (adjusted annually for inflation) or actual damages, whichever is greater, for reckless or knowing violations.
      • Reasonable attorney fees and litigation costs.
      • Other relief, including injunctive or declaratory relief as appropriate.
    • The section must be construed consistently with other laws and constitutional protections, including the First Amendment and Section 230, and it does not override laws that provide greater consumer protection.
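The damages formula above is a straightforward greater-of calculation. A hypothetical sketch (the CPI adjustment factor and function names are illustrative assumptions; the $5,000 and $7,000 figures come from the summary above):

```python
# Hypothetical sketch of the remedy calculation: statutory damages per
# violation (inflation-adjusted) or actual damages, whichever is greater.
def monetary_damages(violations: int, actual_damages: float,
                     cpi_factor: float = 1.0) -> float:
    """$5,000 per user per violation, or actual damages if greater."""
    statutory = 5_000 * violations * cpi_factor
    return max(statutory, actual_damages)

def punitive_damages(violations: int, actual_damages: float,
                     cpi_factor: float = 1.0) -> float:
    """$7,000 per violation, or actual damages if greater; applies only to
    reckless or knowing violations."""
    statutory = 7_000 * violations * cpi_factor
    return max(statutory, actual_damages)

print(monetary_damages(violations=2, actual_damages=8_000))  # → 10000
```

With two violations, the statutory figure ($10,000) exceeds the $8,000 in actual damages, so the statutory amount applies.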
  • Relationship to other laws

    • This section adds to and does not limit other Minnesota laws, regulations, and common law.
    • If there is a conflict between this section and another law, the law that provides the greatest consumer protection applies.

Practical implications

  • Online platforms in Minnesota would need to build and publish detailed disclosures about how their recommendation systems work, including inputs and weights.
  • Platforms would need to offer clear, accessible controls for users to set preferences about what is recommended or blocked, and ensure outputs align with user choices.
  • Platforms must run long-term holdout assessments, publish annual disclosures, and undergo independent audits.
  • Violations carry monetary and other remedies, with penalties adjusted for inflation over time.

Why this matters for users

  • Users get more transparency about why they see certain content and more control over what is shown to them.
  • There are formal mechanisms to test the long-term impact of algorithm changes on user experience and to hold platforms accountable for misleading or harmful practices.

Relevant terms: algorithmic recommender system; covered online platform; covered business; user; personal data; holdout group; long-term holdout assessment; long-term user value metrics; high-value data; user-provided data; engagement data; input; data source; weights; quartile groups; accessible user interface; major/minor (covered minor); holdout disclosure; independent audit; unfair and deceptive act; enforcement; damages; CPI adjustment; First Amendment; Section 230; rulemaking; commissioner of commerce.

Actions

Date           | Chamber | Where | Type   | Name                                        | Committee Name
March 05, 2026 | House   |       | Action | Introduction and first reading, referred to | Commerce Finance and Policy

Citations

 
  • U.S. Const. amend. I: nothing in this section may be construed in a manner inconsistent with the First Amendment, preserving free speech protections; the bill does not modify First Amendment rights.
  • 47 U.S.C. § 230: referenced to ensure compatibility with federal protections for online platforms; the bill does not modify Section 230.
  • Minn. Stat. § 8.31: the attorney general may enforce this section under section 8.31; the bill makes no change to section 8.31 itself.
  • Minn. Stat. § 325D.44: referenced as the basis for treating violations as unfair and deceptive acts; the bill does not modify section 325D.44.
  • Minn. Stat. § 325D.45: referenced for remedies and damages; the bill does not modify section 325D.45.