HF4452 (Legislative Session 94 (2025-2026))
Artificial intelligence chatbot technology requirements provided, and cause of action for harm created.
AI Generated Summary
Purpose
This bill would create a civil-law framework in Minnesota to govern artificial intelligence chatbots. It aims to:
- establish liability for harm caused by AI chatbots,
- require clear notices that users are interacting with an AI,
- restrict chatbots from providing medical, mental health, or legal advice that would require a professional license,
- and add safeguards, especially for minors, to prevent self-harm and provide crisis resources.
Key Definitions
- AI system: a machine-based system that infers outputs (content decisions, predictions, or recommendations) from inputs, which can influence a user’s environment. It excludes basic software like antivirus, spellcheck, autocorrect, calculators, data storage, certain internet/networking tools, and internal-management tools.
- Chatbot: an AI system, software, or app that simulates humanlike conversation or interaction through text, voice, or both to provide information and services.
- Companion chatbot: a chatbot designed to provide humanlike interaction that simulates an interpersonal relationship, using past interactions to shape future ones (e.g., romantic, platonic, familial, professional, or therapeutic relationships).
- Humanlike: any form of communication or interaction that resembles human behavior or could be attributed to a human actor.
- Proprietor: the person, business, organization, or government entity that owns, operates, or deploys a chatbot for user interaction. This does not include a third-party developer licensed to supply the chatbot technology and not in direct control of the chatbot.
- User: a human in Minnesota who uses a chatbot.
- Section 604.115: the new Minnesota Statutes section the bill creates to govern chatbot liability.
Prohibited actions and liability
- Prohibited actions by a chatbot:
  - A chatbot must not provide substantive information or advice, or take actions, that would require a professional license under Minnesota law (e.g., medical, mental health, or legal services).
  - A proprietor cannot avoid liability by telling users that they are interacting with a nonhuman chatbot.
- Civil action and damages:
  - A person may sue for general and special damages for violations of these provisions.
  - If a proprietor willfully violates these rules, the proprietor may also be liable for court costs, reasonable attorney fees, and related disbursements.
- Notice of in-state use:
  - Proprietors must clearly, conspicuously, and explicitly notify users in Minnesota that they are interacting with an AI chatbot, in the same language the chatbot uses and in a readable type size.
Notice requirements
- The notice must be clear and easily readable and appear in the user’s language, ensuring users understand they are engaging with an artificial intelligence chatbot.
Companion chatbot safeguards for minors
- The proprietor must make a prudent, good-faith effort to prevent a companion chatbot from promoting, causing, or assisting self-harm, and to assess whether a user is expressing thoughts of self-harm.
- If the companion chatbot promotes, causes, or aids self-harm, or a user expresses thoughts of self-harm, the proprietor must restrict continued use for at least 72 hours and prominently display contact information for a suicide crisis organization to the user.
- Liability for self-harm:
  - If a proprietor fails to comply and self-harm occurs due in part to the chatbot's actions, the proprietor is liable to the user for damages.
  - Regardless of compliance, a proprietor is liable for general and special damages if the proprietor has actual knowledge that the chatbot is promoting self-harm, or that a user is expressing thoughts of self-harm, and fails to restrict continued use or display crisis-contact information.
- No waiver of liability:
  - Proprietors cannot waive or disclaim liability under this subdivision.
- Minor status and vulnerability checks:
  - Proprietors must make prudent efforts, using available industry-standard techniques and resources, to determine whether a user is a minor.
  - A proprietor is strictly liable for any harm a minor suffers from using the companion chatbot if the harm results from a failure to comply with these safeguards.
  - Proprietors must also attempt to identify user vulnerabilities as part of determining minor status.
Minor status determination and vulnerabilities
- The bill requires ongoing, good-faith efforts to identify whether a user is a minor and to uncover vulnerabilities that could lead to harm.
Additional notes
- The bill specifies that the proprietor cannot waive liability for issues involving self-harm or minor users.
- It introduces a dedicated framework for AI chatbot governance, focusing on transparency (notice to users), safety safeguards (especially for minors), and accountability (civil liability for violations).
Relevant changes to existing law
- Establishes a new liability regime specific to AI chatbots and companion chatbots in Minnesota.
- Creates mandatory notices informing users that they are interacting with an AI.
- Prohibits chatbots from providing advice that would require a professional license and makes noncompliance actionable in civil court.
- Imposes strict liability for harm to minor users and knowledge-based liability for self-harm risks, with a specific 72-hour suspension and crisis-contact requirements.
- Requires proactive safety measures and vulnerability assessments to determine minor status and reduce risk.
Relevant Terms
- Artificial intelligence (AI) system
- Chatbot
- Companion chatbot
- Humanlike
- Proprietor
- User
- Minnesota
- Minnesota Statutes chapter 604
- Professional license (ch. 147, ch. 148E)
- Mental health advice
- Medical care
- Legal advice
- Self-harm
- Suicide crisis organization
- Notice (clear, conspicuous, explicit)
- 72 hours (temporary suspension period)
- General damages
- Special damages
- Attorney fees
- Vulnerabilities
- Minor (age determination)
- Willful violation
- Civil action
- Damages and disbursements
Bill text versions
- Introduction (PDF)
Actions
| Date | Chamber | Where | Type | Name | Committee Name |
|---|---|---|---|---|---|
| March 18, 2026 | House | Action | Introduction and first reading, referred to | Commerce Finance and Policy |
Citations
[
{
"analysis": {
"added": [],
"removed": [],
"summary": "The bill references Minnesota Statutes chapter 147 as part of licensure considerations related to chatbot actions that would require professional licensure if performed by a natural person.",
"modified": []
},
"citation": "147",
"subdivision": ""
},
{
"analysis": {
"added": [],
"removed": [],
"summary": "The bill references Minnesota Statutes chapter 148E in connection with licensure requirements for professional services that could be rendered by a chatbot.",
"modified": []
},
"citation": "148E",
"subdivision": ""
},
{
"analysis": {
"added": [],
"removed": [],
"summary": "The bill references Minnesota Statutes section 481.02 related to professional licensing rules applicable to providing legal advice via chatbot technology.",
"modified": []
},
"citation": "481.02",
"subdivision": ""
}
]