Informed consent section addition

I have added the wording below to my basic informed consent document. Perhaps something like this could be included as a default, since some clinicians may not understand the issues that using AI can present.

5. The use of Artificial Intelligence

Informed disclosure: In order to streamline my workflow and provide focused care while in session with clients, I sometimes use a HIPAA-compliant AI documentation tool to help draft my progress notes, rather than taking notes by hand. If used, the tool processes audio from our session to create a draft that I review, edit, and finalize. The tool’s vendor, Upheal, is bound by a Business Associate Agreement and may not use your information for other purposes. You can opt out at any time; I will then complete notes without AI assistance.

****************************

Some clients may choose to use artificial intelligence (AI)–based platforms, applications, or chatbots that are designed to provide mental health or psychotherapy-related information or support. Please be advised of the following:

• These AI tools are not a substitute for professional psychotherapy or emergency services. Some AI tools are designed to simulate human interaction in a way that can confuse users. They are not capable of assessing your individual circumstances, ensuring confidentiality, or providing personalized clinical care. AI tools may provide inaccurate, incomplete, or inappropriate information, and they may offer advice that conflicts with my treatment, causes confusion, or is counterproductive.

• Confidentiality and privacy risks: Information you share with AI platforms may not be protected by federal or state confidentiality laws (including HIPAA). Your personal health information could be stored, used, or shared by the company that operates the AI system.

• Misdiagnosis or harm: AI platforms may generate advice or interpretations that are incorrect or misleading. I am not responsible for any outcomes, decisions, or consequences that result from your use of such platforms.

• Legal prohibition and regulation: Use of AI tools for therapeutic purposes may be prohibited or regulated by state and federal laws, particularly where minors are involved.

• Emergency or crisis situations: If you are experiencing a mental health crisis, suicidal thoughts, or feel unsafe, do not rely on an AI system for help. Instead, contact 988 (Suicide and Crisis Lifeline), call 911, or go to the nearest emergency department. You may also contact me directly through the procedures outlined in this consent form, but in an emergency, always seek immediate assistance from the crisis resources above.

By signing this consent form below, you acknowledge that you understand the potential risks of using AI-based mental health platforms on your own, and that such use is entirely voluntary and independent of the professional services I provide. I encourage you to inform me if you are considering using or are using an AI tool for support so we can discuss the potential benefits and risks.

Status: In Review
Board: 💡 Feature Request
Date: 3 days ago
Author: Katerina Philbrick
