Ethical and Regulatory Guide

✅ PART 1: ETHICAL AND REGULATORY GUIDELINES FOR DEVELOPERS OF THE AI COACH

🎯 Purpose:

To provide the BPD Coach development team with a clear, academically and professionally sound framework to ensure the tool operates ethically, legally, and safely within the UK.

1. 💼 Foundational Principles

All features, scripts, and interactions must adhere to the following core principles:

  • Do No Harm (British Psychological Society, NICE, NHS values)
  • Informed Consent
  • Confidentiality and Privacy
  • Transparency of AI Use
  • Evidence-Based Interventions Only
  • Respect for Vulnerable Users

2. 🧠 Clinical Scope & Limits

  • The tool must not provide:
    • Clinical diagnoses
    • Treatment recommendations
    • Risk assessments
    • Emergency support
  • It can provide:
    • Psychoeducation grounded in DBT, MBT, CAT, Schema Therapy, or GPM
    • Guided skills and role-plays for carers
    • Resources, reflection prompts, and signposting to support services
  • Each answer must include:
    • A disclaimer about the tool not being a substitute for therapy
    • Crisis signposting (e.g. Samaritans, 999) if a risk situation is flagged
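
The composition rule above can be enforced mechanically in the response pipeline. The sketch below is illustrative only: the function names, RISK_TERMS list, and message wording are assumptions, and a real deployment would use a clinically vetted risk classifier and approved copy.

```python
# Illustrative sketch only: every answer carries the mandatory disclaimer,
# and crisis signposting is prepended when a risk flag is raised.
# RISK_TERMS, the message wording, and compose_reply are assumptions,
# not production logic.

DISCLAIMER = (
    "The BPD Coach provides psychoeducation for carers and is not a substitute "
    "for therapy, diagnosis, or professional advice."
)

CRISIS_SIGNPOST = (
    "If you or the person you care for is at immediate risk, call 999. "
    "You can also contact Samaritans (116 123) or Shout (text 85258)."
)

RISK_TERMS = {"suicide", "self-harm", "overdose", "end my life"}


def is_risk_flagged(user_message: str) -> bool:
    """Rough keyword check standing in for a proper risk classifier."""
    text = user_message.lower()
    return any(term in text for term in RISK_TERMS)


def compose_reply(user_message: str, model_answer: str) -> str:
    """Attach the disclaimer to every answer; add signposting when flagged."""
    parts = [model_answer, DISCLAIMER]
    if is_risk_flagged(user_message):
        parts.insert(0, CRISIS_SIGNPOST)
    return "\n\n".join(parts)
```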

3. 📘 Academic and Professional Standards

The following sources should inform development:

  • British Psychological Society (BPS):
    Ethics Guidelines for Internet-Mediated Research
  • NICE Guidelines for Borderline Personality Disorder
  • UK GDPR & Data Protection Act 2018
    Privacy by Design, Data Minimisation, User Rights
  • Online Safety Act 2023
    Duties to reduce harmful content and ensure safety in digital mental health contexts

4. ๐Ÿ” Data Privacy & Security Requirements

  • Comply with UK GDPR:
    • Data minimisation (store only what is needed)
    • Clear, accessible privacy policy
    • Explicit consent for data use
    • Right to access, modify, delete user data
  • Use secure hosting, encryption (at rest and in transit), and two-factor authentication for admin dashboards.
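
As a concrete illustration of data minimisation and user rights, the sketch below stores a deliberately small, pseudonymous user record and exposes access and erasure helpers. The field names and structure are assumptions for illustration, not a prescribed schema.

```python
# Illustrative sketch of data minimisation and user rights (UK GDPR).
# Field names and structure are assumptions, not a prescribed schema.

from dataclasses import dataclass, asdict
from datetime import datetime, timezone


@dataclass
class UserRecord:
    user_id: str            # pseudonymous identifier, not an email address
    consent_given_at: str   # ISO 8601 timestamp of informed consent
    age_confirmed: bool     # 18+ confirmation only; no date of birth stored
    research_opt_in: bool   # optional anonymised-use-data permission


def create_user_record(user_id: str, age_confirmed: bool, research_opt_in: bool) -> UserRecord:
    """Store only what the service needs to operate."""
    return UserRecord(
        user_id=user_id,
        consent_given_at=datetime.now(timezone.utc).isoformat(),
        age_confirmed=age_confirmed,
        research_opt_in=research_opt_in,
    )


def export_user_data(record: UserRecord) -> dict:
    """Right of access: return everything held about the user."""
    return asdict(record)


def delete_user_record(store: dict, user_id: str) -> None:
    """Right to erasure: remove the record entirely."""
    store.pop(user_id, None)
```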

5. 🚨 Crisis Escalation Protocol

  • If a user inputs terms indicating suicide, self-harm, or immediate risk, the Coach should:
    1. Pause interaction
    2. Display a crisis message
    3. Signpost to 999, Samaritans (116 123), or Shout (text 85258)
    4. Recommend contacting a GP or local emergency service
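
A minimal sketch of how these four steps might be wired together is shown below. The Session object, keyword check, and crisis message wording are assumptions; the real detection logic and copy would need clinical review.

```python
# Illustrative sketch of the four-step crisis escalation protocol.
# The Session object, keyword check, and message wording are assumptions.

CRISIS_MESSAGE = (
    "It sounds like you or someone you care for may be at risk. "
    "The Coach cannot provide emergency support.\n"
    "- Call 999 if there is immediate danger.\n"
    "- Samaritans: 116 123 (free, 24/7).\n"
    "- Shout: text 85258.\n"
    "- Please also contact your GP or local emergency service."
)

RISK_TERMS = {"suicide", "self-harm", "end my life", "overdose"}


class Session:
    """Minimal stand-in for a chat session."""

    def __init__(self) -> None:
        self.paused = False
        self.messages = []  # transcript of messages shown to the user

    def send(self, text: str) -> None:
        self.messages.append(text)


def handle_user_message(session: Session, user_message: str) -> bool:
    """Return True if the crisis protocol was triggered."""
    if not any(term in user_message.lower() for term in RISK_TERMS):
        return False
    session.paused = True          # 1. pause the normal coaching interaction
    session.send(CRISIS_MESSAGE)   # 2-4. crisis message, signposting, GP advice
    return True
```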

6. 📜 Transparency and Consent

  • The tool must clearly communicate:
    • That it is an AI, not a human
    • Its purpose and limitations
    • How user data is used and stored
  • At account creation or first use, the tool must obtain:
    • Informed consent (tick box with terms summary + link to full terms)
    • Age confirmation (18+)
    • Permission to store anonymised use data (optional, for research)
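
The sketch below shows one way these sign-up checks could be validated server-side; the field names, error messages, and returned structure are assumptions for illustration.

```python
# Illustrative sketch: validating the consent submitted at account creation.
# Field names, error messages, and the returned structure are assumptions.

from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class ConsentSubmission:
    accepted_terms: bool      # tick box with terms summary + link to full terms
    confirmed_over_18: bool   # age confirmation
    research_opt_in: bool     # optional anonymised-use-data permission


def validate_consent(submission: ConsentSubmission) -> dict:
    """Reject sign-up unless the mandatory confirmations are given."""
    if not submission.accepted_terms:
        raise ValueError("Informed consent to the terms of use is required.")
    if not submission.confirmed_over_18:
        raise ValueError("The Coach is only available to users aged 18 or over.")
    return {
        "consented_at": datetime.now(timezone.utc).isoformat(),
        "research_opt_in": submission.research_opt_in,
    }
```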

7. 🧪 Testing and Evaluation

  • Pilot testing with a sample of carers and expert reviewers
  • Feedback loop for error reporting and suggestion submission
  • Academic oversight (e.g. ethical advisory board or university ethics submission)

✅ PART 2: USER DOCUMENTATION – LIST OF REQUIRED FORMS AND POLICIES

To remain compliant and user-focused, the following documents/forms are recommended:

  • Terms of Use: Legal agreement covering acceptable use, liability, and disclaimers
  • Privacy Policy: Explains how personal data is collected, stored, and protected
  • Consent Form: Optional tick-box at sign-up confirming understanding and agreement to use
  • Informed Use Summary: A one-page plain-English summary of how the Coach works, its limits, and when to seek real-world help
  • Crisis Information Sheet: A downloadable PDF with national crisis resources, helplines, and emergency contacts
  • Feedback and Complaints Form: Allows users to report problems, unsafe responses, or ethical concerns
  • Accessibility Statement: Confirms that efforts have been made to make the site usable by people with disabilities
  • Ethics Statement: Short statement that the tool is built on BPS and NICE principles and reviewed by clinical advisors