Writing Safer Prompts: What to Share with AI Tools and What to Leave Out

AI can help with advocacy writing – from structuring complaints to clarifying letters – but not every detail belongs in a chat box. Here’s how to use AI tools safely, protect privacy, and care for your emotional wellbeing while getting the help you need.

Estimated time: 20 minutes
Safety and Boundaries

AI tools can help advocates summarise stories, draft emails and plan actions – but they also raise big questions: “Is it safe to paste this in?” “How much detail is too much?” “What if someone else reads this later?”

This guide offers simple, plain-language ways to write safer prompts, manage privacy, and avoid re-triggering yourself while using AI.

Why Prompt Safety Matters in Advocacy

Advocacy often involves sensitive information – from health and disability details to financial stress, housing struggles, or unsafe practices. Sharing too much in an AI tool can lead to unwanted exposure or emotional exhaustion. Three things are worth protecting:

  • Privacy: protecting yourself and others from unnecessary exposure.
  • Power: staying in control of what’s shared and what happens next.
  • Emotional load: reducing how much you have to relive or rewrite painful experiences.

A Simple Rule of Thumb

Before you paste or type anything into an AI tool, pause and ask: “Do I really need to include this detail for the tool to help me?”

If the answer is “no” or “I’m not sure,” summarise, anonymise, or leave it out. The goal is usefulness, not exposure.

Safer Prompts: Four Key Ideas

  • De-identify where you can
  • Summarise instead of pasting everything
  • Describe patterns, not full case histories
  • Control the emotional “zoom level”

1. De-Identify Where You Can

De-identifying means removing or changing details that could easily reveal who’s involved. Be cautious with:

  • Full names or addresses
  • Emails, phone numbers, case IDs, client numbers
  • Exact school, service or clinic names (especially in small towns)
  • Dates combined with rare circumstances

Instead of: “On 14 March 2025, Dr Sarah Jones at Sunshine Paediatrics wrote a report about my son, Jack Smith.”
Try: “In early 2025, a paediatrician wrote a report about my child at a local clinic.”

The AI can still help you structure your story or letter – it doesn’t need the identifying parts.

2. Summarise Instead of Pasting Everything

Instead of pasting full reports, write short summaries like:

“This report recommends more therapy hours but doesn’t mention safety concerns.”

Then ask AI:

“Here is a summary of what this report says. Please help me list the main points and draft a few respectful questions I could ask about it.”

You still get support – without sharing sensitive attachments.

3. Describe Patterns, Not Full Case Histories

AI doesn’t need every incident to help you describe the big picture. For example:

“Over the last two years, I’ve had repeated cancellations and unsafe practices from my support service. Please help me summarise the pattern and explain why it’s a problem.”

Keep your detailed records offline for formal use – AI’s role is to help with the wording, not the evidence.

4. Control the Emotional “Zoom Level”

Choose how close you get to painful details. Think of it like camera distance:

  • Close-up: every detail and quote – often overwhelming.
  • Mid-shot: general description with examples.
  • Wide shot: broad overview and impact.

If it’s too much, stay at mid- or wide-shot level and tell AI:

“Please help me describe this situation in general terms so the seriousness comes through without every detail.”

Concrete Prompt Patterns for Safer Use

These are examples you can copy and adapt:

Help me with structure, not content:
“I’m writing about a sensitive situation. Please help me structure my letter and headings, but don’t ask for extra detail.”

Rewrite without extra detail:
“Here’s my draft paragraph. Please simplify it, keep meaning, and don’t add new facts.”

Anonymise this text:
“Replace names with neutral labels like ‘the service’ or ‘my child’, without changing meaning.”

Prompts That Protect Your Boundaries

You can set limits in the way you talk to AI:

  • “Please don’t minimise what happened, but keep it short.”
  • “Don’t tell me to ‘see both sides’. Just help me organise my points.”
  • “If my question needs legal or clinical advice, remind me to speak to a human.”

You’re always in charge of the tone and depth of your own prompts.

When It’s Better Not to Use AI at All

Crisis and Immediate Safety

If you’re in danger or in a mental health crisis, reach out to emergency or crisis services, not an AI chat. You can use AI later for summaries, not during the crisis itself.

Legal or Court Matters

AI tools are not legal advisors. Use them only to organise notes or prepare questions for your lawyer – never for legal strategy or submissions.

Other People’s Stories Without Consent

Never share others’ stories in full without permission. Summarise or anonymise patterns instead.

Emotional Safety While Prompting

Set limits to protect yourself while using AI:

  • Work in short sessions – 15 minutes at a time.
  • Alternate between heavy and light tasks.
  • Notice your body and take breaks if needed.
  • Reach out to humans when things feel too much.

AI can help with writing, but it cannot hold you like people can.

A Small Safer-Prompting Checklist

  • Have I removed names or IDs I don’t need?
  • Could I summarise this instead of pasting everything?
  • Am I describing patterns, not every detail?
  • Is AI the right tool for this situation?
  • Am I emotionally okay to work on this now?

If not, adjust your prompt or come back later.

A Gentle Way to Practise Safer Prompts

  1. Pick a low-stakes example or resolved issue.
  2. Write a detailed version (with fake names).
  3. Practise turning it into a de-identified summary.
  4. Ask AI whether it still understands the situation.

Most of the time, you’ll find that AI can help without you revealing everything.

Safer prompting is not about paranoia – it’s about choice. You decide what to share, when to share it, and how close you get to painful details. AI is one tool in your advocacy kit – but you hold the pen.