## What you're building
A Claude Code skill that connects to Proton Mail through Bridge's local IMAP server, scans your inbox by sender frequency, and batch-trashes emails you approve. Every action is logged to a CSV. The skill follows Anthropic's skill spec — folder with a SKILL.md, scripts, and reference docs — so Claude loads it automatically when you mention inbox cleanup.
The underlying pattern is standard IMAP via Python's stdlib `imaplib`, so it can be adapted for other providers (Gmail, Fastmail, etc.). But this recipe was built and tested specifically against Proton Mail Bridge, including the quirks that come with it.
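Since nothing here is Proton-specific beyond the connection settings, adapting the recipe is mostly a matter of host, port, and TLS mode. A sketch of that idea; the `PROVIDERS` table and `imap_settings` helper are illustrative, not part of the skill:

```python
# Hypothetical provider presets; the hosts/ports are each provider's
# published IMAP endpoint, but this table is not part of the skill itself.
PROVIDERS = {
    'proton-bridge': {'host': '127.0.0.1', 'port': 1143, 'security': 'STARTTLS'},
    'gmail': {'host': 'imap.gmail.com', 'port': 993, 'security': 'SSL'},
    'fastmail': {'host': 'imap.fastmail.com', 'port': 993, 'security': 'SSL'},
}


def imap_settings(provider):
    """Look up connection settings for a known provider (KeyError otherwise)."""
    return PROVIDERS[provider]
```

Note that for Gmail you'd also need an app password or OAuth instead of the Bridge-generated credentials.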
## Prerequisites
- Proton Mail Bridge installed and running
- A Proton Mail account (free or paid)
- Python 3 (stdlib only — no pip installs)
- Claude Code
- Environment variables set in your shell profile:
```bash
# ~/.zshrc or ~/.bashrc
export PROTON_USERNAME="you@protonmail.com"
export PROTON_PASSWORD="bridge-generated-password"  # from Bridge app UI
export PROTON_IMAP_PORT="1143"
export PROTON_SECURITY="STARTTLS"
```
The Bridge password is not your Proton account password. Open the Bridge app, click your account, and copy the generated IMAP/SMTP password.
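It's worth failing fast if any of these are unset rather than hitting a confusing auth error mid-run. A minimal preflight sketch; the `missing_vars` helper is illustrative:

```python
import os

REQUIRED = ['PROTON_USERNAME', 'PROTON_PASSWORD',
            'PROTON_IMAP_PORT', 'PROTON_SECURITY']


def missing_vars(env=None):
    """Return the required variables that are unset or empty."""
    env = os.environ if env is None else env
    return [name for name in REQUIRED if not env.get(name)]
```

The skill can call this at startup and report the missing names before attempting a connection.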
## Build the skill

### Folder structure

```
your-project/
└── .claude/skills/protonmail-imap/
    ├── SKILL.md               # Skill definition with YAML frontmatter
    ├── scripts/
    │   └── imap_connect.py    # Reusable connection helper
    └── references/
        └── bridge-quirks.md   # IMAP quirks for your provider
```
### SKILL.md
The frontmatter is what Claude reads first to decide whether to load the skill. The description must include trigger phrases — what the user might say that should activate this skill.
```yaml
---
name: protonmail-imap
description: >-
  Manages email inbox via IMAP through Proton Mail Bridge.
  Scans senders, batch-deletes newsletters, searches messages, and
  organizes folders. Use when user mentions "inbox", "clean up email",
  "delete emails", "scan senders", "proton mail", or "email cleanup".
  Requires Proton Mail Bridge running locally.
compatibility: >-
  Requires Proton Mail Bridge desktop app. Python 3 with
  imaplib (stdlib). Environment variables PROTON_USERNAME,
  PROTON_PASSWORD, PROTON_IMAP_PORT, PROTON_SECURITY must be set.
metadata:
  author: Solutions Cay
  version: 1.0.0
---
```
The body of SKILL.md contains the step-by-step instructions Claude follows. Structure it as numbered steps with clear decision points:
1. Verify Bridge is running (`ps aux | grep proton`)
2. Connect via the helper script
3. Determine the operation: scan, cleanup, search, or organize
4. Execute with resilience rules
5. Report results and log location
Keep the body focused on workflow. Move provider-specific details to references/.
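The liveness check in the first step can be scripted rather than eyeballed. A sketch; the `bridge_running` helper is illustrative and simply greps the process list:

```python
import subprocess


def bridge_running():
    """Return True if a Proton Mail Bridge process appears in `ps aux` output."""
    out = subprocess.run(['ps', 'aux'], capture_output=True, text=True).stdout
    return 'proton' in out.lower()
```

If this returns False, the skill should stop and tell the user to launch the Bridge app before retrying.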
### The connection helper

`scripts/imap_connect.py` handles auth and TLS. Bridge runs on localhost with a self-signed cert, so you skip hostname verification.
```python
import imaplib
import ssl
import os


def connect():
    username = os.environ['PROTON_USERNAME']
    password = os.environ['PROTON_PASSWORD']
    port = int(os.environ.get('PROTON_IMAP_PORT', '1143'))
    security = os.environ.get('PROTON_SECURITY', 'STARTTLS')

    # Bridge presents a self-signed localhost cert, so skip verification
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE

    if security == 'SSL':
        mail = imaplib.IMAP4_SSL('127.0.0.1', port, ssl_context=ctx)
    else:
        mail = imaplib.IMAP4('127.0.0.1', port)
        mail.starttls(ssl_context=ctx)

    mail.login(username, password)
    return mail
```
`CERT_NONE` is safe here because Bridge is on localhost — you're not skipping verification over a network.
### The scan operation
The first thing the skill does is count senders across every message in the inbox. This gives you the data to decide what to trash.
```python
import email
from collections import Counter

mail = connect()
mail.select('INBOX')

status, messages = mail.search(None, 'ALL')
msg_ids = messages[0].split()

sender_counts = Counter()
batch_size = 100

for i in range(0, len(msg_ids), batch_size):
    batch = b','.join(msg_ids[i:i + batch_size])
    status, data = mail.fetch(batch, '(BODY.PEEK[HEADER.FIELDS (FROM)])')
    for item in data:
        if isinstance(item, tuple):
            raw = item[1].decode(errors='replace')
            msg = email.message_from_string(raw)
            from_header = msg.get('From', '')
            if '<' in from_header:
                addr = from_header.split('<')[1].split('>')[0].lower()
            else:
                addr = from_header.strip().lower()
            sender_counts[addr] += 1

for sender, count in sender_counts.most_common(40):
    print(f"  {count:4d}  {sender}")

mail.logout()
```
`BODY.PEEK` is important — it fetches headers without marking messages as read. Batch by 100 to avoid timeouts on large inboxes.
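The `split('<')` parsing above covers the common cases, but the stdlib's `email.utils.parseaddr` handles quoting and odd display names more robustly. A drop-in alternative sketch (the `sender_addr` name is illustrative):

```python
from email.utils import parseaddr


def sender_addr(from_header):
    """Extract the bare address from a From: header, lowercased."""
    return parseaddr(from_header)[1].lower()
```

For example, `sender_addr('Daily Digest <Newsletter@Example.com>')` returns `'newsletter@example.com'`.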
The output looks like this:
```
 316  newsletter@dailydigest.example.com
 196  community@forum.example.com
 182  alerts@mybank.example.com
 135  no-reply@notifications.example.com
  95  shipping@store.example.com
  88  hello@productnews.example.com
```
From here, Claude presents the senders in tiers — obvious trash (newsletters, marketing), probably trash (notifications), and keep (receipts, personal, work). The user approves which tiers to delete.
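The tiering itself is a judgment call Claude makes from the sender names, confirmed by the user before anything is trashed, but a first-pass heuristic can be sketched. The keyword lists here are purely illustrative:

```python
# Illustrative keyword lists for a first-pass tier suggestion; the real
# decision is made by Claude and approved by the user.
OBVIOUS = ('newsletter', 'marketing', 'promo', 'digest')
PROBABLE = ('no-reply', 'noreply', 'notifications')


def suggest_tier(addr):
    """Suggest a cleanup tier from keywords in the sender address."""
    if any(k in addr for k in OBVIOUS):
        return 'obvious trash'
    if any(k in addr for k in PROBABLE):
        return 'probably trash'
    return 'keep'
```

Anything the heuristic can't classify defaults to "keep", which is the safe direction.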
### The cleanup operation
Once the user approves a sender list, the skill trashes emails one sender at a time. Every action is logged to CSV before the message is deleted.
```python
import csv
import email
import time
from datetime import datetime
from email.header import decode_header


def decode_mime(s):
    """Decode RFC 2047 encoded headers into plain text."""
    if not s:
        return ""
    try:
        parts = decode_header(s)
        return ' '.join(
            p.decode(c or 'utf-8', errors='replace')
            if isinstance(p, bytes) else p
            for p, c in parts
        )
    except Exception:
        return str(s)


TRASH_SENDERS = [
    "newsletter@dailydigest.example.com",
    "community@forum.example.com",
    "no-reply@notifications.example.com",
    # ... user-approved list
]

with open('cleanup_log.csv', 'a', newline='') as f:
    w = csv.writer(f)
    for addr in TRASH_SENDERS:
        mail = connect()
        mail.select('INBOX')
        status, data = mail.search(None, f'(FROM "{addr}")')
        if status != 'OK' or not data[0]:
            mail.logout()
            continue

        ids = data[0].split()
        count = 0
        batch_size = 25

        for i in range(0, len(ids), batch_size):
            batch = ids[i:i + batch_size]
            if i > 0:
                # Fresh connection every 25 messages; see below
                try:
                    mail.logout()
                except Exception:
                    pass
                time.sleep(0.5)
                mail = connect()
                mail.select('INBOX')

            for mid in batch:
                try:
                    st, fd = mail.fetch(
                        mid,
                        '(BODY.PEEK[HEADER.FIELDS (FROM SUBJECT DATE)])'
                    )
                    from_h = subj_h = date_h = ''
                    if fd and fd[0] and isinstance(fd[0], tuple):
                        msg = email.message_from_string(
                            fd[0][1].decode(errors='replace')
                        )
                        from_h = decode_mime(msg.get('From', ''))
                        subj_h = decode_mime(msg.get('Subject', ''))
                        date_h = msg.get('Date', '')
                    mail.copy(mid, 'Trash')
                    mail.store(mid, '+FLAGS', '\\Deleted')
                    w.writerow([
                        datetime.now().isoformat(),
                        'moved_to_trash',
                        addr, from_h, subj_h, date_h,
                        mid.decode()
                    ])
                    count += 1
                except Exception:
                    # Dropped connection or stale ID: reconnect and move on
                    try:
                        mail.logout()
                    except Exception:
                        pass
                    time.sleep(1)
                    mail = connect()
                    mail.select('INBOX')

        mail.expunge()
        mail.logout()
```
### Why batch by 25 with reconnects
Proton Mail Bridge drops IMAP connections during long operations. After roughly 30-50 sequential commands on a single connection, it stops responding. The fix: reconnect every 25 messages, then expunge once the sender's messages are done. The half-second sleep before each fresh connection gives Bridge time to sync.
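The batching cadence can be factored into a tiny generator so every operation shares it. A sketch (the `batched` helper is illustrative; the skill's scripts inline the loop as shown above):

```python
def batched(items, size=25):
    """Yield consecutive slices so the caller can reconnect between them."""
    for i in range(0, len(items), size):
        yield items[i:i + size]
```

Each yielded slice then gets its own connect/select/process cycle.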
### Why copy-then-delete

```python
mail.copy(mid, 'Trash')                 # safety net first
mail.store(mid, '+FLAGS', '\\Deleted')  # mark for deletion
mail.expunge()                          # remove from INBOX
```
Never delete without copying to Trash first. IMAP expunge is not reversible. The copy gives you a way back if you trash something you shouldn't have. The CSV log tells you exactly what was moved and when.
### Stale message IDs
IMAP sequence numbers shift after every expunge. Message 500 becomes message 499 if message 1 was deleted. The reconnect-per-batch pattern sidesteps this — each fresh connection gets a consistent view. When a stale ID does surface, the try/except catches the `BAD [no such message]` response, reconnects, and moves on.
### The CSV audit trail
Every trashed email is logged before deletion:
```
timestamp,action,from_addr,from_display,subject,date,imap_uid
2026-04-21T15:43:20,moved_to_trash,newsletter@example.com,"Example Newsletter",Weekly Digest #42,Wed 28 Jan 2026,1639
2026-04-21T15:43:21,moved_to_trash,newsletter@example.com,,Special Offer Inside,Tue 27 Jan 2026,1671
```
Always append, never overwrite. If you run multiple cleanup passes (you will — see below), the log accumulates across all runs. You can grep it later to find anything you trashed.
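Searching the log back is just stdlib `csv`. A sketch of a lookup over the audit trail (the `find_trashed` helper is illustrative; column positions match the writer above):

```python
import csv


def find_trashed(log_path, needle):
    """Return logged rows whose sender or subject contains needle (case-insensitive)."""
    needle = needle.lower()
    hits = []
    with open(log_path, newline='') as f:
        for row in csv.reader(f):
            # columns: timestamp, action, addr, from_display, subject, date, id
            if len(row) >= 5 and (needle in row[2].lower() or needle in row[4].lower()):
                hits.append(row)
    return hits
```

Useful for answering "wait, did I trash that receipt?" before emptying the Trash folder.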
## Multiple passes
IMAP servers that sync incrementally (Proton Bridge, Exchange) won't show you the full inbox on the first connection. Bridge in particular starts with recent messages and backfills older ones over 10-20 minutes. Your first scan might show 2,000 messages. Twenty minutes later, there are 12,000.
Run the scan-approve-trash cycle multiple times:
- Pass 1: Scan, approve tiers, trash. Bridge shows 2K messages, you trash 600.
- Pass 2: Re-scan. Bridge synced more — now 10K. Same senders reappear. Trash 2,200 more.
- Pass 3: Re-scan. 8K left, new senders surface. Add them to the list, trash 2,000.
- Pass 4: Count stabilizes. Done.
The skill handles this naturally. Each pass is idempotent — if a sender has 0 messages, it skips.
## Use cases beyond inbox cleanup
The scan/act/log pattern applies to more than trashing newsletters:
- Email archiving. Scan by date range, move messages older than N months to an Archive folder, log what was moved.
- Attachment extraction. Search for messages with attachments, download them to a local folder, log the mapping.
- Sender allowlist enforcement. Keep a list of approved senders. Anything not on the list gets moved to a review folder.
- Automated label/folder sorting. Match senders or subjects to rules, move to the right folder. A poor man's server-side filter.
- Compliance export. Search for messages matching a pattern, export headers and bodies to structured files for legal review.
Each of these is a new operation in the skill's SKILL.md — same connection helper, same CSV logging, same resilience pattern.
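For the archiving case, IMAP SEARCH takes dates in the fixed DD-Mon-YYYY format. A sketch of building the criterion (the `before_criterion` helper is illustrative, and approximates a month as 30 days):

```python
from datetime import date, timedelta


def before_criterion(months, today=None):
    """Build an IMAP SEARCH criterion matching mail older than ~N months."""
    today = today or date.today()
    cutoff = today - timedelta(days=30 * months)
    # IMAP wants the fixed DD-Mon-YYYY format with English month abbreviations
    return f'(BEFORE {cutoff.strftime("%d-%b-%Y")})'
```

The result plugs straight into `mail.search(None, before_criterion(6))`; note that `%b` is locale-dependent, so the process should run in the default C locale for English month names.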
## What's next
- SMTP integration. Add outbound email via Bridge's SMTP port (1025). The skill could draft and send replies.
- Scheduled runs. Use Claude Code's cron triggers to run the scan weekly and surface a cleanup report.
- Deduplication. Find and merge duplicate messages (same sender, subject, date) that accumulate from mailing list re-deliveries.
- Size analysis. Scan by message size to find large attachments consuming mailbox quota.
