An AI-powered Bluesky content moderation system that automatically labels replies to monitored accounts using LLM-based content analysis.
This project monitors Bluesky posts in real time via Jetstream and uses an LLM (via any OpenAI-compatible completions API) to classify replies to watched accounts. It automatically applies labels such as "bad-faith", "off-topic", and "funny" to help users filter and moderate content.
The system supports multiple AI providers including LM Studio (local), OpenAI, Claude (Anthropic), and any other OpenAI-compatible API.
The system consists of two components:
- Go Consumer - Monitors the Jetstream firehose and analyzes replies using an LLM
- Skyware Labeler - TypeScript server that manages and emits content labels
Jetstream → Go Consumer → Completions API (LLM) → Labeler Service → Bluesky
- The Go consumer subscribes to Jetstream and monitors replies to specified accounts
- When a reply is detected, it fetches the parent post and sends both to your completions API
- The LLM classifies the reply based on the system prompt
- Labels are emitted via the Skyware labeler service
- Labels are propagated to Bluesky's labeling system
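The sketch below illustrates the first stage of this flow: subscribing to Jetstream and picking out replies whose parent was authored by a watched DID. It assumes the `github.com/gorilla/websocket` client and the documented Jetstream commit-event shape, which may differ from the repo's own consumer; the real code in `main.go` and `handle_post.go` layers LLM classification and label emission on top.

```go
// Minimal sketch: subscribe to app.bsky.feed.post creates on Jetstream and
// log replies whose parent was written by a watched DID. The placeholder DID
// and the event struct below are assumptions based on Jetstream's docs.
package main

import (
	"encoding/json"
	"log"
	"strings"

	"github.com/gorilla/websocket"
)

// jetstreamEvent mirrors the subset of Jetstream's commit events this sketch needs.
type jetstreamEvent struct {
	DID    string `json:"did"`
	Kind   string `json:"kind"`
	Commit struct {
		Operation  string `json:"operation"`
		Collection string `json:"collection"`
		Record     struct {
			Text  string `json:"text"`
			Reply *struct {
				Parent struct {
					URI string `json:"uri"`
				} `json:"parent"`
			} `json:"reply"`
		} `json:"record"`
	} `json:"commit"`
}

// didFromATURI pulls the repo DID out of an at:// URI
// (at://<did>/app.bsky.feed.post/<rkey>).
func didFromATURI(uri string) string {
	return strings.Split(strings.TrimPrefix(uri, "at://"), "/")[0]
}

func main() {
	// Stand-in for WATCHED_OPS; the DID here is a placeholder.
	watched := map[string]bool{"did:plc:example123": true}

	conn, _, err := websocket.DefaultDialer.Dial(
		"wss://jetstream2.us-west.bsky.network/subscribe?wantedCollections=app.bsky.feed.post", nil)
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	for {
		_, msg, err := conn.ReadMessage()
		if err != nil {
			log.Fatal(err)
		}

		var evt jetstreamEvent
		if err := json.Unmarshal(msg, &evt); err != nil || evt.Kind != "commit" {
			continue
		}
		c := evt.Commit
		if c.Operation != "create" || c.Collection != "app.bsky.feed.post" || c.Record.Reply == nil {
			continue
		}

		// Only replies whose parent was authored by a watched DID would move
		// on to the completions API in the real consumer.
		if watched[didFromATURI(c.Record.Reply.Parent.URI)] {
			log.Printf("reply to watched account from %s: %q", evt.DID, c.Record.Text)
		}
	}
}
```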
- An OpenAI-compatible completions API endpoint:
  - LM Studio (local, free)
  - OpenAI (cloud, paid)
  - Claude/Anthropic (cloud, paid)
  - Any other OpenAI-compatible provider
- A Bluesky account for the labeler
```
git clone https://github.com/haileyok/dontshowmethis.git
cd dontshowmethis
go mod download
cd labeler
yarn install
cd ..
```

Copy the example environment file and configure it:

```
cp .env.example .env
```

For the Go Consumer:
- `PDS_URL` - Your Bluesky PDS URL (e.g., `https://bsky.social`)
- `ACCOUNT_HANDLE` - Your Bluesky account handle
- `ACCOUNT_PASSWORD` - Your Bluesky account password
- `WATCHED_OPS` - Comma-separated list of DIDs to monitor for replies and emit labels for
- `WATCHED_LOG_OPS` - Comma-separated list of DIDs to monitor for replies but not emit labels for. Will use SQLite to keep a log
- `LOGGED_LABELS` - Comma-separated list of labels that will be logged to the SQLite database
- `JETSTREAM_URL` - Jetstream WebSocket URL (default: `wss://jetstream2.us-west.bsky.network/subscribe`)
- `LABELER_URL` - URL of your labeler service (e.g., `http://localhost:3000`)
- `LABELER_KEY` - Authentication key for the labeler API
- `COMPLETIONS_API_HOST` - Completions API host (e.g., `http://localhost:1234` for LM Studio, `https://api.openai.com` for OpenAI, `https://api.anthropic.com` for Claude)
- `COMPLETIONS_ENDPOINT_OVERRIDE` - (Optional) Override the API endpoint path. Required for Claude (`/v1/messages`). Defaults to `/v1/chat/completions` if not specified
- `COMPLETIONS_API_KEY` - (Optional) API key for providers that require authentication (OpenAI, Claude, etc.)
- `COMPLETIONS_API_KEY_TYPE` - (Optional) API key authentication type. Either `bearer` (for OpenAI) or `x-api-key` (for Claude)
- `MODEL_NAME` - Model name to use (default: `google/gemma-3-27b`)
- `LOG_DB_NAME` - The name of the SQLite db to log to
- `LOG_NO_LABELS` - (Optional) When enabled, logs posts with no labels as "no-labels" to the database (does not emit labels)
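As a rough illustration of how the list-valued settings above are consumed, the sketch below parses the comma-separated DID lists with the standard library. The actual consumer wires its configuration through the CLI setup in `main.go`, so treat this as a sketch rather than the project's code.

```go
// Illustrative only: one way the comma-separated DID lists and the model
// default could be read from the environment.
package main

import (
	"fmt"
	"os"
	"strings"
)

// splitDIDs turns "did:plc:a,did:plc:b" into a set, skipping blank entries.
func splitDIDs(v string) map[string]bool {
	set := map[string]bool{}
	for _, did := range strings.Split(v, ",") {
		if did = strings.TrimSpace(did); did != "" {
			set[did] = true
		}
	}
	return set
}

func main() {
	watched := splitDIDs(os.Getenv("WATCHED_OPS"))
	watchedLog := splitDIDs(os.Getenv("WATCHED_LOG_OPS"))

	model := os.Getenv("MODEL_NAME")
	if model == "" {
		model = "google/gemma-3-27b" // documented default
	}

	fmt.Printf("watching %d DIDs, logging %d DIDs, model %s\n", len(watched), len(watchedLog), model)
}
```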
For the Skyware Labeler:
- `SKYWARE_DID` - Your labeler's DID
- `SKYWARE_SIG_KEY` - Your labeler's signing key
- `EMIT_LABEL_KEY` - Secret key for the emit label API (must match `LABELER_KEY` above)
The system supports multiple AI providers. Configure the appropriate environment variables based on your provider:
Using LM Studio (Local):
- Open LM Studio
- Load a compatible model (recommended: `google/gemma-3-27b` or similar)
- Start the local server (usually runs on `http://localhost:1234`)
- Configure:

```
COMPLETIONS_API_HOST=http://localhost:1234
MODEL_NAME=google/gemma-3-27b
# No API key required for local LM Studio
```
Using OpenAI:
Configure the following in your .env:
```
COMPLETIONS_API_HOST=https://api.openai.com
COMPLETIONS_API_KEY=sk-proj-...
COMPLETIONS_API_KEY_TYPE=bearer
MODEL_NAME=gpt-4o-mini # or gpt-4o, gpt-3.5-turbo, etc.
# COMPLETIONS_ENDPOINT_OVERRIDE not needed (uses default /v1/chat/completions)
```

Using Claude (Anthropic):
Configure the following in your .env:
```
COMPLETIONS_API_HOST=https://api.anthropic.com
COMPLETIONS_ENDPOINT_OVERRIDE=/v1/messages
COMPLETIONS_API_KEY=sk-ant-...
COMPLETIONS_API_KEY_TYPE=x-api-key
MODEL_NAME=claude-3-5-sonnet-20241022 # or other Claude models
```

Using other OpenAI-compatible APIs:

Most providers use the same configuration as OpenAI (bearer token auth):
```
COMPLETIONS_API_HOST=https://your-provider-api.com
COMPLETIONS_API_KEY=your-api-key
COMPLETIONS_API_KEY_TYPE=bearer
MODEL_NAME=provider-model-name
```
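Whatever the provider, these variables combine into a single HTTP request. The sketch below (not the repo's actual client, which lives in `lmstudio.go`) shows the OpenAI-style `/v1/chat/completions` request shape and both auth header styles; Anthropic's `/v1/messages` body differs slightly and is omitted here.

```go
// Sketch of how COMPLETIONS_API_HOST, COMPLETIONS_ENDPOINT_OVERRIDE,
// COMPLETIONS_API_KEY, and COMPLETIONS_API_KEY_TYPE could combine into a
// chat-completions request.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"net/http"
	"os"
)

type chatMessage struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

func classify(systemPrompt, parent, reply string) (string, error) {
	endpoint := os.Getenv("COMPLETIONS_ENDPOINT_OVERRIDE")
	if endpoint == "" {
		endpoint = "/v1/chat/completions" // documented default
	}

	body, err := json.Marshal(map[string]any{
		"model": os.Getenv("MODEL_NAME"),
		"messages": []chatMessage{
			{Role: "system", Content: systemPrompt},
			{Role: "user", Content: "Parent post: " + parent + "\n\nReply: " + reply},
		},
	})
	if err != nil {
		return "", err
	}

	req, err := http.NewRequest(http.MethodPost, os.Getenv("COMPLETIONS_API_HOST")+endpoint, bytes.NewReader(body))
	if err != nil {
		return "", err
	}
	req.Header.Set("Content-Type", "application/json")

	// The auth header style depends on the provider.
	if key := os.Getenv("COMPLETIONS_API_KEY"); key != "" {
		switch os.Getenv("COMPLETIONS_API_KEY_TYPE") {
		case "x-api-key": // Claude
			req.Header.Set("x-api-key", key)
		default: // bearer: OpenAI and most compatible providers
			req.Header.Set("Authorization", "Bearer "+key)
		}
	}

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()

	raw, err := io.ReadAll(resp.Body)
	return string(raw), err
}

func main() {
	out, err := classify("Classify the reply as bad-faith, off-topic, or funny.", "parent text", "reply text")
	fmt.Println(out, err)
}
```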
Start the labeler:

```
cd labeler
npm start
```

The labeler will start two servers:
- Port 14831: Skyware labeler server
- Port 3000: Label emission API
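The Go consumer authenticates to the label emission API with `LABELER_KEY` (which must match `EMIT_LABEL_KEY`). The actual route and payload are whatever `labeler/index.ts` defines; the sketch below assumes a hypothetical `POST /emit` endpoint and JSON body purely to illustrate the call.

```go
// Hypothetical sketch: the "/emit" path, JSON fields, and Authorization
// scheme below are assumptions for illustration only; the real route and
// payload are defined in labeler/index.ts.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"os"
)

func emitLabel(subjectURI, label string) error {
	body, err := json.Marshal(map[string]string{
		"uri":   subjectURI, // at:// URI of the reply being labeled
		"label": label,      // e.g. "bad-faith", "off-topic", "funny"
	})
	if err != nil {
		return err
	}

	req, err := http.NewRequest(http.MethodPost, os.Getenv("LABELER_URL")+"/emit", bytes.NewReader(body))
	if err != nil {
		return err
	}
	req.Header.Set("Content-Type", "application/json")
	req.Header.Set("Authorization", os.Getenv("LABELER_KEY")) // must match EMIT_LABEL_KEY

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 300 {
		return fmt.Errorf("labeler returned %s", resp.Status)
	}
	return nil
}

func main() {
	if err := emitLabel("at://did:plc:example/app.bsky.feed.post/abc123", "off-topic"); err != nil {
		fmt.Println(err)
	}
}
```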
With the labeler running, start the Go consumer:

```
go run .
```

Or build and run:

```
go build -o dontshowmethis
./dontshowmethis
```

Once running, the system will:
- Connect to Jetstream and monitor the firehose
- Watch for replies to accounts specified in `WATCHED_OPS`
- Automatically analyze and label qualifying replies
- Log all actions to stdout
To monitor specific accounts, you need their DIDs. You can find a DID by resolving a handle:

```
curl "https://bsky.social/xrpc/com.atproto.identity.resolveHandle?handle=username.bsky.social"
```

Add the returned DID to your `WATCHED_OPS` environment variable.
The system uses a structured prompt to classify content. See lmstudio.go:147 for the system prompt.
```
.
├── main.go           # CLI setup and consumer initialization
├── handle_post.go    # Post handling and labeling logic
├── lmstudio.go       # Completions API client and content classification
├── sets/
│   └── domains.go    # Political domain list (currently unused)
├── labeler/
│   ├── index.ts      # Skyware labeler service
│   └── package.json  # Labeler dependencies
├── .env.example      # Example environment configuration
└── README.md         # This file
```
- Add the label constant in `main.go`:

  ```go
  const LabelNewLabel = "new-label"
  ```

- Add it to the labeler's allowed labels in `labeler/index.ts`:

  ```ts
  const LABELS: Record<string, boolean> = {
    'bad-faith': true,
    'off-topic': true,
    'funny': true,
    'new-label': true, // Add here
  }
  ```

- Update the LLM schema in `lmstudio.go` to include the new classification
- Update the handling logic in `handle_post.go` to emit the new label (see the sketch below)
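For steps 3 and 4, the exact change depends on how `lmstudio.go` and `handle_post.go` are written; the hypothetical sketch below only shows the general shape:

```go
// Hypothetical sketch of steps 3 and 4: extend the classification struct the
// consumer decodes from the LLM, then emit the new label when it is set.
// Field and function names are illustrative, not the repo's actual API.
package main

import "fmt"

const LabelNewLabel = "new-label" // step 1: the constant added in main.go

type classification struct {
	BadFaith bool `json:"bad_faith"`
	OffTopic bool `json:"off_topic"`
	Funny    bool `json:"funny"`
	NewLabel bool `json:"new_label"` // step 3: new field in the LLM schema
}

// handlePost stands in for the logic in handle_post.go.
func handlePost(c classification, emit func(label string)) {
	if c.NewLabel { // step 4: emit the new label when the LLM flags it
		emit(LabelNewLabel)
	}
}

func main() {
	handlePost(classification{NewLabel: true}, func(label string) {
		fmt.Println("emit", label)
	})
}
```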
MIT