A lightweight Python package for managing and versioning LLM prompt templates.
- Simple JSON-based storage
- Template versioning
- Tag-based organization
- Jinja2 template syntax
- Variable highlighting in Markdown exports for better visibility
- Package integration utilities
pip install promptstore

from promptstore import PromptStore
# Create a store
store = PromptStore("./prompts")
# Add a prompt template
prompt = store.add(
name="example",
content="Write a {{language}} function that {{task}}",
namespace="project1",
description="Code generation prompt",
tags=["coding", "generation"],
subset="subproject1"
)
# Use the prompt
filled = prompt.fill({
"language": "Python",
"task": "sorts a list in reverse order"
})

This automatically generates a Markdown representation like this:
---
uuid: 77a6b015-efea-436e-8636-6ae06c438ad8
name: example
namespace: project1
description: Code generation prompt
version: 1
tags:
- coding
- generation
variables:
- language
- task
subset: subproject1
created_at: '2025-11-12T11:17:24.128334+00:00'
updated_at: '2025-11-12T11:17:24.128334+00:00'
---
Write a **`{{language}}`** function that **`{{task}}`**

This makes it easy to identify which parts of the prompt are variables when viewing the Markdown files.
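The `version` field in the front matter above is how prompt revisions are tracked. The update workflow is not shown in this quick start, so the snippet below is only a hypothetical sketch: it assumes that calling `store.add` again with the same namespace and name records a new version of the existing template (rather than creating a separate one) and that the returned object exposes the version number as an attribute.

# Hypothetical sketch: assumes re-adding under the same namespace/name
# bumps the version of the "example" prompt instead of creating a new template.
updated = store.add(
    name="example",
    content="Write a {{language}} function that {{task}}. Include type hints.",
    namespace="project1",
    description="Code generation prompt",
    tags=["coding", "generation"],
    subset="subproject1"
)
# Assumed attribute; the real object may expose the version differently.
print(updated.version)  # 2 under the assumptions above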
# Fill a prompt template
prompt = store.get("namespace/name@subset")
result = prompt.fill({
"language": "Python",
"task": "sorts a list in ascending order"
})
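Because templates use Jinja2 syntax, constructs such as loops and conditionals can also appear in prompt bodies. The snippet below is a sketch that reuses only the `add` and `fill` calls shown above and assumes that `fill` renders the full Jinja2 feature set (here, a `for` loop over a list variable); the prompt name and file list are made up for illustration.

# Sketch assuming fill() renders full Jinja2 syntax (loops, conditionals),
# not just simple variable substitution.
review = store.add(
    name="review",
    content=(
        "Review the following {{language}} files:\n"
        "{% for f in files %}- {{f}}\n{% endfor %}"
    ),
    namespace="project1",
    description="Code review prompt",
    tags=["coding", "review"],
    subset="subproject1"
)

filled = review.fill({
    "language": "Python",
    "files": ["store.py", "prompt.py"]
})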
Full documentation is available at lamalab-org.github.io/promptstore.

This project is licensed under the MIT License - see the LICENSE file for details.