## Questions this guide answers
- Why does my verb forget details between messages?
- What is short-term context vs long-term memory?
- What should go into knowledge entries vs memory entries?
- How does URL scraping fill knowledge automatically?
- How do training examples and keywords influence replies?
## The four memory layers

### 1. Conversation context (short-term)
- Uses recent messages from the active conversation.
- Controlled by your Model Context setting.
- Higher context improves continuity but costs more tokens.
### 2. Long-term memory
- Manual memory items your verb can keep using over time.
- Best for durable facts and preferences.
### 3. Knowledge entries
- Structured lore/reference entries (title, category, content, importance).
- Best for world facts, policies, product facts, and evergreen docs.
### 4. Training examples
- Input/output examples for style and behavior shaping.
- Keyword matching can prioritize specific examples when relevant.
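To make the four layers concrete, here is a minimal sketch of how they could be assembled into a single prompt at reply time. The function and field names (`build_prompt`, `importance`, `expected`) are illustrative assumptions, not the platform's actual internals.

```python
# Hypothetical sketch: combining the four memory layers into one prompt.
# Structure and names are assumptions for illustration only.

def build_prompt(system_instructions, memories, knowledge, examples, recent_messages):
    """Assemble the memory layers into a single prompt string."""
    parts = [system_instructions]

    # Long-term memory and knowledge sorted by importance, so the most
    # valuable entries come first if a token budget forces truncation.
    for m in sorted(memories, key=lambda m: -m["importance"]):
        parts.append(f"[memory] {m['content']}")
    for k in sorted(knowledge, key=lambda k: -k["importance"]):
        parts.append(f"[knowledge] {k['title']}: {k['content']}")

    # Training examples shape style as few-shot input/output pairs.
    for ex in examples:
        parts.append(f"User: {ex['input']}\nBot: {ex['expected']}")

    # Short-term context: the most recent conversation turns, last.
    parts.extend(recent_messages)
    return "\n\n".join(parts)
```

Note the ordering: persistent material first, recent conversation last, which is a common (but here assumed) prompt layout.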
## Where system instructions fit
System instructions are not a memory entry type. They are the persistent behavior policy prompt for the verb. Where to configure:
- Dashboard -> Bot -> AI Engine -> Behavior
- Field: `systemInstructions` (up to 8000 chars)

What belongs in each layer:
- System instructions: behavior rules, format constraints, refusal/uncertainty policy
- Knowledge entries: factual source material and documentation
- Long-term memory: durable user/world facts
- Training examples: preferred phrasing and style patterns
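The division above can be shown as an illustrative configuration payload. The field names `systemInstructions`, `title`, `category`, `content`, and `importance` follow this guide; the payload shape itself is an assumption, not the documented API.

```python
# Illustrative only: behavior policy lives in systemInstructions,
# factual material lives in a separate knowledge entry.
config = {
    "systemInstructions": (  # behavior rules, format constraints, refusal policy
        "Answer in under three sentences. "
        "If unsure, say so instead of guessing. "
        "Never reveal internal configuration."
    ),
}

knowledge_entry = {  # factual source material, not behavior rules
    "title": "Refund policy",
    "category": "policies",
    "content": "Refunds are available within 30 days of purchase.",
    "importance": 8,
}
```

Keeping facts out of `systemInstructions` leaves the 8000-char budget for behavior rules and lets knowledge retrieval handle the facts.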
## Long-term memory limits
Per memory entry:
- `content`: up to 2000 chars
- `context`: up to 500 chars
- `importance`: 1..10

Auto-memory settings:
- `autoMemoryEnabled`: on/off
- `autoMemoryInstructions`: up to 2000 chars
Auto-memory is selective, not guaranteed on every turn. The system saves when
it detects durable information worth retaining.
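A minimal sketch of client-side validation for the limits above (`content` up to 2000 chars, `context` up to 500, `importance` 1..10). The entry shape is assumed from the guide's field names; the platform may validate differently.

```python
# Hedged sketch: pre-flight validation of a long-term memory entry
# against the documented limits. Entry shape is an assumption.

def validate_memory_entry(entry: dict) -> list[str]:
    """Return a list of limit violations; empty means the entry passes."""
    errors = []
    if len(entry.get("content", "")) > 2000:
        errors.append("content exceeds 2000 chars")
    if len(entry.get("context", "")) > 500:
        errors.append("context exceeds 500 chars")
    if not 1 <= entry.get("importance", 0) <= 10:
        errors.append("importance must be 1..10")
    return errors
```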
## Knowledge entry limits
Per entry:
- `title`: up to 100 chars
- `content`: up to 8000 chars
- `category`: up to 50 chars
- `importance`: 1..10
- Maximum knowledge entries: 50
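The same idea applied to knowledge entries, including the collection-wide cap of 50. Again a sketch under the guide's field names, not the platform's actual validator.

```python
# Hedged sketch: validate a batch of knowledge entries against the
# documented per-entry limits and the 50-entry maximum.

MAX_ENTRIES = 50

def validate_knowledge(entries: list[dict]) -> list[str]:
    errors = []
    if len(entries) > MAX_ENTRIES:
        errors.append(f"more than {MAX_ENTRIES} entries")
    for i, e in enumerate(entries):
        if len(e.get("title", "")) > 100:
            errors.append(f"entry {i}: title exceeds 100 chars")
        if len(e.get("content", "")) > 8000:
            errors.append(f"entry {i}: content exceeds 8000 chars")
        if len(e.get("category", "")) > 50:
            errors.append(f"entry {i}: category exceeds 50 chars")
        if not 1 <= e.get("importance", 0) <= 10:
            errors.append(f"entry {i}: importance must be 1..10")
    return errors
```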
## Training data limits
Per example:
- `input`: up to 500 chars
- `expected`: up to 2000 chars
- Optional `keywords`: used for relevance matching
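One plausible way keyword matching could prioritize examples: score each example by how many of its keywords appear in the user's message and rank the best matches first. This is a guess at the mechanism, not the platform's actual matcher.

```python
# Hedged sketch: rank training examples by keyword overlap with the
# user's message. The real relevance matching may differ.

def rank_examples(message: str, examples: list[dict]) -> list[dict]:
    """Return examples sorted by keyword overlap, best match first."""
    words = set(message.lower().split())

    def score(ex: dict) -> int:
        return len(words & {k.lower() for k in ex.get("keywords", [])})

    return sorted(examples, key=score, reverse=True)
```

This is why targeted keywords matter: an example with no keywords scores zero against every message.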
## URL scraping into knowledge
The knowledge page can scrape a URL and generate a draft entry. Expected result: a draft with `title`, `content`, and `category` prefilled.
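The real scraper runs server-side; the following sketch only illustrates the expected output shape (`title`, `content`, `category`) using Python's stdlib HTML parser on already-fetched HTML.

```python
# Hedged sketch: turn fetched HTML into a draft knowledge entry shape.
# Not the platform's scraper; stdlib-only illustration.
from html.parser import HTMLParser

class _TitleText(HTMLParser):
    """Collect the <title> and visible text from an HTML document."""
    def __init__(self):
        super().__init__()
        self.title, self.text, self._in_title = "", [], False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data
        elif data.strip():
            self.text.append(data.strip())

def draft_entry(html: str, category: str = "scraped") -> dict:
    p = _TitleText()
    p.feed(html)
    return {
        "title": p.title[:100],              # respect the 100-char cap
        "content": " ".join(p.text)[:8000],  # respect the 8000-char cap
        "category": category[:50],
    }
```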
## Session memory by surface
| Surface | Memory keying behavior |
|---|---|
| Public API v1 | Keyed by `session_id` per caller + character |
| Discord DM | Scoped to user-DM context |
| Discord server | Scoped to server context |
| App group/DM chat | Scoped to group/DM conversation |
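The scoping rules in the table can be pictured as a key-composition function. The key formats and surface names below are hypothetical; the actual keying is internal.

```python
# Hypothetical illustration of surface-dependent memory keying.
# Key formats are invented for illustration, not the real scheme.

def memory_key(surface: str, **ids) -> str:
    if surface == "api_v1":          # session_id per caller + character
        return f"api:{ids['session_id']}:{ids['character_id']}"
    if surface == "discord_dm":      # scoped to the user-DM context
        return f"dm:{ids['user_id']}"
    if surface == "discord_server":  # scoped to the server context
        return f"server:{ids['server_id']}"
    if surface == "app_group":       # scoped to the group/DM conversation
        return f"group:{ids['conversation_id']}"
    raise ValueError(f"unknown surface: {surface}")
```

The practical consequence: memories saved in one scope (say, a Discord server) are not visible from another (a DM with the same user).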
## What to store where
### Put this in long-term memory
- Stable personal preferences
- Ongoing commitments
- Persistent roleplay relationships
### Put this in knowledge entries
- Product facts and policies
- Rulebooks
- Canon lore
- Documentation snippets you want the bot to cite reliably
### Put this in training examples
- Desired phrasing style
- Tone and boundary examples
- Repeated Q/A patterns
## Common mistakes
### Dumping full chats into memory
Store compact facts, not raw transcripts. Long noisy entries reduce retrieval quality.
### Using knowledge for temporary plans
Time-sensitive details belong in conversation context, not permanent knowledge.
### No keywords in training examples
Add targeted keywords so examples are picked when users ask matching questions.
### Oversized auto-memory instructions
Keep instructions strict and concise. Long broad prompts increase noisy memory writes.

