
Memories API

Store, search, and manage raw memories — facts, decisions, insights, stories, frameworks, preferences, and events. Memories are the building blocks of knowledge, captured with emotional context and speaker attribution.

Authentication

All endpoints require a Bearer token from your Supabase session. For API key access, use the /v1/ endpoints.

Store a Memory

POST /memories/remember

Store a new memory in your brain.

Request Body

| Property | Type | Description |
| --- | --- | --- |
| brain_id* | string | UUID of the brain to store the memory in. |
| content* | string | The memory content — what you want to remember. |
| memory_type | enum | One of: fact, decision, insight, story, framework, preference, event. Default: fact. |
| speaker | string | Who said or created this memory. |
| tags | string[] | Tags for categorization and filtering. |
| source_emotion | string | Dominant emotion tied to this memory (e.g. surprise, frustration, clarity). |
| emotional_valence | number | Emotional valence from -1 (negative) to 1 (positive). |
| emotional_intensity | number | Emotional intensity from 0 (neutral) to 1 (extreme). |
```bash
curl -X POST https://api.neuralsnap.ai/memories/remember \
  -H "Authorization: Bearer YOUR_SESSION_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "brain_id": "your-brain-uuid",
    "content": "SOPs produce 10x better agent output than prompts",
    "memory_type": "decision",
    "speaker": "Sefy",
    "tags": ["agents", "operations"],
    "source_emotion": "clarity",
    "emotional_valence": 0.8,
    "emotional_intensity": 0.7
  }'
```
Search Memories

POST /memories/search

Semantic search across all memories in a brain.

Request Body

| Property | Type | Description |
| --- | --- | --- |
| brain_id* | string | UUID of the brain to search. |
| query* | string | Natural language search query. |
| memory_type | string | Filter by memory type. |
| limit | number | Max results. Default: 10. |
```bash
curl -X POST https://api.neuralsnap.ai/v1/memories/search \
  -H "Authorization: Bearer ns_test_abc123" \
  -H "Content-Type: application/json" \
  -d '{
    "brain_id": "your-brain-uuid",
    "query": "how to make AI agents reliable",
    "memory_type": "decision",
    "limit": 5
  }'
```

Process Conversation

POST /memories/process/conversation

Extract memories from a conversation transcript automatically.

Request Body

| Property | Type | Description |
| --- | --- | --- |
| brain_id* | string | UUID of the brain to store extracted memories in. |
| content* | string | The conversation or transcript text. |
| source_type | string | One of: conversation, fathom, fireflies. Default: conversation. |
| source_title | string | Title or description of the conversation. |
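A request to this endpoint follows the same shape as the other examples. This is a sketch: the brain UUID, session token, and transcript text are placeholders, and only fields from the table above are used.

```bash
curl -X POST https://api.neuralsnap.ai/memories/process/conversation \
  -H "Authorization: Bearer YOUR_SESSION_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "brain_id": "your-brain-uuid",
    "content": "Alice: We should ship the beta Friday. Bob: Agreed, but only after the load test passes.",
    "source_type": "conversation",
    "source_title": "Beta launch sync"
  }'
```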

Token Usage

Processing a conversation uses tokens proportional to the input length. A typical 30-minute meeting transcript (~5000 words) uses approximately 8K-12K tokens.
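As a rough back-of-envelope check, the 8K-12K figure for a 5,000-word transcript implies roughly 1.6 to 2.4 tokens per English word. You can sketch the bounds for your own transcripts with shell arithmetic (integer math, so the ratios are written as 16/10 and 24/10):

```bash
words=5000
# Bounds implied by the 8K-12K estimate (~1.6-2.4 tokens per word):
echo "low estimate:  $((words * 16 / 10)) tokens"   # 8000
echo "high estimate: $((words * 24 / 10)) tokens"   # 12000
```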

Memory Types

NeuralSnap supports seven memory types. Choosing the right type improves search relevance and helps you build a well-organized knowledge base. Here's when to use each:

📌fact

An objective piece of information — URLs, numbers, configurations, names.

Use for: API endpoints, team member IDs, server ports, meeting times, reference data.

Example: "The production database runs on port 5432 at db.example.com"

⚖️decision

A choice that was made, with context on why.

Use for: architectural choices, hiring decisions, strategy pivots, tool selections.

Example: "We chose Supabase over Firebase because of row-level security and Postgres compatibility"

💡insight

A realization or pattern recognition — the "aha" moment.

Use for: lessons learned, pattern discoveries, counter-intuitive findings.

Example: "Users who complete onboarding in <2 minutes have 3x higher retention"

📖story

A narrative or anecdote that illustrates a point.

Use for: customer stories, team experiences, historical context, case studies.

Example: "When we launched v1, the landing page crashed under load because we forgot to enable CDN caching"

🧠framework

A mental model or structured way of thinking about something.

Use for: decision-making models, evaluation criteria, processes.

Example: "The ICE framework: score ideas by Impact × Confidence × Ease, pick the highest"

🎯preference

A stated preference — how someone likes things done.

Use for: communication styles, workflow preferences, tool choices, formatting rules.

Example: "Sefy prefers mobile-first design — always check mobile before desktop"

📅event

Something that happened — a milestone, meeting, or occurrence.

Use for: launches, incidents, meetings, milestones, deadlines.

Example: "NeuralSnap launched on Product Hunt on January 15, 2025 — reached #3 for the day"

Default type

If you don't specify a memory_type, it defaults to fact. When processing conversations automatically, NeuralSnap infers the best type for each extracted memory.
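For example, a minimal request that sends only the required fields (placeholder brain UUID and token) is stored with memory_type fact:

```bash
curl -X POST https://api.neuralsnap.ai/memories/remember \
  -H "Authorization: Bearer YOUR_SESSION_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "brain_id": "your-brain-uuid",
    "content": "The production database runs on port 5432 at db.example.com"
  }'
```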