Context
OpenClaw v2026.3.22 now passes `modelId` to context-engine `assemble()`. We accept it for compatibility (#6) but don't use it yet.
Opportunity
Use `modelId` to adapt context assembly per model:
- Token budget scaling — pack more memories for models with larger context windows (e.g., 200K vs. 8K)
- Format adaptation — some models handle structured markdown better than others
- Embedding-aware retrieval — adjust `minScore` thresholds based on known model characteristics
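One way the budget and threshold ideas could fit together is a per-model profile lookup inside `assemble()`. This is a minimal sketch only: the names (`ModelProfile`, `MODEL_PROFILES`, `resolveProfile`), the prefix-matching scheme, and all numbers are illustrative assumptions, not part of the existing context-engine API.

```typescript
// Hypothetical per-model context profiles. Values are placeholders until we
// have data on how each model family responds to different context shapes.
interface ModelProfile {
  tokenBudget: number; // max tokens to pack into the assembled context
  minScore: number;    // retrieval similarity cutoff
}

const DEFAULT_PROFILE: ModelProfile = { tokenBudget: 8_000, minScore: 0.75 };

// Keyed by modelId prefix so one entry covers a model family.
const MODEL_PROFILES: Record<string, ModelProfile> = {
  "claude-": { tokenBudget: 200_000, minScore: 0.6 },
  "gpt-4o":  { tokenBudget: 128_000, minScore: 0.65 },
};

// Resolve a profile from the modelId that assemble() now receives.
// Unknown or missing modelIds fall back to today's behavior.
function resolveProfile(modelId?: string): ModelProfile {
  if (!modelId) return DEFAULT_PROFILE;
  for (const [prefix, profile] of Object.entries(MODEL_PROFILES)) {
    if (modelId.startsWith(prefix)) return profile;
  }
  return DEFAULT_PROFILE;
}
```

The fallback-to-default path matters: it keeps current behavior intact for any model we haven't profiled, so this stays a pure optimization.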
Scope
- `packages/apps/openclaw/src/context-engine.ts` — `assemble()` method
- Potentially `packages/core/src/components/context.ts` — `pack()` could accept model hints
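If `pack()` does grow a model-hint parameter, the shape could be as small as an optional options field. Everything below is a hypothetical sketch — `ModelHint`, `PackOptions`, and this toy `pack()` are not the real core API, and the length/4 token estimate is a stand-in for whatever tokenizer the core actually uses.

```typescript
// Hypothetical model hint passed down from the app layer.
interface ModelHint {
  id: string;
  contextWindow?: number;
}

interface PackOptions {
  maxTokens: number;     // budget the caller derives, e.g. from contextWindow
  modelHint?: ModelHint; // optional, so existing callers are unaffected
}

// Illustrative pack(): greedily keeps items until the token budget is spent.
function pack(items: string[], opts: PackOptions): string[] {
  const out: string[] = [];
  let used = 0;
  for (const item of items) {
    const cost = Math.ceil(item.length / 4); // rough token estimate
    if (used + cost > opts.maxTokens) break;
    out.push(item);
    used += cost;
  }
  return out;
}
```

Making the hint optional means core stays decoupled from any particular model registry; the app layer decides how a hint translates into a budget.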
Priority
Low — current behavior works fine. This is an optimization for when we have data on how different models respond to different context formats.