Tagged "prompt-caching"
2 articles
- obi-jam Building Per-Skill LLM Cost Tracking Into Your Agent A JSONL-per-month cost logger that tags every API call by skill and user. Know exactly what each capability costs — not just total spend.
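The logger that article describes can be sketched roughly as follows. This is a minimal sketch, not the article's actual code: the function names (`log_call`, `cost_by_skill`), the JSONL field layout, and the per-million-token prices are all assumptions.

```python
import json
from datetime import datetime, timezone
from pathlib import Path

# Hypothetical per-million-token prices; substitute your model's real rates.
PRICE_PER_MTOK = {"input": 3.00, "output": 15.00}

def log_call(log_dir, skill, user, input_tokens, output_tokens):
    """Append one API call's cost to a JSONL file named for the current month."""
    cost = (input_tokens * PRICE_PER_MTOK["input"]
            + output_tokens * PRICE_PER_MTOK["output"]) / 1_000_000
    entry = {
        "ts": datetime.now(timezone.utc).isoformat(),
        "skill": skill,          # which capability triggered the call
        "user": user,            # who the call was made on behalf of
        "input_tokens": input_tokens,
        "output_tokens": output_tokens,
        "cost_usd": round(cost, 6),
    }
    path = Path(log_dir) / f"costs-{datetime.now(timezone.utc):%Y-%m}.jsonl"
    with open(path, "a") as f:   # one JSON object per line, append-only
        f.write(json.dumps(entry) + "\n")
    return entry

def cost_by_skill(log_dir):
    """Sum logged cost per skill across all monthly files."""
    totals = {}
    for path in Path(log_dir).glob("costs-*.jsonl"):
        for line in path.read_text().splitlines():
            e = json.loads(line)
            totals[e["skill"]] = totals.get(e["skill"], 0.0) + e["cost_usd"]
    return totals
```

One file per month keeps any single file small and makes retention trivial: delete old months, or roll them into a summary.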
- obi-jam Prompt Caching for Multi-Skill Agents — Split Stable vs Dynamic Your agent's identity doesn't change per request. Mark it for caching. Keep skill instructions in the dynamic block. Save up to 90% on repeated input tokens.
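The stable-vs-dynamic split can be sketched with Anthropic-style `cache_control` markers. This builds a request body only, without calling any API; the skill names, prompt text, and `build_request` helper are invented for illustration.

```python
# Sketch of the stable/dynamic split: the agent identity is an unchanging
# prefix marked cacheable, while per-request skill instructions come after
# the marker so the cached prefix stays byte-identical across calls.

AGENT_IDENTITY = (
    "You are Obi, a multi-skill assistant. "  # stable: identical every request
    "You are careful, concise, and cite your sources."
)

SKILL_INSTRUCTIONS = {  # dynamic: varies per request, so left uncached
    "search": "Use the web-search tool and quote results verbatim.",
    "summarize": "Produce a three-bullet summary of the given text.",
}

def build_request(skill, user_message, model="claude-sonnet-4-5"):
    """Assemble a Messages-API-style body with the stable block marked for caching."""
    return {
        "model": model,
        "max_tokens": 1024,
        "system": [
            {
                "type": "text",
                "text": AGENT_IDENTITY,
                # The prefix up to and including this block is cached;
                # subsequent reads of it are billed at a fraction of the
                # normal input-token price.
                "cache_control": {"type": "ephemeral"},
            },
            {
                "type": "text",
                "text": SKILL_INSTRUCTIONS[skill],  # after the cache marker
            },
        ],
        "messages": [{"role": "user", "content": user_message}],
    }
```

Order matters: caching applies to the prefix up to the marked block, so anything that changes per request has to sit after it, or the cache never hits.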