Blog
Technical articles on inference optimization and building with LLMs.
Use DeepSeek, Kimi, GLM, and other open-source models in Claude Code with Auriko
Point Claude Code at Auriko to access open-source and open-weight models through one API key.
Use DeepSeek, Gemini, Grok, and 250+ models in OpenCode with Auriko
Auriko is a registered provider in OpenCode. One environment variable, 15 models out of the box, 250+ with a config line.
The variables that affect your inference cost
How provider caching mechanics interact with your workload to determine inference cost, and how to optimize for it.