Pillar · sdk
SDK — the harness that works with any LLM.
Tool calling, MCP, memory, workflows, directives. Swap Anthropic for OpenAI for Ollama without changing app code. The directive library translates one prompt into every model's format. Build once, run anywhere — including the local 7B you'll fine-tune in Q3.
```python
from sagewai import Agent

agent = Agent(model="claude-sonnet-4-6")
reply = agent.run("triage this email", tools=[escalate, draft])
# swap to "ollama/llama3.2" — no other change needed
```

Other pillars
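The model-string swap above works because the app only ever talks to one Agent interface; the string picks the backend. A minimal, self-contained sketch of that pattern, using stub backends in place of real API clients (the `Agent` class, `backends` dict, and prefix-routing rule here are illustrative stand-ins, not the actual sagewai internals):

```python
from typing import Callable

class Agent:
    """Route a prompt to whichever backend the model string names."""

    def __init__(self, model: str, backends: dict[str, Callable[[str], str]]):
        self.model = model
        self.backends = backends

    def run(self, prompt: str) -> str:
        # "ollama/llama3.2" -> provider "ollama"; bare model names
        # default to "anthropic" in this sketch
        provider = self.model.split("/")[0] if "/" in self.model else "anthropic"
        return self.backends[provider](prompt)

# Stub backends standing in for real provider clients
backends = {
    "anthropic": lambda p: f"[anthropic] {p}",
    "ollama": lambda p: f"[ollama] {p}",
}

agent = Agent("claude-sonnet-4-6", backends)
print(agent.run("triage this email"))       # handled by the Anthropic stub

agent = Agent("ollama/llama3.2", backends)  # swap: only the model string changes
print(agent.run("triage this email"))       # handled by the Ollama stub
```

The point is that `run()` and the calling code are identical either way; only the string passed at construction time differs, which is what lets the same app code later target a local fine-tuned model.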
- Autopilot — describe the goal, we design the agents. Goals in, agents out; Curator captures every run.
- Fleet — workers, dispatch, scoped routing. 24 workers / 100 tasks / 0 cross-tenant leaks (CI-gated).
- Observatory — show your CFO where the AI money goes. Real OTel + Grafana, populated by mixed-tenant load.
- Training Loop — from juggernauts to your own model. $0.35 RunPod fine-tune, deployed via Ollama.