Module llm_agent

LLM provider abstraction.

LlmAgent wraps multiple large-language-model backends behind a single async complete() interface. Supported providers:

  • Anthropic (claude-* models, Messages API)
  • OpenAI (gpt-* and compatible, Chat Completions API)
  • Ollama (local, OpenAI-compatible endpoint)

The active provider and model are selected via LlmConfig.

Structs

ChatMessage
A single turn in a conversation (role + content).
LlmAgent
An actor that calls an LLM provider and returns completions.
LlmConfig
Configuration for the LLM backend.

Enums

LlmProvider
Supported LLM provider backends.

Functions

calc_cost_nano_usd
Calculate cost in nano-USD from token counts and model name.
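A nano-USD cost calculation might be sketched as follows. The per-token rates below are placeholders chosen for illustration (e.g. $3 / $15 per million input/output tokens), not the crate's actual pricing table, and the prefix matching is an assumption about how models map to rates.

```rust
// Illustrative sketch: nano-USD cost from token counts and model name.
// 1 USD = 1e9 nano-USD, so $3 per 1M tokens = 3_000 nano-USD per token.
// Rates are placeholders, not real pricing.
fn calc_cost_nano_usd(input_tokens: u64, output_tokens: u64, model: &str) -> u64 {
    let (in_rate, out_rate) = match model {
        m if m.starts_with("claude-") => (3_000, 15_000),
        m if m.starts_with("gpt-") => (2_500, 10_000),
        _ => (0, 0), // local models (e.g. via Ollama) are treated as free
    };
    input_tokens * in_rate + output_tokens * out_rate
}

fn main() {
    // 1000 input + 500 output tokens on a claude-* model:
    let cost = calc_cost_nano_usd(1000, 500, "claude-sonnet");
    println!("{cost} nano-USD");
}
```

Pricing in integer nano-USD avoids floating-point rounding when costs are summed across many small calls.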