# @output.ai/llm

The `@output.ai/llm` package provides a unified API for LLM generation with built-in prompt templating.
## Installation
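Assuming standard npm distribution:

```shell
npm install @output.ai/llm
```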
## Quick Start
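A minimal end-to-end sketch. The call shape (a prompt file path plus a `variables` map) and the file name are assumptions for illustration; check the package's type definitions for the exact signature:

```typescript
import { generateText } from '@output.ai/llm';

// summarize@v1.prompt is a hypothetical prompt file with YAML
// frontmatter (provider, model, ...) and a LiquidJS body.
const summary = await generateText({
  promptFile: './prompts/summarize@v1.prompt',
  variables: { article: 'LLMs are statistical models of text.' },
});

console.log(summary);
```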
## Generate Functions
### generateText
Generate unstructured text:
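A sketch, assuming `generateText` accepts a prompt file path and interpolation variables (illustrative, not the confirmed signature):

```typescript
import { generateText } from '@output.ai/llm';

// Returns plain text; no output schema is applied.
const answer = await generateText({
  promptFile: './prompts/answer@v1.prompt', // hypothetical file
  variables: { question: 'What is backpressure?' },
});
```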
### generateObject

Generate a structured object matching a Zod schema:
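A sketch pairing a Zod schema with the call; the `schema` option name and the prompt file are assumptions:

```typescript
import { z } from 'zod';
import { generateObject } from '@output.ai/llm';

const Person = z.object({
  name: z.string(),
  age: z.number(),
});

// The result is validated against (and typed by) the schema.
const person = await generateObject({
  promptFile: './prompts/extract-person@v1.prompt', // hypothetical file
  variables: { bio: 'Grace Hopper, born 1906, computer scientist.' },
  schema: Person,
});
```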
### generateArray

Generate an array of structured items:
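A sketch, assuming the schema describes a single element and the function returns an array of matching items (option names are illustrative):

```typescript
import { z } from 'zod';
import { generateArray } from '@output.ai/llm';

// Element schema: each item in the returned array should match this.
const Task = z.object({
  title: z.string(),
  priority: z.number(),
});

const tasks = await generateArray({
  promptFile: './prompts/plan@v1.prompt', // hypothetical file
  variables: { goal: 'Migrate the build to TypeScript' },
  schema: Task,
});
```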
### generateEnum

Generate a value from allowed options:
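A sketch, assuming the allowed options are passed as a string array (the `enum` option name is a guess):

```typescript
import { generateEnum } from '@output.ai/llm';

// The model must answer with one of the allowed options.
const sentiment = await generateEnum({
  promptFile: './prompts/sentiment@v1.prompt', // hypothetical file
  variables: { review: 'Setup took five minutes and it just worked.' },
  enum: ['positive', 'neutral', 'negative'],
});
```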
## Prompt Files

Prompt files use YAML frontmatter for configuration and LiquidJS for templating.

### File Format
`prompt@v1.prompt`
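A sketch of what such a file might contain, combining YAML frontmatter with a LiquidJS body (the model identifier is a placeholder and the variable name is illustrative):

```
---
provider: anthropic
model: <model-id>
temperature: 0.3
maxTokens: 1024
---
Summarize the following article in two sentences:

{{ article }}
```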
### Configuration Options
| Option | Type | Description |
|---|---|---|
| `provider` | string | `anthropic`, `openai`, or `azure` |
| `model` | string | Model identifier |
| `temperature` | number | Sampling temperature (0.0-1.0) |
| `maxTokens` | number | Maximum output tokens |
| `providerOptions` | object | Provider-specific options |
### Variables
Use LiquidJS syntax to interpolate variables:
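For example, a prompt body might reference variables with `{{ }}` and use standard LiquidJS control flow (a sketch; variable names are illustrative):

```
You are reviewing a pull request by {{ author }}.

{% if focus %}
Concentrate on {{ focus }}.
{% endif %}

{{ diff }}
```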
### Extended Thinking

Enable extended thinking for complex reasoning:
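A sketch of frontmatter that might enable it via `providerOptions`. The nested `thinking` shape mirrors Anthropic's API (`type: enabled` plus a token budget); whether this package forwards these keys verbatim is an assumption:

```
---
provider: anthropic
model: <model-id>
maxTokens: 8192
providerOptions:
  thinking:
    type: enabled
    budget_tokens: 4096
---
```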
## Providers

### Anthropic
Requires the `ANTHROPIC_API_KEY` environment variable.
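For example, in your shell profile (placeholder value):

```shell
export ANTHROPIC_API_KEY="your-api-key"
```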
### OpenAI
Requires the `OPENAI_API_KEY` environment variable.