Environment Reference

The environment tells pupt-lib about the world your prompt will run in — which LLM will receive it, what platform is executing it, and how the output should be formatted. Components use this information to adapt their rendering automatically.

For user-facing documentation, see the Environment & Context guide.

Quick Start

```typescript
import { render, createEnvironment } from 'pupt-lib';

const result = await render(element, {
  env: createEnvironment({
    llm: { model: 'claude-sonnet-4-5' },
    code: { language: 'typescript' },
  }),
});
```

With this environment, the <Prompt> component uses XML delimiters and positive constraint framing, <Role> renders with the Anthropic-preferred "You are " prefix, and <If provider="anthropic"> conditionals evaluate to true.


Environment Sections

The environment groups its configuration into six sections. Each section controls a different aspect of how pupt-lib renders your prompts:

| Section | Purpose | Example Fields |
| --- | --- | --- |
| llm | Target LLM model and provider | model, provider, temperature |
| output | Output formatting preferences | format, trim, indent |
| code | Code generation settings | language, highlight |
| user | Caller/user context | editor |
| runtime | Auto-detected system values | hostname, username, platform, locale |
| prompt | Prompt component defaults | includeRole, delimiter, defaultRole |

LLM Configuration

env.llm — describes the target LLM.

```typescript
interface LlmConfig {
  model: string;
  provider: LlmProvider;
  maxTokens?: number;
  temperature?: number;
}
```

| Field | Type | Default | Validation | Description |
| --- | --- | --- | --- | --- |
| model | string | 'unspecified' | Any string | Model ID (e.g., 'claude-sonnet-4-5', 'gpt-4o'). |
| provider | LlmProvider | 'unspecified' | Must be a valid provider | Model creator. Auto-inferred from model if not set. |
| maxTokens | number | (none) | Positive integer | Maximum output tokens. |
| temperature | number | (none) | 0 to 2 | Sampling temperature. |
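The range checks above can be sketched as a plain function. This is an illustrative stand-in only: the library performs this validation through Zod schemas, and validateLlmConfig here is a hypothetical name.

```typescript
// Hypothetical stand-in for the Zod validation described above.
interface LlmConfigInput {
  model?: string;
  maxTokens?: number;
  temperature?: number;
}

function validateLlmConfig(cfg: LlmConfigInput): void {
  if (cfg.maxTokens !== undefined &&
      (!Number.isInteger(cfg.maxTokens) || cfg.maxTokens <= 0)) {
    throw new RangeError('maxTokens must be a positive integer');
  }
  if (cfg.temperature !== undefined &&
      (cfg.temperature < 0 || cfg.temperature > 2)) {
    throw new RangeError('temperature must be between 0 and 2');
  }
}
```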

Supported Providers

| Provider | Model Patterns |
| --- | --- |
| anthropic | claude*, opus, sonnet, haiku |
| openai | gpt-*, chatgpt-*, o1, o3, o4 (followed by -, _, or end of string) |
| google | gemini* |
| meta | llama* |
| mistral | mistral*, mixtral*, codestral*, pixtral* |
| deepseek | deepseek* |
| xai | grok* |
| cohere | command* |
| unspecified | (default) |
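The matching logic can be sketched as a standalone function. This is an illustrative reimplementation covering only a few of the patterns above, not the library's actual inferProviderFromModel; the treatment of opus/sonnet/haiku as prefixes is an assumption.

```typescript
type Provider = 'anthropic' | 'openai' | 'google' | 'meta' | null;

// Illustrative subset of the pattern table; matching is case-insensitive.
function inferProvider(model: string): Provider {
  const m = model.toLowerCase();
  if (m.startsWith('claude') || /^(opus|sonnet|haiku)/.test(m)) return 'anthropic';
  // o1/o3/o4 must be followed by '-', '_', or the end of the string,
  // so 'o3-mini' matches but an unrelated 'o'-prefixed name does not.
  if (/^(gpt-|chatgpt-)/.test(m) || /^o[134]([-_]|$)/.test(m)) return 'openai';
  if (m.startsWith('gemini')) return 'google';
  if (m.startsWith('llama')) return 'meta';
  return null; // no known pattern matched
}
```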

Provider Inference

When you specify a model but omit provider, createEnvironment infers the provider automatically by matching the model name against the patterns in the table above. This inference runs as a Zod transform on the llm config, so it happens during validation.

```typescript
const env = createEnvironment({ llm: { model: 'claude-sonnet-4-5' } });
console.log(env.llm.provider); // 'anthropic'

const env2 = createEnvironment({ llm: { model: 'gpt-4o' } });
console.log(env2.llm.provider); // 'openai'
```

If your model name doesn't match any known pattern (for example, a fine-tuned or self-hosted model), set the provider explicitly:

```typescript
const env = createEnvironment({
  llm: { model: 'my-custom-model', provider: 'anthropic' },
});
```

Provider Adaptations

Components consult a provider-specific adaptation table to adjust their rendering. The table below shows the defaults for each provider (defined in PROVIDER_ADAPTATIONS):

| Provider | Role Prefix | Constraint Style | Format Preference | Instruction Style |
| --- | --- | --- | --- | --- |
| anthropic | "You are " | positive | xml | structured |
| openai | "You are " | balanced | markdown | direct |
| google | "Your role: " | positive | markdown | direct |
| meta | "You are " | balanced | markdown | direct |
| mistral | "You are " | balanced | markdown | direct |
| deepseek | "You are " | balanced | markdown | structured |
| xai | "You are " | balanced | markdown | direct |
| cohere | "You are " | balanced | markdown | direct |
| unspecified | "You are " | positive | markdown | structured |
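A component's lookup against such a table can be sketched as follows. The entry shape and the fallback to the 'unspecified' row are assumptions based on the columns above; this is not the library's actual PROVIDER_ADAPTATIONS definition.

```typescript
// Assumed shape of one adaptation entry, derived from the table columns.
interface Adaptation {
  rolePrefix: string;
  constraintStyle: 'positive' | 'balanced';
  formatPreference: 'xml' | 'markdown';
  instructionStyle: 'structured' | 'direct';
}

const ADAPTATIONS: Record<string, Adaptation> = {
  anthropic:   { rolePrefix: 'You are ',    constraintStyle: 'positive', formatPreference: 'xml',      instructionStyle: 'structured' },
  google:      { rolePrefix: 'Your role: ', constraintStyle: 'positive', formatPreference: 'markdown', instructionStyle: 'direct' },
  unspecified: { rolePrefix: 'You are ',    constraintStyle: 'positive', formatPreference: 'markdown', instructionStyle: 'structured' },
};

// Unknown providers fall back to the 'unspecified' row (assumed behavior).
function adaptationFor(provider: string): Adaptation {
  return ADAPTATIONS[provider] ?? ADAPTATIONS['unspecified'];
}
```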

Output Configuration

env.output — controls how pupt-lib formats the rendered prompt text.

```typescript
interface OutputConfig {
  format: 'xml' | 'markdown' | 'json' | 'text' | 'unspecified';
  trim: boolean;
  indent: string;
}
```

| Field | Type | Default | Description |
| --- | --- | --- | --- |
| format | 'xml' \| 'markdown' \| 'json' \| 'text' \| 'unspecified' | 'unspecified' | Sets the output format. When set to 'markdown', components use heading-style delimiters (## tag) instead of XML-style (<tag>). |
| trim | boolean | true | Trim whitespace from output. |
| indent | string | '  ' (2 spaces) | Indentation string for indented output. |
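The format-dependent delimiter behavior can be sketched like this. wrapSection is a hypothetical helper for illustration, not part of the library's API.

```typescript
// Hypothetical helper showing how a section might be delimited
// under the two main output formats.
function wrapSection(tag: string, body: string, format: 'xml' | 'markdown'): string {
  if (format === 'markdown') {
    return `## ${tag}\n${body}`;         // heading-style delimiter
  }
  return `<${tag}>\n${body}\n</${tag}>`; // XML-style delimiter
}
```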

Code Configuration

env.code — configures language-specific behavior for code-related prompts.

```typescript
interface CodeConfig {
  language: string;
  highlight?: boolean;
}
```

| Field | Type | Default | Description |
| --- | --- | --- | --- |
| language | string | 'unspecified' | Target language (e.g., 'typescript', 'python'). |
| highlight | boolean | (none) | Request syntax highlighting in output. |

Language conventions (available via LANGUAGE_CONVENTIONS):

Each entry is an array of convention strings that components can incorporate into rendered output.

| Language | Conventions |
| --- | --- |
| typescript | 'Use explicit type annotations', 'Prefer interfaces over type aliases for objects', 'Use async/await over raw promises' |
| python | 'Follow PEP 8 style guide', 'Use type hints', 'Prefer list comprehensions where readable' |
| rust | 'Use idiomatic Rust patterns', 'Handle errors with Result type', 'Prefer references over cloning' |
| go | 'Follow effective Go guidelines', 'Handle errors explicitly', 'Use short variable names in small scopes' |
| unspecified | 'Follow language best practices' |
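A lookup over such a table can be sketched as follows; the fallback to the 'unspecified' entry for unknown languages is an assumption, since the source only documents the table itself.

```typescript
// Subset of LANGUAGE_CONVENTIONS, reproduced from the table above.
const CONVENTIONS: Record<string, string[]> = {
  typescript: [
    'Use explicit type annotations',
    'Prefer interfaces over type aliases for objects',
    'Use async/await over raw promises',
  ],
  unspecified: ['Follow language best practices'],
};

// Assumed fallback: languages without an entry use 'unspecified'.
function conventionsFor(language: string): string[] {
  return CONVENTIONS[language] ?? CONVENTIONS['unspecified'];
}
```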

User Configuration

env.user — context about the person or system running the prompt.

```typescript
interface UserConfig {
  editor: string;
}
```

| Field | Type | Default | Description |
| --- | --- | --- | --- |
| editor | string | 'unknown' | Editor (e.g., 'vscode', 'cursor', 'vim'). Used by <ReviewFile>. |

Runtime Configuration

env.runtime — values that pupt-lib auto-detects from the host system at render time.

```typescript
interface RuntimeConfig {
  hostname: string;
  username: string;
  cwd: string;
  platform: string;
  os: string;
  locale: string;
  timestamp: number;
  date: string;
  time: string;
  uuid: string;
}
```

| Field | Type | Node.js | Browser |
| --- | --- | --- | --- |
| hostname | string | os.hostname() | 'browser' |
| username | string | os.userInfo().username | 'anonymous' |
| cwd | string | process.cwd() | '/' |
| platform | string | 'node' | 'browser' |
| os | string | os.platform() (e.g., 'linux', 'darwin') | 'unknown' |
| locale | string | From LANG/LC_ALL/LC_MESSAGES env vars or Intl API | navigator.language |
| timestamp | number | Date.now() | Date.now() |
| date | string | ISO date YYYY-MM-DD | ISO date YYYY-MM-DD |
| time | string | ISO time HH:MM:SS | ISO time HH:MM:SS |
| uuid | string | crypto.randomUUID() | crypto.randomUUID() |

Each call to render() produces fresh timestamp, date, time, and uuid values. In Node.js, pupt-lib caches system values (hostname, username, os) after the first detection and reuses them on subsequent calls.
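The cache-plus-fresh split can be sketched as follows. getHostname and freshValues are illustrative names standing in for the library's internal detection, and the ISO slicing here uses UTC for simplicity.

```typescript
import { randomUUID } from 'node:crypto';
import * as os from 'node:os';

// Cached after the first detection, mirroring the documented Node.js behavior.
let cachedHostname: string | undefined;

function getHostname(): string {
  if (cachedHostname === undefined) {
    cachedHostname = os.hostname(); // detect once, reuse afterwards
  }
  return cachedHostname;
}

// Recomputed on every call, like timestamp/date/time/uuid in render().
function freshValues() {
  const now = new Date();
  return {
    timestamp: now.getTime(),
    date: now.toISOString().slice(0, 10),  // YYYY-MM-DD
    time: now.toISOString().slice(11, 19), // HH:MM:SS
    uuid: randomUUID(),
  };
}
```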


Prompt Configuration

env.prompt — tells <Prompt> which default sections to auto-generate and how to delimit them.

```typescript
interface PromptConfig {
  includeRole: boolean;
  includeFormat: boolean;
  includeConstraints: boolean;
  includeSuccessCriteria: boolean;
  includeGuardrails: boolean;
  defaultRole: string;
  defaultExpertise: string;
  delimiter: 'xml' | 'markdown' | 'none';
}
```

| Field | Type | Default | Description |
| --- | --- | --- | --- |
| includeRole | boolean | true | Auto-generate <Role> if none provided |
| includeFormat | boolean | true | Auto-generate <Format> if none provided |
| includeConstraints | boolean | true | Auto-generate <Constraints> if none provided |
| includeSuccessCriteria | boolean | false | Auto-generate <SuccessCriteria> |
| includeGuardrails | boolean | false | Auto-generate <Guardrails> |
| defaultRole | string | 'assistant' | Role preset key for auto-generated role |
| defaultExpertise | string | 'general' | Expertise area for auto-generated role |
| delimiter | 'xml' \| 'markdown' \| 'none' | 'xml' | Default delimiter style |
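The include* flags determine which default sections <Prompt> emits; a minimal sketch of that decision, with autoSections as a hypothetical helper rather than a library function:

```typescript
interface PromptFlags {
  includeRole: boolean;
  includeFormat: boolean;
  includeConstraints: boolean;
  includeSuccessCriteria: boolean;
  includeGuardrails: boolean;
}

// Illustrative only: names the sections that would be auto-generated
// for a given flag set, in the order the flags are documented.
function autoSections(flags: PromptFlags): string[] {
  const out: string[] = [];
  if (flags.includeRole) out.push('Role');
  if (flags.includeFormat) out.push('Format');
  if (flags.includeConstraints) out.push('Constraints');
  if (flags.includeSuccessCriteria) out.push('SuccessCriteria');
  if (flags.includeGuardrails) out.push('Guardrails');
  return out;
}
```

With the documented defaults (role, format, and constraints on; the rest off), this yields Role, Format, and Constraints.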

Functions

createEnvironment(overrides?)

Creates a validated EnvironmentContext by merging partial overrides with DEFAULT_ENVIRONMENT.

```typescript
function createEnvironment(overrides?: Partial<EnvironmentContext>): EnvironmentContext
```

This function shallow-merges your overrides into each section of the default environment, then validates every field against the Zod schemas. If any value fails validation, it throws a ZodError. During validation, pupt-lib also infers the provider from the model name if you haven't set one explicitly.

```typescript
// Minimal
const env = createEnvironment({ llm: { model: 'claude-sonnet-4-5' } });

// Multiple sections
const env2 = createEnvironment({
  llm: { model: 'gpt-4o', temperature: 0.5 },
  output: { format: 'markdown' },
  prompt: { includeGuardrails: true },
});

// Invalid values throw
createEnvironment({ llm: { temperature: 5 } });
// ZodError: temperature must be <= 2
```
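The section-wise shallow merge can be sketched like this; mergeEnvironment is a simplified, hypothetical stand-in for what createEnvironment does before validation:

```typescript
type Sections = Record<string, Record<string, unknown>>;

// Merge each section's overrides over its defaults. Values inside a
// section are replaced, not deep-merged, matching the documented
// shallow-merge behavior.
function mergeEnvironment(defaults: Sections, overrides: Partial<Sections>): Sections {
  const result: Sections = {};
  for (const key of Object.keys(defaults)) {
    result[key] = { ...defaults[key], ...(overrides[key] ?? {}) };
  }
  return result;
}
```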

inferProviderFromModel(model)

Infers the LLM provider from a model name string. The matching is case-insensitive, so 'Claude-Sonnet-4-5' and 'claude-sonnet-4-5' both return 'anthropic'.

```typescript
function inferProviderFromModel(model: string): LlmProvider | null
```

```typescript
inferProviderFromModel('claude-sonnet-4-5');   // 'anthropic'
inferProviderFromModel('gpt-4o');              // 'openai'
inferProviderFromModel('gemini-pro');          // 'google'
inferProviderFromModel('my-fine-tuned-model'); // null
```

createRuntimeConfig()

Creates a RuntimeConfig with auto-detected system values. The render() function calls this internally on every invocation to capture fresh timestamps and system state.

```typescript
function createRuntimeConfig(): RuntimeConfig
```

ensureRuntimeCacheReady()

Waits for the Node.js runtime cache to finish initializing. In browser environments, this resolves immediately. You'll mainly use this in tests to guarantee that system values like hostname and username are available before assertions run.

```typescript
async function ensureRuntimeCacheReady(): Promise<void>
```
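The awaitable-initialization pattern behind such a function can be sketched as follows; ensureCacheReady and initCache are illustrative names, not the library's internals.

```typescript
// Illustrative sketch: one shared init promise that all callers await.
let ready: Promise<void> | undefined;

async function initCache(): Promise<void> {
  // Detect and cache system values here (elided in this sketch).
}

function ensureCacheReady(): Promise<void> {
  if (!ready) {
    ready = initCache(); // start initialization once
  }
  return ready;          // every caller awaits the same promise
}
```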

Constants

DEFAULT_ENVIRONMENT

```typescript
{
  llm: { model: 'unspecified', provider: 'unspecified' },
  output: { format: 'unspecified', trim: true, indent: '  ' },
  code: { language: 'unspecified' },
  user: { editor: 'unknown' },
  runtime: {},  // auto-populated at render time
  prompt: {
    includeRole: true,
    includeFormat: true,
    includeConstraints: true,
    includeSuccessCriteria: false,
    includeGuardrails: false,
    defaultRole: 'assistant',
    defaultExpertise: 'general',
    delimiter: 'xml',
  },
}
```

LLM_PROVIDERS

```typescript
const LLM_PROVIDERS = [
  'anthropic', 'openai', 'google', 'meta', 'mistral',
  'deepseek', 'xai', 'cohere', 'unspecified',
] as const;
```

Using the Environment in Components

Class Components

You access the environment through context.env in the render method:

```typescript
class PlatformNote extends Component<{ children: PuptNode }> {
  render(props: { children: PuptNode }, _resolved: void, context: RenderContext): PuptNode {
    const { platform } = context.env.runtime;
    const { provider } = context.env.llm;

    return [
      props.children,
      `\n(Running on ${platform}, targeting ${provider})`,
    ];
  }
}
```

The Component base class provides convenience methods:

```typescript
class MyComponent extends Component<{ children: PuptNode }> {
  render(props: { children: PuptNode }, _resolved: void, context: RenderContext): PuptNode {
    const provider = this.getProvider(context);    // e.g., 'anthropic'
    const delimiter = this.getDelimiter(context);  // 'xml' or 'markdown'
    return props.children;
  }
}
```

Function Components

```typescript
function Greeting(props: { name: string }, context?: RenderContext): string {
  const locale = context?.env.runtime.locale ?? 'en-US';

  if (locale.startsWith('es')) {
    return `Hola, ${props.name}!`;
  }
  return `Hello, ${props.name}!`;
}
```

Released under the MIT License.