Remove hardcoded temperature=0 default in prepareCallSettings — breaks models like Kimi K2.5 #15240

@hikariming

Description

Bug Description

prepareCallSettings() in packages/core/src/core/prompt/prepare-call-settings.ts defaults temperature to 0 when not explicitly set:

```ts
// TODO v5 remove default 0 for temperature
temperature: temperature != null ? temperature : 0,
```

This causes hard failures with models that restrict temperature values. For example, Moonshot's Kimi K2.5 only accepts temperature=1 and rejects any other value with:

400 Bad Request: "invalid temperature: only 1 is allowed for this model"

Since most users don't explicitly set temperature (expecting the model's own default), every agent.stream() / agent.generate() call silently forces temperature=0, which breaks these models.

Reproduction

```ts
import { Agent } from '@mastra/core/agent';
import { createOpenAI } from '@ai-sdk/openai';

const provider = createOpenAI({
  baseURL: 'https://api.moonshot.cn/v1',
  apiKey: '...',
});

const agent = new Agent({
  name: 'test',
  model: provider.chat('kimi-k2-5'),
  instructions: 'You are a helpful assistant.',
});

// Fails: 400 "invalid temperature: only 1 is allowed for this model"
const result = await agent.stream('Hello');
```

Expected Behavior

When temperature is not set by the user, it should be undefined so that the AI SDK provider omits it from the request body, letting the model use its own default.
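To illustrate why leaving the value undefined is sufficient: JavaScript's `JSON.stringify` drops object properties whose value is `undefined`, so a provider that serializes its call settings directly never sends the field at all. A minimal standalone example:

```ts
// When temperature is left undefined, JSON serialization simply omits it,
// so the request body contains no temperature field at all.
const settings = { model: 'kimi-k2-5', temperature: undefined };
const body = JSON.stringify(settings);
// body === '{"model":"kimi-k2-5"}'; no temperature key is sent
```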

Proposed Fix

One-line change — the TODO comment already exists in the code:

```diff
- // TODO v5 remove default 0 for temperature
- temperature: temperature != null ? temperature : 0,
+ temperature,
```

This is safe because:

  • AI SDK providers (@ai-sdk/openai, etc.) correctly strip undefined fields from requests
  • Users who explicitly set temperature: 0 still get deterministic behavior
  • Only the "didn't set anything" case changes: from forced 0 to model default
  • Aligns with how every other parameter (topP, topK, presencePenalty, etc.) is already handled in the same function — none of them have a forced default
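The inconsistency described in the last point can be sketched as follows. This is a minimal illustration, not the actual Mastra source; `prepareCallSettingsSketch` and the trimmed-down `CallSettings` interface are made-up names for the purpose of the example:

```ts
// Minimal sketch (not the actual Mastra source) contrasting the forced
// temperature default with the pass-through handling of other parameters.
interface CallSettings {
  temperature?: number;
  topP?: number;
  topK?: number;
}

function prepareCallSettingsSketch(s: CallSettings): CallSettings {
  return {
    // The bug: an unset temperature becomes 0 instead of staying undefined.
    temperature: s.temperature != null ? s.temperature : 0,
    // Every other parameter is forwarded as-is, with no forced default.
    topP: s.topP,
    topK: s.topK,
  };
}
```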

Current Workaround

We intercept the fetch call and strip temperature: 0 from the request body before sending, which is fragile and shouldn't be necessary.
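Roughly, the workaround looks like this. It is a simplified sketch, not our production code; `stripForcedTemperature` and `patchedFetch` are made-up names:

```ts
// Simplified sketch of the workaround: wrap fetch so that a forced
// temperature: 0 is removed from the outgoing JSON request body.
function stripForcedTemperature(body: string): string {
  const parsed = JSON.parse(body);
  // Only the forced default is stripped. A user who genuinely set
  // temperature: 0 looks identical here, which is why this is fragile.
  if (parsed.temperature === 0) {
    delete parsed.temperature;
  }
  return JSON.stringify(parsed);
}

const patchedFetch: typeof fetch = (input, init) => {
  if (init && typeof init.body === 'string') {
    init = { ...init, body: stripForcedTemperature(init.body) };
  }
  return fetch(input, init);
};
```

The patched function is then injected via the `fetch` option that `@ai-sdk/openai`'s provider settings accept, e.g. `createOpenAI({ baseURL, apiKey, fetch: patchedFetch })`.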

Environment

  • @mastra/core: 1.22.0
  • Model: Kimi K2.5 (Moonshot AI) via @ai-sdk/openai compatible provider

I'm happy to submit a PR for this if the team is open to it — it's a one-line change.

Labels

AI SDK, Agents, bug, effort:low, impact:high, trio-tb, trio-wp
