Bug Description
`prepareCallSettings()` in `packages/core/src/core/prompt/prepare-call-settings.ts` defaults `temperature` to `0` when it is not explicitly set:

```ts
// TODO v5 remove default 0 for temperature
temperature: temperature != null ? temperature : 0,
```
This causes hard failures with models that restrict temperature values. For example, Moonshot's Kimi K2.5 only accepts `temperature=1` and rejects any other value with:

```
400 Bad Request: "invalid temperature: only 1 is allowed for this model"
```
Since most users don't explicitly set `temperature` (expecting the model's own default), every `agent.stream()` / `agent.generate()` call silently forces `temperature=0`, which breaks these models.
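The defaulting branch can be reproduced in isolation. This is a minimal sketch; `prepareTemperature` is a hypothetical stand-in for the relevant line of `prepareCallSettings()`, not the actual Mastra code:

```typescript
// Hypothetical stand-in for the defaulting branch in prepareCallSettings().
function prepareTemperature(temperature?: number): number | undefined {
  // Current behavior: an unset temperature is coerced to 0.
  return temperature != null ? temperature : 0;
}

console.log(prepareTemperature());  // 0 -- this is what reaches the provider
console.log(prepareTemperature(1)); // 1
```

The "unset" case is indistinguishable from an explicit `0` by the time the request is built, which is the root of the bug.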
Reproduction
```ts
import { Agent } from '@mastra/core/agent';
import { createOpenAI } from '@ai-sdk/openai';

const provider = createOpenAI({
  baseURL: 'https://api.moonshot.cn/v1',
  apiKey: '...',
});

const agent = new Agent({
  name: 'test',
  model: provider.chat('kimi-k2-5'),
  instructions: 'You are a helpful assistant.',
});

// Fails: 400 "invalid temperature: only 1 is allowed for this model"
const result = await agent.stream('Hello');
```
Expected Behavior
When `temperature` is not set by the user, it should be `undefined` so that the AI SDK provider omits it from the request body, letting the model use its own default.
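Leaving the field `undefined` is sufficient because JSON serialization drops `undefined` properties, so the provider's request body simply omits `temperature` rather than sending a value:

```typescript
// undefined properties are dropped by JSON.stringify, so the serialized
// request body contains no "temperature" key at all.
const body = JSON.stringify({
  model: 'kimi-k2-5',
  temperature: undefined,
  messages: [],
});

console.log(body); // {"model":"kimi-k2-5","messages":[]}
```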
Proposed Fix
One-line change — the TODO comment already exists in the code:

```diff
- // TODO v5 remove default 0 for temperature
- temperature: temperature != null ? temperature : 0,
+ temperature,
```
This is safe because:
- AI SDK providers (`@ai-sdk/openai`, etc.) correctly strip `undefined` fields from requests
- Users who explicitly set `temperature: 0` still get deterministic behavior
- Only the "didn't set anything" case changes: from a forced `0` to the model's default
- It aligns with how every other parameter (`topP`, `topK`, `presencePenalty`, etc.) is already handled in the same function — none of them has a forced default
Current Workaround
We intercept the `fetch` call and strip `temperature: 0` from the request body before sending, which is fragile and shouldn't be necessary.
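For reference, a hedged sketch of that workaround (not our exact code): strip a forced `temperature: 0` from a JSON body, then apply it via a `fetch` wrapper, which `createOpenAI` accepts through its `fetch` option:

```typescript
// Remove a forced temperature: 0 from a serialized JSON request body.
function stripForcedTemperature(json: string): string {
  const body = JSON.parse(json);
  if (body.temperature === 0) delete body.temperature;
  return JSON.stringify(body);
}

// Wrapper applying the transform; can be passed as
// createOpenAI({ baseURL, apiKey, fetch: patchedFetch }).
const patchedFetch: typeof fetch = (input, init) => {
  if (init && typeof init.body === 'string') {
    try {
      init = { ...init, body: stripForcedTemperature(init.body) };
    } catch {
      // Body wasn't JSON; leave the request untouched.
    }
  }
  return fetch(input, init);
};
```

The obvious flaw, and why this is only a stopgap: it also strips a `temperature: 0` that a user set deliberately.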
Environment
- `@mastra/core`: 1.22.0
- Model: Kimi K2.5 (Moonshot AI) via an `@ai-sdk/openai`-compatible provider
I'm happy to submit a PR for this if the team is open to it — it's a one-line change.