Bug Description
Using gpt-5.3-codex-spark with Forge leads to out-of-context errors due to an incorrect max-context configuration.
Spark is hosted on Cerebras hardware (wafer-scale inference chips), which has limited on-chip memory. OpenAI specifies a 128k context window for the model here: https://openai.com/index/introducing-gpt-5-3-codex-spark/
Forge currently reports a 272k context window for this model, so continued use eventually pushes the conversation past the model's actual limit.
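As a rough illustration of the fix, a client can clamp its configured window to the model's true limit before building requests. This is only a sketch: the 128k and 272k figures come from this report, and the helper name is hypothetical, not Forge's actual API.

```python
# Hypothetical sketch of the expected behavior; names are illustrative.
TRUE_CONTEXT_LIMIT = 128_000   # per OpenAI's announcement for gpt-5.3-codex-spark
MISCONFIGURED_LIMIT = 272_000  # what Forge currently reports for the model

def effective_window(configured: int, true_limit: int) -> int:
    """Return the usable context window, never exceeding the model's real limit."""
    return min(configured, true_limit)

# With the misconfigured value, Forge permits conversations up to 272k tokens,
# but the model rejects anything past 128k. Clamping avoids that:
assert effective_window(MISCONFIGURED_LIMIT, TRUE_CONTEXT_LIMIT) == 128_000
```

The same clamp applied at request-build time would also leave headroom for the model's response tokens.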
Steps to Reproduce
Use gpt-5.3-codex-spark with Forge on a project until the conversation grows past 128k tokens.
Expected Behavior
Forge should cap the context at the model's 128k limit and not produce out-of-context errors.
Actual Behavior
Forge lets the context grow toward 272k and the model returns out-of-context errors.
Forge Version
2.9.8
Operating System & Version
No response
AI Provider
OpenAI
Model
gpt-5.3-codex-spark
Installation Method
Other
Configuration