[Bug]: gpt-5.3-codex-spark has 128k max context not 272k #2963

@curvedinf

Description

Bug Description

Using gpt-5.3-codex-spark with Forge leads to out-of-context errors because the model's maximum context size is configured incorrectly.

Spark is a model hosted on Cerebras hardware (wafer-scale inference chips), which has limited on-die memory. OpenAI specifies a 128k context size for the model here: https://openai.com/index/introducing-gpt-5-3-codex-spark/

Forge currently reports a 272k context window for this model, so sessions that use it eventually exceed the model's actual 128k limit.
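The failure mode can be illustrated with a minimal sketch (the registry dicts and the `fits_in_context` helper below are hypothetical, not Forge's actual code): a client that trusts the misconfigured 272k limit accepts prompts the 128k model will reject.

```python
# Hypothetical sketch of the bug; these names are illustrative only and
# do not reflect Forge's real configuration schema.
CONFIGURED_MAX_CONTEXT = {"gpt-5.3-codex-spark": 272_000}  # what Forge reports
DOCUMENTED_MAX_CONTEXT = {"gpt-5.3-codex-spark": 128_000}  # what OpenAI documents

def fits_in_context(model: str, prompt_tokens: int) -> bool:
    """Check a prompt against the configured (incorrect) limit."""
    return prompt_tokens <= CONFIGURED_MAX_CONTEXT[model]

# A 200k-token prompt passes the local check but exceeds the model's real
# limit, so the subsequent API call fails with an out-of-context error.
prompt_tokens = 200_000
assert fits_in_context("gpt-5.3-codex-spark", prompt_tokens)
assert prompt_tokens > DOCUMENTED_MAX_CONTEXT["gpt-5.3-codex-spark"]
```

Correcting the configured value to 128k would make the local check reject (or trigger compaction of) oversized prompts before they reach the API.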

Steps to Reproduce

Select gpt-5.3-codex-spark and work on a project until the accumulated conversation exceeds 128k tokens.

Expected Behavior

Forge should respect the model's 128k context limit and not produce out-of-context errors.

Actual Behavior

Forge exceeds the model's context limit and the request fails.

Forge Version

2.9.8

Operating System & Version

No response

AI Provider

OpenAI

Model

gpt-5.3-codex-spark

Installation Method

Other

Configuration

Labels

type: bug (Something isn't working.)
