
Output from OpenAI gpt-oss-20b and gpt-oss-120b is not in the correct format #8037

@vaaale

Description


LocalAI version:
3.9.0

Environment, CPU architecture, OS, and Version:
Not relevant

Describe the bug
The OpenAI gpt-oss-* models generate output in the "harmony" format. LocalAI does not parse this format into an OpenAI-compatible response: instead of returning the reasoning in the "reasoning_content" field of the response, it is placed in the regular "content" field, breaking any application that tries to use these models. The difference is illustrated below.
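
Illustrative only (field contents are made-up examples, not actual model output): the shape of the assistant message currently returned vs. what an OpenAI-compatible client expects.

```python
# Observed today: harmony channel markers and reasoning leak into "content".
actual_message = {
    "role": "assistant",
    "content": "<|channel|>analysis<|message|>Let me think...<|end|>"
               "<|start|>assistant<|channel|>final<|message|>The answer is 4.",
}

# Expected: reasoning split out, only the final answer in "content".
expected_message = {
    "role": "assistant",
    "reasoning_content": "Let me think...",  # harmony "analysis" channel
    "content": "The answer is 4.",           # harmony "final" channel
}
```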

To Reproduce
Run gpt-oss-20b or gpt-oss-120b and send a chat completion request, e.g. as in the sketch below.
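
A minimal reproduction sketch, assuming LocalAI serves its OpenAI-compatible endpoint on localhost:8080 and the model is registered as "gpt-oss-20b" (adjust base_url and model to your setup):

```python
from openai import OpenAI

# LocalAI exposes an OpenAI-compatible API, so the standard client works.
client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")

resp = client.chat.completions.create(
    model="gpt-oss-20b",
    messages=[{"role": "user", "content": "What is 2 + 2?"}],
)

msg = resp.choices[0].message
# Expected: reasoning in reasoning_content, answer in content.
# Observed: harmony-formatted text (analysis + final channels) in content.
print("content:", msg.content)
print("reasoning_content:", getattr(msg, "reasoning_content", None))
```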

Expected behavior
LocalAI should return the response according to the OpenAI specification: the harmony "analysis" channel in "reasoning_content" and only the "final" channel in "content". A rough sketch of the required split follows.
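
A minimal sketch of the expected post-processing, not LocalAI's actual implementation. The marker strings are assumed from the published harmony format and may need adjusting:

```python
import re

def split_harmony_output(raw: str) -> tuple[str, str]:
    """Split raw harmony-formatted output into (reasoning, final content)."""
    reasoning_parts: list[str] = []
    final_parts: list[str] = []
    # Segments look roughly like:
    #   <|channel|>analysis<|message|>...<|end|>
    #   <|channel|>final<|message|>...<|return|>
    pattern = re.compile(
        r"<\|channel\|>(\w+)<\|message\|>(.*?)(?:<\|end\|>|<\|return\|>|$)",
        re.DOTALL,
    )
    for channel, text in pattern.findall(raw):
        if channel == "analysis":
            reasoning_parts.append(text)
        elif channel == "final":
            final_parts.append(text)
    return "".join(reasoning_parts).strip(), "".join(final_parts).strip()

raw = (
    "<|channel|>analysis<|message|>The user asks for 2+2; that is 4.<|end|>"
    "<|start|>assistant<|channel|>final<|message|>2 + 2 = 4.<|return|>"
)
reasoning, content = split_harmony_output(raw)
print("reasoning_content:", reasoning)  # -> The user asks for 2+2; that is 4.
print("content:", content)              # -> 2 + 2 = 4.
```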

Logs
N/A

Additional context
N/A
