[BUG] Stop does not work on OpenAI Compatible API Provider #10779

@BluePeer

Description

Problem (one or two sentences)

I use a llama.cpp router setup, configured in Roo Code as an OpenAI Compatible API Provider.
If you send a prompt, you are currently not able to stop the inference on the server by pressing the UI Stop button.

It looks like a UI / Roo Code internal issue: the stop request is never sent to the llama.cpp server.
The general OpenAI API timeout (for example, on long prompt processing) works: Roo Code emits a stop, the server stops processing and restarts with the freshly initialized new request. But when the Stop button is pressed, Roo Code does not emit the stop to the API and leaves the llama.cpp server with a useless running inference. Tested with different backends (llama.cpp and LM Studio), and the result is always the same: timeout-related stop requests from Roo Code internals stop the inference processing, but a manual Stop button press does not.

Context (who is affected and when)

  • OpenAI Compatible API Provider (custom base URL)
  • Multiple backends (only tested with llama.cpp and LM Studio)

Reproduction steps

Send a prompt to an OpenAI Compatible custom base URL and press the Stop button in the chat input.
The UI shows stopped, but the inference is still running (no stop request is received).

Expected result

The stop request is emitted to the API.
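The expected behavior can be sketched as follows. This is a hypothetical illustration, not Roo Code's actual implementation: the names `fakeTokenStream` and `runWithStop` are invented, and the assumption is that the client passes an `AbortSignal` down to the HTTP layer (e.g. `fetch` or the OpenAI SDK), so that aborting closes the connection and lets llama.cpp / LM Studio cancel the running inference.

```typescript
// Hypothetical sketch of the expected Stop behavior. In a real client the
// AbortSignal would be handed to fetch()/the OpenAI SDK so the connection
// closes; the server cancels inference when the stream consumer goes away.

async function* fakeTokenStream(signal: AbortSignal): AsyncGenerator<string> {
  const tokens = ["The", " quick", " brown", " fox"];
  for (const t of tokens) {
    // In a real client this check happens while awaiting the next SSE chunk.
    if (signal.aborted) throw new Error("request aborted");
    yield t;
  }
}

export async function runWithStop(): Promise<string> {
  const controller = new AbortController();
  const received: string[] = [];
  try {
    for await (const token of fakeTokenStream(controller.signal)) {
      received.push(token);
      // Simulate the user pressing the Stop button after two tokens:
      if (received.length === 2) controller.abort();
    }
  } catch {
    // Stream ended by cancellation; the server-side inference stops too.
  }
  return received.join("");
}
```

The bug described above is, in these terms, that pressing Stop never calls `controller.abort()` on the in-flight request for the OpenAI Compatible provider, so the server keeps generating.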

Actual result

The inference is still running; the API receives no stop request.

Variations tried (optional)

No response

App Version

3.41.1

API Provider (optional)

OpenAI Compatible

Model Used (optional)

Any model

Roo Code Task Links (optional)

No response

Relevant logs or errors (optional)

Metadata

Assignees

No one assigned

    Labels

    bug (Something isn't working)

    Status

    Triage

    Milestone

    No milestone
