Support Bailing LLM from ALIPAY #3487

@cuauty

We would like our LLM, named Bailing, to become one of the optional LLMs on chat.lmsys.org and to join the chat arena on the website. We have set up our own HTTP endpoint for the LLM's inference, Bailing is already compatible with the OpenAI client, and I have passed the tests described in the FastChat documentation in my local environment.

I can open a PR to submit my code to fastchat/serve/api_provider.py. Is that all I need to do? Thank you.
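For reference, here is a minimal sketch of what "compatible with the OpenAI client" means at the wire level: an OpenAI-compatible server must accept a `POST /v1/chat/completions` request with a JSON body containing `model` and `messages`. The base URL, model name, and API key below are placeholders, not the real Bailing endpoint.

```python
import json
import urllib.request

# Hypothetical values -- the real Bailing endpoint and model name
# are not given in this issue, so these are placeholders.
BASE_URL = "https://example.com/v1"
MODEL = "bailing"

def build_chat_request(prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible /chat/completions request."""
    body = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer YOUR_API_KEY",  # placeholder
        },
        method="POST",
    )

# Sending this request (urllib.request.urlopen) against a server that
# implements the OpenAI chat-completions schema should return a JSON
# response with a "choices" list.
req = build_chat_request("Hello!")
```

Passing this shape of request through a local test, as the FastChat documentation describes, is the usual way to confirm compatibility before wiring a new provider into api_provider.py.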
