[FEATURE] Support Together AI as a provider #511

@yangtheman

Description

Scope check

  • This is core LLM communication (not application logic)
  • This benefits most users (not just my use case)
  • This can't be solved in application code with current RubyLLM
  • I read the Contributing Guide

Due diligence

  • I searched existing issues
  • I checked the documentation

What problem does this solve?

Together AI is one of the leading inference providers for open-source models, similar to OpenRouter except that it does not serve closed models. Supporting Together AI would therefore broaden RubyLLM's adoption.

Proposed solution

Add Together as a provider, supporting only text-based open-source models for now. Additional modalities can be added in follow-up PRs to keep the initial PR small and roll out functionality gradually.
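Since Together's inference API is OpenAI-compatible, the provider mostly needs a different base URL and auth header. A minimal sketch of the request shape (the base URL, endpoint path, and model name here are assumptions about Together's public API, not existing RubyLLM code):

```ruby
require "net/http"
require "json"
require "uri"

# Assumed Together base URL (OpenAI-compatible surface).
TOGETHER_BASE = URI("https://api.together.xyz/v1")

# Builds (but does not send) a chat-completions request against Together.
def build_together_request(model:, messages:, api_key:)
  uri = URI.join("#{TOGETHER_BASE}/", "chat/completions")
  request = Net::HTTP::Post.new(uri)
  request["Authorization"] = "Bearer #{api_key}"
  request["Content-Type"]  = "application/json"
  request.body = JSON.generate(model: model, messages: messages)
  request
end

request = build_together_request(
  model: "meta-llama/Llama-3.3-70B-Instruct-Turbo", # illustrative model id
  messages: [{ role: "user", content: "Hello" }],
  api_key: ENV.fetch("TOGETHER_API_KEY", "sk-test")
)
# To actually send it:
# Net::HTTP.start(request.uri.host, request.uri.port, use_ssl: true) { |http| http.request(request) }
```

Because the payload matches OpenAI's chat-completions schema, most of the existing OpenAI provider's request/response handling should be reusable.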

Why this belongs in RubyLLM

While Together's HTTP API is OpenAI-compatible, pointing openai_api_key and openai_api_base at Together's API key and base URL means you can no longer reach OpenAI itself from RubyLLM. Adding Together as a first-class provider lets you use both the open models on Together and OpenAI in the same application.
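Concretely, the workaround today repurposes the single OpenAI configuration slot (a config fragment, assuming Together's base URL; the openai_* settings are the ones named above):

```ruby
RubyLLM.configure do |config|
  # Pointing the OpenAI slot at Together works for Together's models...
  config.openai_api_key  = ENV["TOGETHER_API_KEY"]
  config.openai_api_base = "https://api.together.xyz/v1" # assumed base URL
end
# ...but OpenAI proper is now unreachable in this process, because both
# providers would compete for the same openai_* settings. A dedicated
# Together provider with its own key and base removes that conflict.
```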

Metadata

Labels: enhancement (New feature or request)
