Commit ae55966 by Aditya Puranik
docs: add guide for remaining LiteLLM work

File changed: `LITELLM_REMAINING_WORK.md` (+160 lines)
# LiteLLM Integration - Remaining Work

## Current Status

**Completed**: The LiteLLM provider works for **Agent blocks** in workflows.

**Pending**: LiteLLM integration with **Copilot** (sim.ai's AI assistant).

---

## What's Done

- LiteLLM provider implementation (`providers/litellm/`)
- API route for model discovery (`/api/providers/litellm/models`)
- Environment variables (`LITELLM_BASE_URL`, `LITELLM_API_KEY`)
- Full tool execution and streaming support
- Provider registered in the store and registry

---
## Remaining: Copilot Integration

The Copilot maintains a hardcoded model list separate from the provider system. To enable LiteLLM models in the Copilot, modify these files:

### 1. Add LiteLLM to valid provider IDs

**File**: `apps/sim/lib/copilot/config.ts`

Add `'litellm'` to `VALID_PROVIDER_IDS`:

```typescript
const VALID_PROVIDER_IDS = [
  'openai',
  'azure-openai',
  'anthropic',
  'google',
  'deepseek',
  'xai',
  'cerebras',
  'mistral',
  'groq',
  'ollama',
  'litellm', // ADD THIS
] as const
```
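As an aside on the `as const` pattern used here: it narrows the array to a readonly tuple of string literals, which is what allows a union type of valid IDs to be derived from it. A minimal sketch (the shortened ID list and the `isValidProviderId` helper are illustrative, not existing code):

```typescript
// `as const` makes this a readonly tuple of string literals,
// so the union type below stays in sync with the array.
const VALID_PROVIDER_IDS = ['openai', 'anthropic', 'litellm'] as const

type ValidProviderId = (typeof VALID_PROVIDER_IDS)[number]
// => 'openai' | 'anthropic' | 'litellm'

// Runtime check that mirrors the compile-time union:
function isValidProviderId(id: string): id is ValidProviderId {
  return (VALID_PROVIDER_IDS as readonly string[]).includes(id)
}
```

Because the union is derived from the array, adding `'litellm'` in one place updates both the runtime check and the type.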
### 2. Update Copilot model options (Frontend)

**File**: `apps/sim/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/copilot/components/user-input/constants.ts`

Option A: add a static LiteLLM entry:

```typescript
export const MODEL_OPTIONS = [
  // ... existing options
  { value: 'litellm/default', label: 'LiteLLM', provider: 'litellm' },
]
```

Option B: make the model list dynamic by fetching it from the provider store.
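Option B could look roughly like the following sketch. The `{ models: string[] }` response shape and the helper names are assumptions for illustration; check them against what the models route actually returns:

```typescript
// Shape of an entry in the Copilot's MODEL_OPTIONS list.
interface ModelOption {
  value: string
  label: string
  provider: string
}

// Pure mapping from raw model IDs to Copilot options; ensures every
// value carries the `litellm/` prefix exactly once.
function toModelOptions(models: string[]): ModelOption[] {
  return models.map((id) => {
    const value = id.startsWith('litellm/') ? id : `litellm/${id}`
    return {
      value,
      label: value.slice('litellm/'.length),
      provider: 'litellm',
    }
  })
}

// Fetch wrapper around the existing model-discovery route.
async function fetchLitellmModelOptions(): Promise<ModelOption[]> {
  const res = await fetch('/api/providers/litellm/models')
  if (!res.ok) return [] // fall back to the static list on failure
  const body: { models: string[] } = await res.json()
  return toModelOptions(body.models)
}
```

The dynamic options could then be appended to the static `MODEL_OPTIONS` when the selector renders.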
### 3. Update API validation schema

**File**: `apps/sim/app/api/copilot/chat/route.ts`

Update `ChatMessageSchema` to accept LiteLLM models. Find the `model` field validation and either:

- add a regex pattern for `litellm/*` models, or
- validate dynamically against the available provider models.
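A minimal sketch of the regex approach (the pattern name and helper are hypothetical; if `ChatMessageSchema` is built with zod, the same pattern can be passed to `z.string().regex(...)` as one branch of a union with the existing literal model values):

```typescript
// Permissive pattern for LiteLLM model IDs: the `litellm/` prefix
// followed by at least one character.
const LITELLM_MODEL_PATTERN = /^litellm\/.+$/

function isLitellmModel(model: string): boolean {
  return LITELLM_MODEL_PATTERN.test(model)
}
```

The pattern deliberately allows any suffix, since the set of models behind a LiteLLM proxy is configured by the operator and not known at build time.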
### 4. Update Copilot state types

**File**: `apps/sim/stores/panel/copilot/types.ts`

Update the `selectedModel` type in `CopilotState` to include the LiteLLM model pattern:

```typescript
selectedModel: 'claude-4.5-opus' | 'claude-4.5-sonnet' | /* ... */ | `litellm/${string}`
```
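The template literal member accepts any `litellm/`-prefixed string, so the union stays type-safe without enumerating every proxy model. A sketch of how this combines with a runtime type guard (the abbreviated union and the `isLitellmSelection` helper are illustrative):

```typescript
// Abbreviated version of the widened selectedModel union.
type SelectedModel =
  | 'claude-4.5-opus'
  | 'claude-4.5-sonnet'
  | `litellm/${string}`

// Narrowing helper so downstream dispatch code stays typed:
function isLitellmSelection(m: SelectedModel): m is `litellm/${string}` {
  return m.startsWith('litellm/')
}
```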
---

## Testing Checklist

After implementing the Copilot integration:

- [ ] LiteLLM models appear in the Copilot model selector
- [ ] A LiteLLM model can be selected and used in Copilot chat
- [ ] Streaming works in the Copilot with LiteLLM
- [ ] Run `bun run lint && bun run type-check`

---
## Submitting the PR

### 1. Push to your fork

```bash
git push origin feat/litellm-provider
```

### 2. Create Pull Request

Go to: https://github.com/simstudioai/sim/compare

- Click "compare across forks"
- Base repository: `simstudioai/sim`
- Base branch: `staging` (NOT `main`)
- Head repository: `adityapuranik99/sim`
- Compare branch: `feat/litellm-provider`
### 3. PR Template

**Title**: `feat(providers): add LiteLLM provider integration`

**Body**:

```markdown
## Summary
- Add LiteLLM as a new provider for Sim
- Enables connecting a LiteLLM proxy to access 100+ LLM providers (including GitHub Copilot)
- Uses the OpenAI-compatible API pattern

## Changes
- New provider: `apps/sim/providers/litellm/`
- New API route: `apps/sim/app/api/providers/litellm/models/`
- Environment variables: `LITELLM_BASE_URL`, `LITELLM_API_KEY`

## Test plan
- [ ] Set `LITELLM_BASE_URL` in .env
- [ ] Verify models appear with the `litellm/` prefix in the Agent block
- [ ] Test chat completion through the Agent block
- [ ] Verify streaming works
- [ ] Run `bun run lint && bun run type-check`

## Note
This PR enables LiteLLM for Agent blocks. Copilot integration can be added in a follow-up PR.
```
---

## Environment Setup

To test LiteLLM locally, add to `apps/sim/.env`:

```bash
LITELLM_BASE_URL=http://localhost:4000
LITELLM_API_KEY=sk-your-key  # optional
```

Start the LiteLLM proxy:

```bash
pip install 'litellm[proxy]'
litellm --model gpt-4o --port 4000
```
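Once the proxy is running, a small script can confirm that the configured URL and key actually reach it. This sketch assumes LiteLLM's OpenAI-compatible `/v1/models` endpoint and a Node version with built-in `fetch` (18+); the function names are illustrative:

```typescript
// Build the Authorization header only when a key is configured,
// mirroring the optional LITELLM_API_KEY above.
function buildAuthHeaders(apiKey?: string): Record<string, string> {
  return apiKey ? { Authorization: `Bearer ${apiKey}` } : {}
}

// List the model IDs the proxy advertises on its OpenAI-compatible
// /v1/models endpoint.
async function listProxyModels(
  baseUrl = 'http://localhost:4000',
  apiKey?: string
): Promise<string[]> {
  const res = await fetch(`${baseUrl}/v1/models`, {
    headers: buildAuthHeaders(apiKey),
  })
  if (!res.ok) throw new Error(`LiteLLM proxy returned ${res.status}`)
  const body = (await res.json()) as { data: { id: string }[] }
  return body.data.map((m) => m.id)
}
```

With the proxy started as above, `await listProxyModels()` should include the model passed to `litellm --model`.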
---

## Questions?

- Discord: https://discord.gg/Hr4UWYEcTT
- GitHub Issues: https://github.com/simstudioai/sim/issues