A Node.js CLI that uses Ollama and LM Studio models (Llava, Gemma, Llama etc.) to intelligently rename files by their contents
visionOS examples ⸺ Spatial Computing Accelerators for Apple Vision Pro
Efficient visual programming for AI language models
LLMX: the easiest third-party local LLM UI for the web!
MESH-AI — Off-Grid AI + Mesh Router for Meshtastic: seamlessly connect LM Studio, Ollama, OpenAI, third-party APIs, and Home Assistant to your LoRa mesh. Supports custom commands, Twilio SMS (inbound/outbound), Discord channel routing, and GPS emergency alerts via SMS, email, or Discord, plus much more.
The Deepseek API wrapper for Delphi leverages Deepseek’s advanced models to deliver powerful capabilities for seamless and dynamic conversational interactions, including a model optimized for reasoning, and now also supports running local models through an LM Studio server.
The GenAI API wrapper for Delphi seamlessly integrates OpenAI's latest models (o4, gpt-4.1, and gpt-5), delivering robust support for agent chats/responses, text generation, vision, audio analysis, JSON configuration, web search, asynchronous operations, and video (SORA-2, SORA-2-pro), plus image generation with gpt-image-1.
Serverless single HTML page access to an OpenAI API compatible Local LLM
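Several of the projects above talk to an OpenAI-compatible local server such as the one LM Studio exposes. As a minimal sketch, assuming LM Studio's default endpoint of `http://localhost:1234/v1` and a placeholder model name (both are assumptions — adjust to your setup), a chat request from plain JavaScript looks like this:

```javascript
// Assumed default base URL for LM Studio's local OpenAI-compatible server.
const BASE_URL = "http://localhost:1234/v1";

// Build the fetch options for an OpenAI-style /chat/completions request.
function buildChatRequest(model, userMessage) {
  return {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model, // whatever model identifier your local server reports
      messages: [{ role: "user", content: userMessage }],
    }),
  };
}

// Send a single-turn chat message and return the assistant's reply text.
async function chat(userMessage, model = "local-model") {
  const res = await fetch(`${BASE_URL}/chat/completions`,
    buildChatRequest(model, userMessage));
  const data = await res.json();
  return data.choices[0].message.content;
}
```

Because the server speaks the OpenAI wire format, the same request shape works against Ollama's compatible endpoint or a hosted API by changing only `BASE_URL` and the model name.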
Soupy is a Discord bot that uses Flux and LM Studio. It chats, generates images for your users, and has other fun features.
Your offline AI coding assistant in the terminal, using Ollama and LM Studio
A local browser automation agent based on Microsoft's Fara-7B model, optimized for LM Studio inference.
AI persona defined in a JSON file
MikroTik RouterOS dashboard with a built-in AI assistant, analytics, and a network map. Supports Claude, LM Studio, and Cloudflare AI Agent models.
MCP prompt tool applying Chain-of-Draft (CoD) reasoning - BYOLLM
Handy tools for working with audio locally.