2 changes: 2 additions & 0 deletions README.md
@@ -3,6 +3,7 @@
> Modern C++ LLM API client with OpenAI-compatible support

[![C++23](https://img.shields.io/badge/C%2B%2B-23-blue.svg)](https://en.cppreference.com/w/cpp/23)
[![C++17](https://img.shields.io/badge/C%2B%2B-17-blue.svg)](https://en.cppreference.com/w/cpp/17)
[![C API](https://img.shields.io/badge/C_API-ok-green.svg)](https://en.cppreference.com/w/cpp/23)
[![Module](https://img.shields.io/badge/module-ok-green.svg)](https://en.cppreference.com/w/cpp/language/modules)
[![License](https://img.shields.io/badge/license-Apache_2.0-blue.svg)](LICENSE)
@@ -22,6 +23,7 @@ Clean, type-safe LLM API client using C++23 modules. Fluent interface with zero-
- **Fluent Interface** - Chainable methods
- **C API** - Full C language support with OOP style
- **Provider Agnostic** - OpenAI, Poe, and compatible endpoints
- **C++17 Header-only** - Native support via `#include <llmapi.hpp>` (API identical to C++23)

## Quick Start

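The new `C++17 Header-only` bullet above is the headline of this PR: the same fluent API becomes available by including a single header instead of importing the C++23 module. Below is a minimal sketch of that usage, mirroring the names and calls used by the examples added later in this diff (`OPENAI_API_KEY`, `URL::Poe`, `model`/`system`/`user`/`request`/`getAnswer`); the prompt text is illustrative only.

```cpp
// Sketch only: header-only consumption of the same fluent client API.
#include "llmapi.hpp"
#include <cstdlib>
#include <iostream>

int main() {
    using namespace mcpplibs;

    const char* key = std::getenv("OPENAI_API_KEY");
    if (!key) return 1;  // key required; see basic.cpp for the same check

    llmapi::Client client(key, llmapi::URL::Poe);
    client.model("gpt-5")
          .system("You are a helpful assistant.")
          .user("Say hello in one sentence.");

    client.request();                              // non-streaming call
    std::cout << client.getAnswer() << std::endl;  // last answer is kept on the client
    return 0;
}
```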
2 changes: 2 additions & 0 deletions README.zh.hant.md
@@ -3,6 +3,7 @@
> Modern C++ LLM API client with OpenAI-compatible support

[![C++23](https://img.shields.io/badge/C%2B%2B-23-blue.svg)](https://en.cppreference.com/w/cpp/23)
[![C++17](https://img.shields.io/badge/C%2B%2B-17-blue.svg)](https://en.cppreference.com/w/cpp/17)
[![C API](https://img.shields.io/badge/C_API-ok-green.svg)](https://en.cppreference.com/w/cpp/23)
[![Module](https://img.shields.io/badge/module-ok-green.svg)](https://en.cppreference.com/w/cpp/language/modules)
[![License](https://img.shields.io/badge/license-Apache_2.0-blue.svg)](LICENSE)
@@ -22,6 +23,7 @@
- **流式介面** - 可鏈式呼叫的方法
- **C 語言 API** - 完整的 C 語言支援,物件導向風格
- **提供商無關** - OpenAI、Poe 及相容端點
- **C++17 純標頭檔** - 原生支援 C++17,僅需 `#include <llmapi.hpp>`(API 與 C++23 一致)

## 快速開始

2 changes: 2 additions & 0 deletions README.zh.md
@@ -3,6 +3,7 @@
> Modern C++ LLM API client with OpenAI-compatible support

[![C++23](https://img.shields.io/badge/C%2B%2B-23-blue.svg)](https://en.cppreference.com/w/cpp/23)
[![C++17](https://img.shields.io/badge/C%2B%2B-17-blue.svg)](https://en.cppreference.com/w/cpp/17)
[![C API](https://img.shields.io/badge/C_API-ok-green.svg)](https://en.cppreference.com/w/cpp/23)
[![Module](https://img.shields.io/badge/module-ok-green.svg)](https://en.cppreference.com/w/cpp/language/modules)
[![License](https://img.shields.io/badge/license-Apache_2.0-blue.svg)](LICENSE)
@@ -22,6 +23,7 @@
- **流式接口** - 可链式调用的方法
- **C 语言 API** - 完整的 C 语言支持,面向对象风格
- **提供商无关** - OpenAI、Poe 及兼容端点
- **C++17 纯头文件** - 原生支持 C++17,仅需 `#include <llmapi.hpp>`(API 与 C++23 一致)

## 快速开始

69 changes: 69 additions & 0 deletions examples/cxx17/basic.cpp
@@ -0,0 +1,69 @@
// Basic usage example - demonstrates both streaming and non-streaming modes
#include "llmapi.hpp"
#include <iostream>
#include <cstdlib>

using namespace mcpplibs;

int main() {
    auto api_key = std::getenv("OPENAI_API_KEY");
    if (!api_key) {
        std::cout << "Error: OPENAI_API_KEY not set" << std::endl;
        return 1;
    }

    llmapi::Client client(api_key, llmapi::URL::Poe);
    client.model("gpt-5").system("You are a helpful assistant.");

    std::cout << "=== llmapi Basic Usage Demo ===\n" << std::endl;

    try {
        // Example 1: Non-streaming request
        std::cout << "[Example 1] Non-streaming mode:" << std::endl;
        std::cout << "Question: What is the capital of China?\n" << std::endl;

        client.user("What is the capital of China?");
        client.request();

        std::cout << "Answer: " << client.getAnswer() << "\n" << std::endl;

        // Example 2: Streaming request
        std::cout << "[Example 2] Streaming mode:" << std::endl;
        std::cout << "Question: Convince me to use modern C++ (100 words)\n" << std::endl;

        client.user("Convince me to use modern C++ (100 words)");
        std::cout << "Answer: ";

        client.request([](std::string_view chunk) {
            std::cout << chunk;
            std::cout.flush();
        });

        std::cout << "\n" << std::endl;

        // Verify auto-save: get the last answer
        auto last_answer = client.getAnswer();
        std::cout << "[Verification] Last answer length: " << last_answer.size() << " chars\n" << std::endl;

        // Example 3: Translate the previous answer to Chinese
        std::cout << "[Example 3] Translation (streaming):" << std::endl;
        std::cout << "Question: 请把上面的回答翻译成中文。\n" << std::endl;

        client.user("请把上面的回答翻译成中文。");
std::cout << "Answer: ";

client.request([](std::string_view chunk) {
std::cout << chunk;
std::cout.flush();
});

std::cout << "\n" << std::endl;

} catch (const std::exception& e) {
std::cout << "\nError: " << e.what() << std::endl;
return 1;
}

std::cout << "=== Demo Complete ===" << std::endl;
return 0;
}
49 changes: 49 additions & 0 deletions examples/cxx17/chat.cpp
@@ -0,0 +1,49 @@
// Simple and elegant AI chat CLI tool using streaming
#include "llmapi.hpp"
#include <iostream>
#include <cstdlib>

using namespace mcpplibs;

int main() {
    auto api_key = std::getenv("OPENAI_API_KEY");
    if (!api_key) {
        std::cout << "Error: OPENAI_API_KEY not set" << std::endl;
        return 1;
    }

    llmapi::Client client(api_key, llmapi::URL::Poe);
    client.model("gpt-5").system("You are a helpful assistant.");

    std::cout << "AI Chat CLI - Type 'quit' to exit\n" << std::endl;

    while (true) {
        std::cout << "You: ";
        std::string input;
        if (!std::getline(std::cin, input)) break;  // stop on EOF (e.g. Ctrl-D)

if (input == "quit" || input == "q") {
std::cout << "\nBye!" << std::endl;
break;
}

if (input.empty()) continue;

try {
client.user(input);
std::cout << "\nAI: ";

client.request([](std::string_view chunk) {
std::cout << chunk;
std::cout.flush();
});

std::cout << "\n" << std::endl;

} catch (const std::exception& e) {
std::cout << "\nError: " << e.what() << std::endl;
}
}

return 0;
}
20 changes: 20 additions & 0 deletions examples/cxx17/hello_mcpp.cpp
@@ -0,0 +1,20 @@
// Minimal example - simplest way to use llmapi
#include "llmapi.hpp"
#include <iostream>
#include <cstdlib>

int main() {
    using namespace mcpplibs;

    // Note: std::getenv returns nullptr if OPENAI_API_KEY is unset; see basic.cpp for a checked version.
    llmapi::Client client(std::getenv("OPENAI_API_KEY"), llmapi::URL::Poe);

    client.model("gpt-5")
        .system("You are a helpful assistant.")
        .user("In one sentence, introduce modern C++. 并给出中文翻译")
        .request([](std::string_view chunk) {
            std::cout << chunk;
            std::cout.flush();
        });

    return 0;
}
17 changes: 17 additions & 0 deletions examples/cxx17/xmake.lua
@@ -0,0 +1,17 @@
target("cxx17_hello_mcpp")
set_kind("binary")
add_files("hello_mcpp.cpp")
set_languages("c++17")
add_deps("llmapi_cxx17")

target("cxx17_basic")
set_kind("binary")
add_files("basic.cpp")
set_languages("c++17")
add_deps("llmapi_cxx17")

target("cxx17_chat")
set_kind("binary")
add_files("chat.cpp")
set_languages("c++17")
add_deps("llmapi_cxx17")
1 change: 1 addition & 0 deletions examples/xmake.lua
@@ -14,3 +14,4 @@ target("chat")
add_deps("llmapi")

includes("c")
includes("cxx17")
12 changes: 12 additions & 0 deletions include/llmapi.hpp
@@ -0,0 +1,12 @@
#pragma once

#include <json.hpp>
#include "llmapi/url.hpp"
#include "llmapi/openai.hpp"

namespace mcpplibs::llmapi {
using OpenAI = openai::OpenAI;
using Client = openai::OpenAI;
using URL = llmapi::URL;
using Json = nlohmann::json;
} // namespace mcpplibs::llmapi
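This wrapper header only aliases the existing OpenAI client into the `mcpplibs::llmapi` namespace, so `Client` and `OpenAI` name the same type and `Json` is nlohmann's json. A small illustrative sketch of what those aliases imply for callers; the `static_assert` and the `payload` variable are examples, not part of the library:

```cpp
// Sketch: consequences of the aliases in llmapi.hpp.
#include "llmapi.hpp"
#include <type_traits>

using namespace mcpplibs;

// Client is just a friendlier name for the OpenAI-compatible backend.
static_assert(std::is_same_v<llmapi::Client, llmapi::OpenAI>,
              "Client and OpenAI are the same type");

int main() {
    // llmapi::Json is nlohmann::json, so ordinary JSON literals work.
    llmapi::Json payload = {{"model", "gpt-5"}, {"stream", false}};
    (void)payload;  // illustrative only
    return 0;
}
```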