
Conversation


rizalibnu commented Jan 27, 2026

Description

Adds a new bare React Native example application demonstrating LLM chat functionality using Executorch. This example provides a complete reference implementation for running local LLM inference on mobile devices through a simple chat interface.

Key features:

  • Local LLM inference using Executorch
  • Simple chat UI with message history
  • Model loading and inference pipeline
  • Error handling and user feedback
  • Compatible with both iOS and Android

Introduces a breaking change?

  • Yes
  • No

Type of change

  • Bug fix (change which fixes an issue)
  • New feature (change which adds functionality)
  • Documentation update (improves or adds clarity to existing documentation)
  • Other (chores, tests, code style improvements etc.)

Tested on

  • iOS
  • Android

Testing instructions

  1. Navigate to the example directory:

     ```bash
     cd apps/bare_rn
     ```

  2. Install dependencies:

     ```bash
     yarn install
     ```

  3. Run on iOS:

     ```bash
     npx pod-install
     yarn ios
     ```

     Or run on Android:

     ```bash
     yarn android
     ```

  4. Verify the app launches and displays the chat interface.

  5. Test message sending and model inference (requires model file setup).

  2. Test message sending and model inference (requires model file setup)

Screenshots


[Screenshot: chat interface, 2026-01-28 at 02:06:29]

Related issues

This PR provides an example app for PR #759

Checklist

  • I have performed a self-review of my code
  • I have commented my code, particularly in hard-to-understand areas
  • I have updated the documentation accordingly
  • My changes generate no new warnings

Additional notes

This example app was generated using `npx @react-native-community/cli@latest init bare_rn --version 0.81.5 --pm yarn` and follows the bare React Native project structure (not Expo). It serves as a foundational example showing how to integrate Executorch for on-device LLM inference in a bare React Native application.

Why this example is not included in the yarn workspace:

The bare React Native example is maintained outside the yarn workspace structure for two reasons:

  1. Native Module Resolution Issues with Background Downloader:
    Using the workspace monorepo breaks the Android app's integration with @kesha-antonov/react-native-background-downloader. The monorepo's package hoisting and workspace resolution interferes with the native module's ability to properly register and resolve its native components. This causes runtime failures when attempting to download AI models in the background, which is a critical feature for this LLM chat example.

  2. Dependency Isolation: Bare React Native projects have distinct native dependency chains (iOS Pods, Android Gradle) that conflict with the monorepo's package management. The monorepo uses workspaces and hoisting strategies optimized for JavaScript/TypeScript packages, which can interfere with native module resolution.
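
For illustration only, keeping the example out of the workspace amounts to not listing apps/bare_rn in the root package.json workspace globs. This is a minimal sketch; the actual globs in this repository are an assumption, with app names taken from the commit messages below:

```json
{
  "private": true,
  "workspaces": [
    "packages/*",
    "apps/computer-vision",
    "apps/llm",
    "apps/speech-to-text",
    "apps/text-embeddings"
  ]
}
```

Because apps/bare_rn is deliberately absent from the globs, yarn installs its dependencies into its own node_modules instead of hoisting them to the repository root, which sidesteps the native module resolution issues described above.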

rizalibnu and others added 20 commits January 23, 2026 09:07
Add modular resource fetcher adapters to support both Expo and bare React Native environments.

## New Packages

### @rn-executorch/expo-adapter
- Expo-based resource fetcher using expo-file-system
- Supports asset bundles, local files, and remote downloads
- Download management with pause/resume/cancel capabilities

### @rn-executorch/bare-adapter
- Bare React Native resource fetcher using RNFS and background downloader
- Supports all platform-specific file operations
- Background download support with proper lifecycle management

## Core Changes

- Refactor ResourceFetcher to use adapter pattern
- Add initExecutorch() and cleanupExecutorch() for adapter management
- Export adapter interfaces and utilities
- Update LLM controller to support new resource fetching

## App Updates

- Update computer-vision, llm, speech-to-text, text-embeddings apps
- Add adapter initialization to each app
- Update dependencies to use workspace packages
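
For orientation, the adapter wiring described above might look roughly like this in app code. This is a sketch only: `initExecutorch`, `cleanupExecutorch`, and the `@rn-executorch/bare-adapter` package are introduced by this PR, but `createBareResourceFetcher` and the exact signatures are assumptions, not the PR's actual API:

```ts
// Sketch of adapter registration in a bare React Native app.
// Names beyond initExecutorch/cleanupExecutorch are assumptions.
import { initExecutorch, cleanupExecutorch } from 'react-native-executorch';
import { createBareResourceFetcher } from '@rn-executorch/bare-adapter'; // hypothetical factory

export function setupExecutorch(): void {
  // Register a resource fetcher backed by RNFS + the background
  // downloader so model files can be resolved outside of Expo.
  initExecutorch({ resourceFetcher: createBareResourceFetcher() });
}

export function teardownExecutorch(): void {
  // Release adapter resources (e.g. cancel in-flight downloads).
  cleanupExecutorch();
}
```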
Add a complete bare React Native example app demonstrating LLM integration with react-native-executorch.

## App: llm_bare

### Features
- Simple chat UI for LLM interactions
- Model loading with progress indicator
- Real-time streaming responses
- Send/stop generation controls
- Auto-scrolling message history

### Stack
- **Framework**: React Native 0.81.5 (bare/CLI)
- **LLM**: Uses LLAMA3_2_1B_SPINQUANT model
- **Adapter**: @rn-executorch/bare-adapter
- **Dependencies**: Minimal deps, only essential packages

### Platform Configuration

#### iOS
- Bridging header for RNBackgroundDownloader
- Background URL session handling in AppDelegate
- Background modes (fetch, processing)
- Xcode project configuration

#### Android
- Required permissions for background downloads
- Foreground service configuration
- Network state access
- Proper manifest configuration

### Infrastructure
- Babel configuration for export namespace transform

This serves as a reference implementation for using react-native-executorch in bare React Native environments (non-Expo).
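
A condensed sketch of what the llm_bare chat screen might look like. The `useLLM` hook and the `LLAMA3_2_1B_SPINQUANT` constant come from react-native-executorch, but the exact option names and return shape used here are assumptions, not this PR's code:

```tsx
// Minimal single-turn chat screen sketch; option names and the
// hook's return shape are assumptions, not this PR's exact code.
import React, { useState } from 'react';
import { Button, ScrollView, Text, TextInput, View } from 'react-native';
import { useLLM, LLAMA3_2_1B_SPINQUANT } from 'react-native-executorch';

export default function ChatScreen() {
  const [prompt, setPrompt] = useState('');
  const llm = useLLM({ modelSource: LLAMA3_2_1B_SPINQUANT });

  if (!llm.isReady) {
    // Shown while the model is downloading/loading.
    return <Text>Loading model…</Text>;
  }

  return (
    <View style={{ flex: 1, padding: 12 }}>
      {/* llm.response accumulates streamed tokens during generation */}
      <ScrollView style={{ flex: 1 }}>
        <Text>{llm.response}</Text>
      </ScrollView>
      <TextInput value={prompt} onChangeText={setPrompt} placeholder="Ask something…" />
      <Button
        title={llm.isGenerating ? 'Stop' : 'Send'}
        onPress={llm.isGenerating ? llm.interrupt : () => llm.generate(prompt)}
      />
    </View>
  );
}
```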
rizalibnu force-pushed the examples/bare-react-native branch from c27a355 to bb65951 on January 28, 2026 03:10

rizalibnu (Author) commented Jan 28, 2026

This PR is ready for review, but I’d prefer to proceed after #759 is merged, since this PR includes changes from #759, which makes the diff quite large.

Alternatively, you can review commit bb65951, which contains the changes made after initializing the bare React Native app with `@react-native-community/cli@latest`.

rizalibnu marked this pull request as ready for review on January 28, 2026 03:15
msluszniak added the feature label (PRs that implement a new feature) on Jan 28, 2026