TinyLLM-Android

ChatLLM / TinyLLM on Android

ChatLLM is an Android application built on the multimodal LLM inference engine mllm. It supports offline text and image conversations.



Preview

Here are some previews of the ChatLLM app in action:

[Screenshots: Previews 1-4]

Download APK

Download the Release APK

Supported Models and Functions

Model                          Chat    Image Chat
PhoneLM 1.5B                   ✔️
Qwen1.5 1.8B                   ✔️
SmolLM 1.7B                    ✔️
OpenELM 1.1B (Removed)         ✔️
Phi-3-Vision 3.8B              ✔️      ✔️
Phi-3-Vision Finetuned 3.8B    ✔️      ✔️

The models are hosted on Hugging Face. If a model is not found in the phone's Download storage, it is downloaded automatically when the model is loaded.


How to Use on Android

  1. Install and open ChatLLM.apk, and grant the app permission to manage files.
  2. Select a model in the settings menu.
  3. Use the Image Reader or Chat options.
  4. Wait for the model to be downloaded before starting conversations.

How to Run on Android Studio

Step 1: Download the Library

Download libmllm_lib.a.

Step 2: Place the Library

Put the downloaded libmllm_lib.a file into the following directory:

app/src/main/cpp/libs
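If you prefer the command line, the copy step can be scripted. Below is a minimal sketch; `install_lib` is a hypothetical helper (not part of the repo), and the default destination is the directory above:

```shell
# Copy a prebuilt libmllm_lib.a into the app's JNI libs directory,
# creating the directory first if it does not exist yet.
install_lib() {
  src="$1"
  dest="${2:-app/src/main/cpp/libs}"
  [ -f "$src" ] || { echo "missing: $src"; return 1; }
  mkdir -p "$dest"
  cp "$src" "$dest/"
  echo "installed $(basename "$src") into $dest"
}
```

Example usage, assuming the file was saved to your downloads folder: `install_lib ~/Downloads/libmllm_lib.a`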

Step 3: Build and Run on Android Studio

How to Build Manually

Build JNI Lib

Get the mllm source code:

git clone https://github.com/UbiquitousLearning/mllm
cd mllm
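The CMake configuration below derives the toolchain file path from the ANDROID_NDK environment variable. A small guard (a hypothetical helper, not part of the repo) fails fast if it is unset:

```shell
# Abort early if ANDROID_NDK is not set, since the CMake toolchain
# file path (-DCMAKE_TOOLCHAIN_FILE) is derived from it.
check_ndk() {
  if [ -z "${ANDROID_NDK:-}" ]; then
    echo "ANDROID_NDK is not set; export it before running cmake" >&2
    return 1
  fi
  echo "Using NDK at $ANDROID_NDK"
}
```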

Build mllm_lib:

mkdir build-arm-app
cd build-arm-app

cmake .. \
-DCMAKE_TOOLCHAIN_FILE=$ANDROID_NDK/build/cmake/android.toolchain.cmake \
-DCMAKE_BUILD_TYPE=Release \
-DANDROID_ABI="arm64-v8a" \
-DANDROID_NATIVE_API_LEVEL=android-28  \
-DNATIVE_LIBRARY_OUTPUT=. -DNATIVE_INCLUDE_OUTPUT=. \
-DARM=ON \
-DAPK=ON \
-DQNN=ON \
-DDEBUG=OFF \
-DTEST=OFF \
-DQUANT=OFF \
-DQNN_VALIDATE_NODE=ON \
-DMLLM_BUILD_XNNPACK_BACKEND=OFF

make mllm_lib -j$(nproc)

Copy mllm_lib to ChatBotApp:

cp ./libmllm_lib.a ChatBotApp/app/src/main/cpp/libs/
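Before rebuilding the app, it can be worth confirming that the copied archive is non-empty; `ar t` lists the object files inside a static library. This is a quick sanity check, not part of the official build steps, and `verify_lib` is a hypothetical helper:

```shell
# Report how many members a static archive contains; a missing or
# empty archive indicates the mllm_lib build did not succeed.
verify_lib() {
  lib="$1"
  [ -f "$lib" ] || { echo "missing: $lib"; return 1; }
  count=$(ar t "$lib" | wc -l | tr -d ' ')
  echo "$lib contains $count members"
}
```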

Note
ChatLLM credits the mllm engine and SaltedFish.