ChatLLM is an Android application built on the multimodal LLM inference engine mllm. It supports offline text and image conversations.
Here are some previews of the ChatLLM app in action.

Supported models:
| Model | Chat | Image Chat |
|---|---|---|
| PhoneLM 1.5B | ✔️ | ❌ |
| Qwen1.5 1.8B | ✔️ | ❌ |
| SmolLM 1.7B | ✔️ | ❌ |
| OpenELM 1.1B (Removed) | ✔️ | ❌ |
| Phi-3-Vision 3.8B | ✔️ | ✔️ |
| Phi-3-Vision Finetuned 3.8B | ✔️ | ✔️ |
The models can be found in the Hugging Face repository. A model is downloaded automatically when it is loaded, if it is not already present in the phone's Download storage.
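If you prefer to place a model on the device manually (for example, to skip the in-app download), it can be pushed to the phone's Download storage with adb. This is a sketch only: the filename `model.mllm` and the `/sdcard/Download` path are assumptions, so substitute the actual file from the Hugging Face repository.

```shell
# Assumed model filename; replace with the real file from Hugging Face.
MODEL=model.mllm
# Android's shared Download storage, where the app checks for models.
DEST=/sdcard/Download/$MODEL
# With a device attached over USB debugging, run: adb push "$MODEL" "$DEST"
echo "adb push $MODEL $DEST"
```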
Install ChatLLM.apk and grant it permission to manage files. Then place the libmllm_lib.a file (built in the steps below) into the following directory:
```bash
app/src/main/cpp/libs
```
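To avoid a missing-directory error when copying the library later, the libs folder can be created up front. A minimal sketch, assuming you are in the ChatLLM project root:

```shell
# Create the JNI static-library directory if it does not exist yet.
mkdir -p app/src/main/cpp/libs
# After placing libmllm_lib.a here, you can confirm it is present:
ls app/src/main/cpp/libs
```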
Get the mllm source code:

```bash
git clone https://github.com/UbiquitousLearning/mllm
cd mllm
```
Build mllm_lib:

```bash
mkdir ../build-arm-app
cd ../build-arm-app
cmake ../mllm \
  -DCMAKE_TOOLCHAIN_FILE=$ANDROID_NDK/build/cmake/android.toolchain.cmake \
  -DCMAKE_BUILD_TYPE=Release \
  -DANDROID_ABI="arm64-v8a" \
  -DANDROID_NATIVE_API_LEVEL=android-28 \
  -DNATIVE_LIBRARY_OUTPUT=. -DNATIVE_INCLUDE_OUTPUT=. \
  -DARM=ON \
  -DAPK=ON \
  -DQNN=ON \
  -DDEBUG=OFF \
  -DTEST=OFF \
  -DQUANT=OFF \
  -DQNN_VALIDATE_NODE=ON \
  -DMLLM_BUILD_XNNPACK_BACKEND=OFF
make mllm_lib -j$(nproc)
```
Copy mllm_lib to ChatBotApp:

```bash
cp ./libmllm_lib.a ChatBotApp/app/src/main/cpp/libs/
```
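A slightly more defensive version of the copy step, which creates the target directory and checks that the archive was actually built first (paths taken from the steps above):

```shell
SRC=./libmllm_lib.a
DST=ChatBotApp/app/src/main/cpp/libs
# Make sure the destination exists before copying.
mkdir -p "$DST"
if [ -f "$SRC" ]; then
  cp "$SRC" "$DST/"
  echo "copied $SRC to $DST"
else
  echo "libmllm_lib.a not found; run 'make mllm_lib' first"
fi
```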
**Note**: ChatLLM credits the mllm engine and SaltedFish.