How to run Tensorflow-Lite inference in (Android Studio) NDK (C/C++ API)?
- I built a Tensorflow (TF) model from Keras and converted it to Tensorflow-Lite (TFL)
- I built an Android app in Android Studio and used the Java API to run the TFL model
- In the Java app, I used the TFL Support Library (see here) and the TensorFlow Lite AAR from JCenter, by including implementation 'org.tensorflow:tensorflow-lite:+' under my build.gradle dependencies
Inference times are not so great, so now I want to use TFL in Android's NDK.
So I built an exact copy of the Java app in Android Studio's NDK, and now I'm trying to include the TFL libs in the project. I followed TensorFlow-Lite's Android guide and built the TFL library locally (and got an AAR file), and included the library in my NDK project in Android Studio.
Now I'm trying to use the TFL library in my C++ file, by trying to #include it in code, but I get an error message: cannot find tensorflow (or any other name I try, matching whatever name I give it in my CMakeLists.txt file).
App build.gradle:
apply plugin: 'com.android.application'

android {
    compileSdkVersion 29
    buildToolsVersion "29.0.3"

    defaultConfig {
        applicationId "com.ndk.tflite"
        minSdkVersion 28
        targetSdkVersion 29
        versionCode 1
        versionName "1.0"
        testInstrumentationRunner "androidx.test.runner.AndroidJUnitRunner"
        externalNativeBuild {
            cmake {
                cppFlags ""
            }
        }
        ndk {
            abiFilters 'arm64-v8a'
        }
    }

    buildTypes {
        release {
            minifyEnabled false
            proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
        }
    }

    // tf lite
    aaptOptions {
        noCompress "tflite"
    }

    externalNativeBuild {
        cmake {
            path "src/main/cpp/CMakeLists.txt"
            version "3.10.2"
        }
    }
}

dependencies {
    implementation fileTree(dir: 'libs', include: ['*.jar'])
    implementation 'androidx.appcompat:appcompat:1.1.0'
    implementation 'androidx.constraintlayout:constraintlayout:1.1.3'
    testImplementation 'junit:junit:4.12'
    androidTestImplementation 'androidx.test.ext:junit:1.1.1'
    androidTestImplementation 'androidx.test.espresso:espresso-core:3.2.0'
    // tflite build
    compile(name:'tensorflow-lite', ext:'aar')
}
Project build.gradle:
buildscript {
    repositories {
        google()
        jcenter()
    }
    dependencies {
        classpath 'com.android.tools.build:gradle:3.6.2'
    }
}

allprojects {
    repositories {
        google()
        jcenter()
        // native tflite
        flatDir {
            dirs 'libs'
        }
    }
}

task clean(type: Delete) {
    delete rootProject.buildDir
}
CMakeLists.txt:
cmake_minimum_required(VERSION 3.4.1)

add_library( # Sets the name of the library.
        native-lib
        # Sets the library as a shared library.
        SHARED
        # Provides a relative path to your source file(s).
        native-lib.cpp )

add_library( # Sets the name of the library.
        tensorflow-lite
        # Sets the library as a shared library.
        SHARED
        # Provides a relative path to your source file(s).
        native-lib.cpp )

find_library( # Sets the name of the path variable.
        log-lib
        # Specifies the name of the NDK library that
        # you want CMake to locate.
        log )

target_link_libraries( # Specifies the target library.
        native-lib tensorflow-lite
        # Links the target library to the log library
        # included in the NDK.
        ${log-lib} )
native-lib.cpp:
#include <jni.h>
#include <string>

#include "tensorflow"

extern "C" JNIEXPORT jstring JNICALL
Java_com_xvu_f32c_1jni_MainActivity_stringFromJNI(
        JNIEnv* env,
        jobject /* this */) {
    std::string hello = "Hello from C++";
    return env->NewStringUTF(hello.c_str());
}

class FlatBufferModel {
    // Build a model based on a file. Return a nullptr in case of failure.
    static std::unique_ptr<FlatBufferModel> BuildFromFile(
            const char* filename,
            ErrorReporter* error_reporter);

    // Build a model based on a pre-loaded flatbuffer. The caller retains
    // ownership of the buffer and should keep it alive until the returned object
    // is destroyed. Return a nullptr in case of failure.
    static std::unique_ptr<FlatBufferModel> BuildFromBuffer(
            const char* buffer,
            size_t buffer_size,
            ErrorReporter* error_reporter);
};
Progress
I also tried to follow these:
- Problems with using the tensorflow lite C++ API in an Android Studio project
- Android C++ NDK: some shared libraries refuse to link at runtime
- How to build TensorFlow Lite as a static library and link to it from a separate (CMake) project?
- How to set input of Tensorflow Lite C++
- How can I build only TensorFlow lite and not all TensorFlow from source?
but in my case I used Bazel to build the TFL libs.
Trying to build the classification demo (label_image), I managed to build it and adb push it to my device, but when trying to run it I got the following error:
ERROR: Could not open './mobilenet_quant_v1_224.tflite'.
Failed to mmap model ./mobilenet_quant_v1_224.tflite
- I followed zimenglyu's post: trying to set android_sdk_repository / android_ndk_repository in WORKSPACE got me an error: WORKSPACE:149:1: Cannot redefine repository after any load statement in the WORKSPACE file (for repository 'androidsdk'), and locating these statements at different places resulted in the same error.
- I deleted these changes to WORKSPACE and continued with zimenglyu's post: I compiled libtensorflowLite.so, and edited CMakeLists.txt so that the libtensorflowLite.so file was referenced, but left the FlatBuffer part out. The Android project compiled successfully, but there was no evident change; I still can't include any TFLite libraries.
Trying to compile TFL, I added a cc_binary to tensorflow/tensorflow/lite/BUILD (following the label_image example):
cc_binary(
    name = "native-lib",
    srcs = [
        "native-lib.cpp",
    ],
    linkopts = tflite_experimental_runtime_linkopts() + select({
        "//tensorflow:android": [
            "-pie",
            "-lm",
        ],
        "//conditions:default": [],
    }),
    deps = [
        "//tensorflow/lite/c:common",
        "//tensorflow/lite:framework",
        "//tensorflow/lite:string_util",
        "//tensorflow/lite/delegates/nnapi:nnapi_delegate",
        "//tensorflow/lite/kernels:builtin_ops",
        "//tensorflow/lite/profiling:profiler",
        "//tensorflow/lite/tools/evaluation:utils",
    ] + select({
        "//tensorflow:android": [
            "//tensorflow/lite/delegates/gpu:delegate",
        ],
        "//tensorflow:android_arm64": [
            "//tensorflow/lite/delegates/gpu:delegate",
        ],
        "//conditions:default": [],
    }),
)
and trying to build it for x86_64 and arm64-v8a I get an error: cc_toolchain_suite rule @local_config_cc//:toolchain: cc_toolchain_suite '@local_config_cc//:toolchain' does not contain a toolchain for cpu 'x86_64'.
Checking external/local_config_cc/BUILD (which produced the error) at line 47:
cc_toolchain_suite(
    name = "toolchain",
    toolchains = {
        "k8|compiler": ":cc-compiler-k8",
        "k8": ":cc-compiler-k8",
        "armeabi-v7a|compiler": ":cc-compiler-armeabi-v7a",
        "armeabi-v7a": ":cc-compiler-armeabi-v7a",
    },
)
and these are the only two cc_toolchains found. Searching the repository for "cc-compiler-" I only found "aarch64", which I assumed is for 64-bit ARM, but nothing with "x86_64". There is "x64_windows", though - and I'm on Linux.
Trying to build with aarch64 like so:
bazel build -c opt --fat_apk_cpu=aarch64 --cpu=aarch64 --host_crosstool_top=@bazel_tools//tools/cpp:toolchain //tensorflow/lite/java:tensorflow-lite
resulted in the error:
ERROR: /.../external/local_config_cc/BUILD:47:1: in cc_toolchain_suite rule @local_config_cc//:toolchain: cc_toolchain_suite '@local_config_cc//:toolchain' does not contain a toolchain for cpu 'aarch64'
Using the libraries in Android Studio:
I was able to build the library for the x86_64 architecture by changing the soname in the build config and using full paths in CMakeLists.txt. This resulted in a .so shared library. Also - I was able to build the library for arm64-v8a using the TFLite Docker container, by adjusting the aarch64_makefile.inc file, but I did not change any build options, and let build_aarch64_lib.sh build whatever it builds. This resulted in a .a static library.
So now I have two TFLite libs, but I'm still unable to use them (I can't #include "..." anything, for example).
When trying to build the project, using only x86_64 works fine, but trying to include the arm64-v8a library results in a ninja error: '.../libtensorflow-lite.a', needed by '.../app/build/intermediates/cmake/debug/obj/armeabi-v7a/libnative-lib.so', missing and no known rule to make it.
- I created a Native C++ project in Android Studio
- I took the basic C/C++ source files and headers from Tensorflow's lite directory, and created a similar structure in app/src/main/cpp, in which I include the (A) tensorflow, (B) absl and (C) flatbuffers files
- I changed the #include "tensorflow/... lines in all of tensorflow's header files to relative paths so the compiler can find them.
- In the app's build.gradle I added a no-compression line for the .tflite file: aaptOptions { noCompress "tflite" }
- I added an assets directory to the app
- In native-lib.cpp I added some example code from the TFLite website
- Tried to build the project with the source files included (the build target is arm64-v8a).
I get an error:
/path/to/Android/Sdk/ndk/20.0.5594570/toolchains/llvm/prebuilt/linux-x86_64/sysroot/usr/include/c++/v1/memory:2339: error: undefined reference to 'tflite::impl::Interpreter::~Interpreter()'
in <memory>, line 2339 is the "delete __ptr;" line:
_LIBCPP_INLINE_VISIBILITY void operator()(_Tp* __ptr) const _NOEXCEPT {
    static_assert(sizeof(_Tp) > 0,
                  "default_delete can not delete incomplete type");
    static_assert(!is_void<_Tp>::value,
                  "default_delete can not delete incomplete type");
    delete __ptr;
}
Question
How can I include the TFLite libraries in Android Studio, so I can run a TFL inference from the NDK?
Alternatively - how can I use gradle (currently with cmake) to build and compile the source files?
I use Native TFL with C-API in the following way:
- Download the latest version of the TensorFlow Lite AAR file
- Change the file type of the downloaded .aar file to .zip and unzip it to get the shared library (.so file)
- Download all header files from the c directory in the TFL repository
- Create an Android C++ app in Android Studio
- Create a jni directory (New -> Folder -> JNI Folder) in app/src/main and create architecture sub-directories in it (arm64-v8a or x86_64, for example)
- Put all header files in the jni directory (next to the architecture directories), and put the shared library inside the architecture directory/ies (a sketch of the resulting layout follows this list)
- Open the CMakeLists.txt file and include an add_library stanza for the TFL library, the path to the shared library in a set_target_properties stanza, and the headers in an include_directories stanza (see below, in the NOTES section)
- Sync Gradle
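With the shared library renamed to libtfl.so (the file name the CMakeLists.txt example in the NOTES section expects; the name itself is arbitrary), the resulting layout looks roughly like this:

app/src/main/jni/
    c_api.h
    common.h
    builtin_ops.h
    ...                 (remaining headers from the c directory)
    arm64-v8a/
        libtfl.so
    x86_64/
        libtfl.so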
USAGE:
In native-lib.cpp include the headers, for example:
#include "../jni/c_api.h"
#include "../jni/common.h"
#include "../jni/builtin_ops.h"
TFL functions can be called directly, for example:
TfLiteModel* model = TfLiteModelCreateFromFile(full_path);
// The second argument is an optional TfLiteInterpreterOptions*; it may be null.
TfLiteInterpreter* interpreter = TfLiteInterpreterCreate(model, nullptr);
TfLiteInterpreterAllocateTensors(interpreter);

TfLiteTensor* input_tensor = TfLiteInterpreterGetInputTensor(interpreter, 0);
const TfLiteTensor* output_tensor = TfLiteInterpreterGetOutputTensor(interpreter, 0);

TfLiteStatus from_status = TfLiteTensorCopyFromBuffer(
    input_tensor,
    input_data,
    TfLiteTensorByteSize(input_tensor));

TfLiteStatus interpreter_invoke_status = TfLiteInterpreterInvoke(interpreter);

TfLiteStatus to_status = TfLiteTensorCopyToBuffer(
    output_tensor,
    output_data,
    TfLiteTensorByteSize(output_tensor));
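For completeness, here is a minimal end-to-end sketch with the status checks and cleanup calls the snippet above leaves out. It assumes a float model with a single input and a single output tensor; the RunInference wrapper, the thread count, and the caller-sized buffers are illustrative, not part of the original snippet:

#include <vector>

#include "../jni/c_api.h"

// Runs one inference on a float model with a single input and a single
// output tensor. The caller must size `output` to match the model's
// output tensor.
bool RunInference(const char* model_path,
                  const std::vector<float>& input,
                  std::vector<float>* output) {
    TfLiteModel* model = TfLiteModelCreateFromFile(model_path);
    if (model == nullptr) return false;

    TfLiteInterpreterOptions* options = TfLiteInterpreterOptionsCreate();
    TfLiteInterpreterOptionsSetNumThreads(options, 2);  // illustrative value

    TfLiteInterpreter* interpreter = TfLiteInterpreterCreate(model, options);
    bool ok = interpreter != nullptr &&
              TfLiteInterpreterAllocateTensors(interpreter) == kTfLiteOk;

    if (ok) {
        TfLiteTensor* input_tensor =
            TfLiteInterpreterGetInputTensor(interpreter, 0);
        ok = TfLiteTensorCopyFromBuffer(
                 input_tensor, input.data(),
                 input.size() * sizeof(float)) == kTfLiteOk;
    }
    if (ok) {
        ok = TfLiteInterpreterInvoke(interpreter) == kTfLiteOk;
    }
    if (ok) {
        const TfLiteTensor* output_tensor =
            TfLiteInterpreterGetOutputTensor(interpreter, 0);
        ok = TfLiteTensorCopyToBuffer(
                 output_tensor, output->data(),
                 output->size() * sizeof(float)) == kTfLiteOk;
    }

    // The C API requires explicit cleanup of everything it created.
    if (interpreter != nullptr) TfLiteInterpreterDelete(interpreter);
    TfLiteInterpreterOptionsDelete(options);
    TfLiteModelDelete(model);
    return ok;
}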
NOTES:
- In this setup SDK version 29 was used
- The cmake environment also included cppFlags "-frtti -fexceptions"
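In the question's app build.gradle layout, those flags would sit inside the defaultConfig block, for example:

externalNativeBuild {
    cmake {
        cppFlags "-frtti -fexceptions"
    }
}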
CMakeLists.txt example:
set(JNI_DIR ${CMAKE_CURRENT_SOURCE_DIR}/../jni)

add_library(tflite-lib SHARED IMPORTED)

set_target_properties(tflite-lib
        PROPERTIES IMPORTED_LOCATION
        ${JNI_DIR}/${ANDROID_ABI}/libtfl.so)

include_directories(${JNI_DIR})

target_link_libraries(
        native-lib
        tflite-lib
        ...)
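Here ${ANDROID_ABI} is set by the NDK's CMake toolchain to the ABI currently being built (arm64-v8a or x86_64 in this setup), so each build variant links against the matching libtfl.so from its architecture sub-directory. And since ${JNI_DIR} is on the include path, #include "c_api.h" would also work in place of the relative ../jni/ paths used above.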