Real Time Object Detection

Steadforce
Feb 19, 2021

Automatic object detection based on Deep Learning has the potential to make a significant contribution in the future to areas such as process monitoring in industrial manufacturing, driver assistance systems or health diagnostics support. However, to enable large-scale industrial use, these methods must run on resource-limited devices. The application described here demonstrates one possible approach that already enables real-time object detection with excellent quality on Android devices.

The advance of Machine Learning and Artificial Intelligence has brought revolutionary methods to computer vision and text analysis. One of the newer trends is Deep Learning, which enables a machine learning model to learn more complex features in data using multiple layers. One term often mentioned in that context is artificial neural networks (ANN). These networks can, among other things, be used for image classification and even to detect multiple objects within an image. Recent improvements make it possible to run deep learning algorithms on mobile devices and to perform real-time object detection. In the following paragraphs we give a brief overview of how to deploy a real-time mobile object detector on an Android phone.

Setting up your environment

We will use Docker to build the Android application, as it provides a convenient way to install all required dependencies without causing potential issues on your host machine. To do so, we modify a Dockerfile provided in the TensorFlow Git repository. The Dockerfile is based on the official TensorFlow Docker image and additionally provides the dependencies and configurations required to build the Java-based Android Package (APK). If you do not want to train a model in the Google Cloud or convert a pre-trained model to the TensorFlow Lite format, you can remove the corresponding tasks from the Dockerfile.
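As a minimal sketch of this step, assuming the Dockerfile ships at tensorflow/lite/tools/tflite-android.Dockerfile (the path and the image tag are illustrative and may differ between TensorFlow versions), building and entering the container looks like this:

# Clone the TensorFlow sources and build the image from the provided Dockerfile
$ git clone https://github.com/tensorflow/tensorflow.git
$ cd tensorflow
$ docker build -t tflite-android-builder -f tensorflow/lite/tools/tflite-android.Dockerfile .
# Start an interactive container with the source tree mounted
$ docker run -it -v $PWD:/tensorflow_src tflite-android-builder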

Detection Model

We can use a MobileNet model in our Android application that has been pre-trained on the COCO dataset. A variety of pre-trained frozen MobileNet models can be obtained from the TensorFlow Git repository. Moreover, different pre-trained model architectures such as SSD MobileNet v1, SSD MobileNet v2 or Faster R-CNN can be downloaded from the TensorFlow detection model zoo. The MobileNet models are low-latency, low-power models designed for the resource constraints of mobile devices. If you train your own model, you must convert the trained TensorFlow frozen graph to the TensorFlow Lite format to use it on a mobile device. TensorFlow provides a converter that transforms a TensorFlow protocol buffer graph (*.pb) into a TensorFlow Lite FlatBuffer file (*.tflite).
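To illustrate the conversion, the following sketch uses the tflite_convert tool shipped with TensorFlow 1.x; the input and output array names are those of SSD graphs exported by the TensorFlow Object Detection API and should be checked against your own model:

# Convert a frozen graph (*.pb) to a TensorFlow Lite FlatBuffer (*.tflite)
$ tflite_convert \
    --graph_def_file=tflite_graph.pb \
    --output_file=detect.tflite \
    --input_shapes=1,300,300,3 \
    --input_arrays=normalized_input_image_tensor \
    --output_arrays='TFLite_Detection_PostProcess','TFLite_Detection_PostProcess:1','TFLite_Detection_PostProcess:2','TFLite_Detection_PostProcess:3' \
    --allow_custom_ops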

Building the APK

To run the model on a mobile Android device we must first incorporate it into an APK. The Dockerfile provides the Android NDK (for C and C++ support) and the SDK required for the build process. Please note that if you wish to use a newer NDK version than 14b, which is currently set in the Dockerfile, you must choose one that is compatible with Bazel. Next, we integrate the model into the TensorFlow Lite Android demo app, which requires build tools with API level ≥ 23. The SDK tools and configurations are likewise provided in the Dockerfile. API level 23 corresponds to Android 6.0 Marshmallow; the Android demo itself, however, runs on devices with API level ≥ 21. If you want to use a newer version, for example to enable the Neural Networks API (API level 27, Android 8.1), you can modify the Dockerfile accordingly, as sketched below. A list of API levels and the corresponding Android versions can be found in the Android developer documentation.
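Such a modification could look like the following excerpt; the variable names and values are illustrative and must be matched against the actual Dockerfile:

# Illustrative Dockerfile excerpt: raise the SDK API level, e.g. for the Neural Networks API
ENV ANDROID_API_LEVEL 27
ENV ANDROID_BUILD_TOOLS_VERSION 28.0.0
ENV ANDROID_NDK_VERSION 14b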

To build the APK with your chosen model, rename the converted *.tflite file to detect.tflite and move it to the tensorflow/contrib/lite/examples/android/app/src/main/assets folder within the running container. The associated labels.txt containing the labels of the object classes must be placed in the same directory. Alternatively, if you use different file names, you must point the Bazel BUILD file to the new model to include it in the APK assets. The BUILD file can be found in tensorflow/contrib/lite/examples/android/. Simply replace the references to detect.tflite and coco_labels_list.txt with your model and label list names. Additionally, the definitions of TF_OD_API_MODEL_FILE and TF_OD_API_LABELS_FILE in tensorflow/contrib/lite/examples/android/app/src/main/java/org/tensorflow/demo/DetectorActivity.java must be updated with the new names.
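Assuming your converted model and label file are in /tmp (hypothetical paths), copying them under the default names, so that no BUILD changes are necessary, looks like this:

$ cp /tmp/my_model.tflite \
    tensorflow/contrib/lite/examples/android/app/src/main/assets/detect.tflite
$ cp /tmp/my_labels.txt \
    tensorflow/contrib/lite/examples/android/app/src/main/assets/coco_labels_list.txt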

In addition, further parameters such as the minimum detection confidence, the output text size, etc. can be customized in this *.java file. Depending on the chosen model, the configuration value TF_OD_API_INPUT_SIZE must be adjusted to the input tensor shape of your model. The prepacked SSD MobileNet model, for example, is configured for the input shape (1, 300, 300, 3), which corresponds to images with 300×300 pixels. The Android demo app scales each camera frame to TF_OD_API_INPUT_SIZE × TF_OD_API_INPUT_SIZE pixels.
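As a hypothetical one-liner, the input size could also be adjusted from the shell instead of an editor; the pattern assumes the constant is declared as TF_OD_API_INPUT_SIZE = 300 in DetectorActivity.java:

$ sed -i 's/TF_OD_API_INPUT_SIZE = 300/TF_OD_API_INPUT_SIZE = 320/' \
    tensorflow/contrib/lite/examples/android/app/src/main/java/org/tensorflow/demo/DetectorActivity.java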

To build the APK we use the build tool Bazel. The build process for the 32-bit and 64-bit ARM architectures can be started by running the following command from the tensorflow directory in the container (the {,64} brace expansion builds both the android_arm and android_arm64 configurations):

$ bazel build -c opt --config=android_arm{,64} --cxxopt='--std=c++11' \
    //tensorflow/lite/examples/android:tflite_demo

To build the APK for a different CPU architecture, such as the x86_64 platform, for example to test the APK with an Android Emulator, we can use:

$ bazel build -c opt --fat_apk_cpu=x86_64 --cxxopt='--std=c++11' \
    //tensorflow/lite/examples/android:tflite_demo

The *.apk file is created within the bazel-bin/tensorflow/lite/examples/android/ directory.

Running the APK

After successfully building the APK it is ready to be installed on an Android mobile phone. To install the *.apk file on your own device you must first enable the developer options in the system settings. Subsequently, the installation on your USB-connected phone can be started using the Android Debug Bridge (adb) provided with the Android platform tools. The adb command for installing the package is

$ adb install tflite_demo.apk
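If you first want to verify that the phone is connected and authorized, and install the APK straight from the Bazel output directory, the following variant works as well (the -r flag replaces a previously installed version):

$ adb devices
$ adb install -r bazel-bin/tensorflow/lite/examples/android/tflite_demo.apk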

Now you can run the app called TFL Detect to detect objects defined in the COCO dataset.

Originally published at https://www.steadforce.com.

