Building the TensorFlow Android example app on Mac OS

The past year has been a really interesting time for AI. There have been a number of breakthroughs, with AI techniques leading to improved image recognition, better sentence understanding and conversational assistants finally finding their way into commercial products.

One interesting development is Google's release of TensorFlow – their library for building AI systems. TensorFlow contains Python and C++ components that make it easy to implement AI techniques like neural nets and run them across a wide range of hardware.

The TensorFlow codebase includes a fun Android project which runs the Inception5h model, using it to recognise whatever the phone's camera sees.

Sample app detecting bananas

If you want to try the app out, I've hosted an APK here.

The Inception5h model is trained on the ImageNet data and can recognise a list of 1000 different objects. Google also provide versions of the Inception5h model which are suitable for training with your own image data – so, with a very fast computer, a good training data set and a lot of patience, you could train it to identify whatever you like: insects, different types of sneakers, Pokémon cards, etc.

Building the TensorFlow Android example app on Mac OS

Unfortunately, building the example Android app is not a straightforward process. TensorFlow uses a build system called Bazel and has a number of other dependencies that the typical Android developer does not have installed. To build the TensorFlow Android example app, you need to build the complete TensorFlow system from source – it's not available as a library you can just drop into an Android project. The app itself is also built using Bazel and not the standard Android build tools.

These instructions assume that you already have a working Android development environment set up.

These instructions are valid as of June 2016.

Install Homebrew on Mac OS X

If you don't have Homebrew installed, the first step is to install it:

$ /usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
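
You can sanity check the install afterwards; brew doctor will warn about anything obviously broken in your setup:

$ brew doctor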

Use brew to install Bazel and SWIG

Once you have Homebrew, you can use it to install the Bazel build system and SWIG, a tool used for generating language wrappers:

$ brew install bazel swig
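
To confirm both tools ended up on your PATH, you can ask them for their versions:

$ bazel version
$ swig -version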

Install Python dependencies

A number of Python dependencies are also needed. These can be installed using the easy_install tool:

$ sudo easy_install -U six
$ sudo easy_install -U numpy
$ sudo easy_install wheel
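
A quick way to check the packages installed correctly is to try importing them:

$ python -c "import six, numpy, wheel; print(numpy.__version__)"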

Clone the TensorFlow git repo

Now the moment you have all been waiting for! It's time to get TensorFlow:

$ git clone https://github.com/tensorflow/tensorflow
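
The rest of the commands in this post are run from the root of the newly cloned repo, so change into it now:

$ cd tensorflow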

Configure TensorFlow Build

The first step is to configure TensorFlow by running ./configure in the TensorFlow root directory:

$ ./configure

I just said no to all the questions! They mainly relate to using a GPU to train models – something that we don’t need to do if we want to use a pre-existing model.

Once configure is complete, you need to edit the WORKSPACE file in the TensorFlow root directory to point it at your Android SDK and NDK:

# Uncomment and update the paths in these entries to build the Android demo.
android_sdk_repository(
    name = "androidsdk",
    api_level = 23,
    build_tools_version = "23.0.1",
    # Replace with path to Android SDK on your system
    path = "/Users/luke/android-sdk/",
)

android_ndk_repository(
    name = "androidndk",
    path = "/Users/luke/android-ndk/android-ndk-r10e/",
    api_level = 21,
)

One thing to note is that I am using NDK r10e. This is NOT the latest version of the NDK. There is currently an open bug in TensorFlow which causes the build to fail with the message:
no such package '@androidndk//': Could not read RELEASE.TXT in Android NDK
It seems that the TensorFlow build system is looking for RELEASE.TXT to detect the Android NDK – which is no longer present in newer versions of the NDK.
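
A quick way to check whether the NDK you point the WORKSPACE file at will work is to look for that file (the path here is mine; substitute your own):

$ ls /Users/luke/android-ndk/android-ndk-r10e/RELEASE.TXT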

Download the Inception5h model

Everything should now be set up to build and run TensorFlow. However, for the Android example app we also need the Inception5h model, which is not checked into the TensorFlow repo:

$ wget https://storage.googleapis.com/download.tensorflow.org/models/inception5h.zip -O /tmp/inception5h.zip
$ unzip /tmp/inception5h.zip -d tensorflow/examples/android/assets/
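
After unzipping, the extracted model and label files should be sitting in the example app's assets directory:

$ ls tensorflow/examples/android/assets/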

Build the example

We are finally ready to build the Android example app. From the root TensorFlow directory run:

$ bazel build //tensorflow/examples/android:tensorflow_demo

This will build TensorFlow and the Android example app which uses it. If the build completes successfully, the TensorFlow directory will contain a bazel-bin/tensorflow/examples/android directory which, amongst other things, contains an APK file suitable for installing on your device.
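
You can install the APK straight onto a connected device with adb (the file name should match the Bazel target, but check the directory listing if it differs):

$ adb install -r bazel-bin/tensorflow/examples/android/tensorflow_demo.apk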

Recognise things!

The app is fun to play with, but you might find some limitations in its abilities:

The sample app recognising my face as a bowtie

The list of things it can recognise is in tensorflow/examples/android/assets/imagenet_comp_graph_label_strings.txt. You will notice that there are a lot of things you're not likely to find in your house or office – hundreds of different breeds of dog, aircraft carriers, the space shuttle and airships. It also can't recognise human faces, so if you point it at a person it's most likely to detect a 'bowtie'. Apparently our eyes and nose form roughly the same shape. I suggest pointing it at coffee cups, various bits of fruit and wall clocks – all of which it recognises well.