Hand pose detection using TensorFlow.js & React Native

Indranil Majumder
2 min read · Apr 7, 2021


Hand pose detection, like thumbs up (👍) or thumbs down (👎), can be very crucial in man-machine interfacing and (semi-)automation.

For example, in hazardous areas or areas with a lot of network interference, a remote/drone camera with hand pose detection, integrated with backend operational software, can help coordinate work better, thereby enhancing safety at the workplace.

With TensorFlow’s handpose model we can actually do this!

Initially I started developing with an Expo Snack, as below:

But making the TensorFlow rn-webgl backend work in Expo Snack kept running into various errors, so I had to switch to the cpu backend. Even then it was slow and unreliable.
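For reference, switching the tfjs backend looks roughly like this. This is a minimal sketch, not the Snack's exact code; the fallback logic and logging are my own additions:

```javascript
import * as tf from '@tensorflow/tfjs';
import '@tensorflow/tfjs-react-native'; // registers the rn-webgl backend

// Sketch: try the rn-webgl backend first, fall back to cpu if it fails.
async function initTf() {
  await tf.ready();
  const ok = await tf.setBackend('rn-webgl').catch(() => false);
  if (!ok) {
    await tf.setBackend('cpu'); // slow, but works everywhere
  }
  console.log('Active tf backend:', tf.getBackend());
}
```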

Therefore I started with a fresh new React Native project:

Step 1 > Initialize via React Native CLI Quickstart

npx react-native init tensorflowjshandpose --npm

Step 2 > In the new project, add the required dependencies:

Step 2a > Pay attention to and follow the installation instructions for react-native-unimodules

Step 2b > Add other Expo dependencies like expo-camera
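The exact dependency list lives in the project’s package.json, but installing the tfjs and Expo packages looks roughly like this (package names taken from the tfjs-react-native docs; pin versions to whatever your tfjs-react-native release expects):

npm install react-native-unimodules expo-camera expo-gl expo-gl-cpp

npm install @tensorflow/tfjs @tensorflow/tfjs-react-native @react-native-community/async-storage react-native-fs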

Step 2c > expo-camera uses expo-gl-cpp, which needs the Android NDK. So install that too, as described here:

https://developer.android.com/studio/projects/install-ndk

Note: Make sure to update the ndkVersion in android/build.gradle to the version installed above.

Step 3 > Add the actual pose detection code and test it on Android:
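The full detection code isn’t reproduced in this post, but a minimal sketch of the approach, wrapping expo-camera with cameraWithTensors from @tensorflow/tfjs-react-native and feeding frames to the @tensorflow-models/handpose model, looks roughly like this. The component structure and prop values below are illustrative, not the project’s exact code:

```javascript
import React, {useEffect, useRef} from 'react';
import * as tf from '@tensorflow/tfjs';
import * as handpose from '@tensorflow-models/handpose';
import {Camera} from 'expo-camera';
import {cameraWithTensors} from '@tensorflow/tfjs-react-native';

// Camera component that yields frames as tensors
const TensorCamera = cameraWithTensors(Camera);

export default function HandPoseScreen() {
  const model = useRef(null);

  useEffect(() => {
    (async () => {
      await Camera.requestPermissionsAsync(); // ask for camera access
      await tf.ready();                       // wait for the tfjs backend
      model.current = await handpose.load();  // load the handpose model
    })();
  }, []);

  // Pull frames off the tensor camera stream and log detected landmarks
  const handleCameraStream = (images) => {
    const loop = async () => {
      const imageTensor = images.next().value;
      if (model.current && imageTensor) {
        const predictions = await model.current.estimateHands(imageTensor);
        if (predictions.length > 0) {
          console.log(predictions[0].landmarks); // 21 [x, y, z] keypoints
        }
      }
      if (imageTensor) {
        tf.dispose(imageTensor); // free the frame tensor
      }
      requestAnimationFrame(loop);
    };
    loop();
  };

  return (
    <TensorCamera
      style={{flex: 1}}
      type={Camera.Constants.Type.back}
      // Texture/resize dimensions below are placeholder values
      cameraTextureHeight={1200}
      cameraTextureWidth={1600}
      resizeHeight={200}
      resizeWidth={152}
      resizeDepth={3}
      autorender={true}
      onReady={handleCameraStream}
    />
  );
}
```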

Sample console logs for hand pose detection
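Each prediction logged above follows the shape documented by @tensorflow-models/handpose (values elided here):

```javascript
// Rough shape of one entry returned by model.estimateHands():
// {
//   handInViewConfidence: /* 0..1 score */,
//   boundingBox: { topLeft: [x, y], bottomRight: [x, y] },
//   landmarks: [ /* 21 [x, y, z] keypoints */ ],
//   annotations: { thumb: [...], indexFinger: [...], /* ... */ }
// }
```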

Check out the code here on GitHub!
