Hurdles I got over using the TensorFlow.js DeepLab v3 segmentation model

Stephen Cow Chau
3 min read · Jan 13, 2021


I hope this helps someone who is as lost as I was.

The version I am working with is the git master branch of tfjs-models (commit e80d693bb43cb0ef234b808021c4def434ea816a).

The main problem

When I try to run model.segment(), it complains that the model expects int32 as input, not float32.
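For context, the failing call looks roughly like the sketch below. The model base, the quantization setting and the image element are illustrative assumptions, not the exact setup from my project.

import * as deeplab from '@tensorflow-models/deeplab';

// Minimal sketch of the failing call (model base, quantizationBytes and the
// image element id are assumptions, for illustration only).
async function runSegmentation() {
  const model = await deeplab.load({ base: 'pascal', quantizationBytes: 2 });
  const image = document.getElementById('input-image') as HTMLImageElement;
  // This is where the complaint surfaces: the graph model expects an int32
  // input tensor but receives float32.
  const segmentation = await model.segment(image);
  console.log(segmentation.legend);
}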

Reason

Debugging is tough because tensorflow-models and tfjs-core are shipped as compiled distribution libraries, so no source-level debugging is available.

Digging into the source code, I traced segment() back to predict() in the same class, in https://github.com/tensorflow/tfjs-models/blob/master/deeplab/src/index.ts.

The interesting call is the call to toInputTensor() before the input is passed to the model.

public predict(input: DeepLabInput): tf.Tensor2D {
  return tf.tidy(() => {
    const data = toInputTensor(input);
    return tf.squeeze(this.model.execute(data) as tf.Tensor);
  });
}

And this function is where the input data gets converted from int32 to float32, which is caused by tf.image.resizeBilinear():

export function toInputTensor(input: DeepLabInput) {
  return tf.tidy(() => {
    const image =
        input instanceof tf.Tensor ? input : tf.browser.fromPixels(input);
    const [height, width] = image.shape;
    const resizeRatio = config['CROP_SIZE'] / Math.max(width, height);
    const targetHeight = Math.round(height * resizeRatio);
    const targetWidth = Math.round(width * resizeRatio);
    // resizeBilinear always returns a float32 tensor, even for int32 input
    return tf.image.resizeBilinear(image, [targetHeight, targetWidth])
        .expandDims(0);
  });
}
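You can verify this dtype behaviour in isolation with a small, illustrative check like the one below (the shapes are arbitrary): resizeBilinear() returns float32 even when its input tensor is int32, which is the dtype tf.browser.fromPixels() produces.

import * as tf from '@tensorflow/tfjs';

// Illustrative check only: resizeBilinear converts the dtype to float32.
const intImage = tf.zeros([65, 65, 3], 'int32') as tf.Tensor3D;
console.log(intImage.dtype);  // 'int32'
const resized = tf.image.resizeBilinear(intImage, [33, 33]);
console.log(resized.dtype);   // 'float32' (this is what the model rejects)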

This finally triggers the assertion error in tfjs-converter/src/executor/graph_executor, in checkInputShapeAndType (this is the name you might see in the call stack when you debug in the developer tools).

My Solution

Knowing the root cause, I would like to change that one problematic line, adding .toInt() to the data tensor that toInputTensor() converted to float32.

public predict(input: DeepLabInput): tf.Tensor2D {
  return tf.tidy(() => {
    const data = toInputTensor(input);
    // cast the float32 tensor back to int32 before executing the graph
    return tf.squeeze(this.model.execute(data.toInt()) as tf.Tensor);
  });
}
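As a side note, .toInt() is just the chainable shorthand for casting the tensor's dtype to int32; if you prefer an explicit cast, an equivalent version of the patch would be:

public predict(input: DeepLabInput): tf.Tensor2D {
  return tf.tidy(() => {
    const data = toInputTensor(input);
    // tf.cast(data, 'int32') does the same thing as data.toInt()
    return tf.squeeze(this.model.execute(tf.cast(data, 'int32')) as tf.Tensor);
  });
}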

Step 1 — fork the tfjs-models git repo from tensorflow and update the code

Step 2 — add it to your package.json (or use npm/yarn)

Update package.json to include the line below in the dependencies section. Note that I use a scope "@" in front of the referenced package name tfmodels; this is the only way I could make it work with the package's production build.

"@tfmodels": "git+https://github.com/<your_user_name>/tfjs-models.git",

Run yarn to install.
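For reference, the relevant part of the dependencies section might look roughly like this (the tfjs versions here are simply the ones mentioned at the end of this post, and the repo URL is still a placeholder):

"dependencies": {
  "@tensorflow/tfjs": "2.8.3",
  "@tensorflow/tfjs-converter": "2.8.3",
  "@tfmodels": "git+https://github.com/<your_user_name>/tfjs-models.git"
}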

Step 3 — Build the deeplab submodule

As the code is written in TypeScript, the package added to node_modules cannot be imported directly (if you import it, you would get a "module cannot be resolved" error).

You would need to change directory to the deeplab folder (/path/to/project/node_modules/tensorflow-models/deeplab), then run:

yarn run build

Checking the scripts section of package.json in the deeplab directory, the build command runs "rimraf dist && tsc", so it requires rimraf to be installed (add it globally using npm or yarn):

yarn global add rimraf

Install TypeScript to execute the tsc command:

sudo apt-get install node-typescript

You might see some errors as I did, but magically it still runs OK.

Step 4 — Change the import

Instead of:

import * as deeplab from '@tensorflow-models/deeplab';

change it to:

import * as deeplab from '@tfmodels/deeplab';
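With the patched package in place, the earlier sketch only needs its import path updated; everything else stays the same (same illustrative assumptions as before):

import * as deeplab from '@tfmodels/deeplab';

// Same illustrative usage as before, now resolving to the forked, patched build.
async function runSegmentation() {
  const model = await deeplab.load({ base: 'pascal', quantizationBytes: 2 });
  const image = document.getElementById('input-image') as HTMLImageElement;
  const segmentation = await model.segment(image); // no more int32/float32 complaint
  console.log(segmentation.legend);
}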

Result — it runs~

Should you have any problems or better suggestions, feel free to comment.

Other experiment

DeepLab is a TensorFlow.js v1 model. I have tried BOTH tfjs 1.3.1 and 2.8.3, and they both work OK. In the package.json dependencies section:

"@tensorflow/tfjs": "1.3.1",
"@tensorflow/tfjs-converter": "1.3.1",

OR

"@tensorflow/tfjs": "2.8.3",
"@tensorflow/tfjs-converter": "2.8.3",

