Training a Rasa bot v3 full model on Colab and running it locally

Stephen Cow Chau
4 min read · Dec 1, 2021


Background

I wanted to quickly test a Rasa bot and found that the rasa full setup includes a TensorFlow model, which makes it impractical to complete the training on my 5-year-old laptop that has no CUDA-supported GPU.

So my plan is to train the model on Google Colab's free GPU, download the trained model/files to my desktop, and mount them into a Docker container for a test run.

Setup Rasa bot on Colab

First off, make sure the Colab notebook is running on a GPU instance.
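As a quick sanity check (not part of the original steps), you can query the NVIDIA driver from a notebook cell; the “!” prefix runs a shell command in Colab:

!nvidia-smi
# should list a GPU such as a Tesla K80; if it fails, switch the runtime
# type to GPU via Runtime → Change runtime type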

The first step is to install Rasa through pip. As of today (2021/12/1), Colab's preinstalled pip version is 21.1.3, whose dependency resolution takes ages (if not forever), so it's recommended to upgrade pip first (to 21.3.1).

The older pip downloads every sub-version of a package that matches a dependency constraint (e.g. a dependency of <17.0 and ≥16.7 might trigger downloads of versions 16.9, 16.8 and 16.7), whereas the newer pip seems to resolve dependencies more efficiently and only triggers one download per package.

pip v21.3.1 took about 2 minutes to finish installing the packages, while v21.1.3 took more than 8 minutes (at which point I stopped waiting).
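A minimal sketch of the install cell, with the pip version mentioned above and Rasa pinned to the 3.0.0 release that matches the Docker image used later (adjust the pins as needed):

# upgrade pip first so dependency resolution is fast
!pip install -U pip==21.3.1
# then install Rasa (3.0.0 was current at the time of writing)
!pip install rasa==3.0.0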

Then we can run rasa init, but since the Colab shell does not support the interactivity needed, it would get stuck (the script keeps waiting for user confirmation).

So one should run it with the “--no-prompt” option:
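In a Colab cell that looks like the following; the project folder name “rasa_bot” is my own choice, not from the article:

# create a working folder and run the non-interactive init + training there
!mkdir -p rasa_bot
%cd rasa_bot
!rasa init --no-prompt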

Note that the init, including training, took only about 1 minute with a K80 GPU assigned.

On my 5-year-old CPU-only laptop, it would hang for ages at the training step of the DIETClassifier (which is a TensorFlow model):

https://github.com/RasaHQ/rasa/blob/main/rasa/nlu/classifiers/diet_classifier.py

With that, the model and config files are initialized and trained.

Finally, zip the files for download:
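A sketch of the packaging step, assuming the project lives in the “rasa_bot” folder from the earlier cell; google.colab's files helper triggers a browser download:

# archive the whole project (data/, models/, config.yml, domain.yml, credentials.yml, endpoints.yml, …)
!zip -r ../rasa_bot.zip .
# trigger a browser download of the archive
from google.colab import files
files.download('../rasa_bot.zip')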

Deploy the trained model to desktop (with Docker)

First, extract the files from the zip into a folder (I called it “rasa_data”).

Then prepare a docker-compose.yaml as below, using a volume to mount the rasa_data folder to the “/app” path inside the Docker container:

version: '3.8'
services:
  rasa:
    image: rasa/rasa:3.0.0-full
    container_name: rasa
    ports:
      - 5005:5005
    volumes:
      - ./rasa_data:/app
    command:
      - "run"
      - "--enable-api"
docker-compose.yaml sits outside the rasa_data folder, as a sibling
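So the layout on the desktop looks roughly like this (the file names are the ones rasa init generates; “rasa_data” holds the unzipped project):

.
├── docker-compose.yaml
└── rasa_data/
    ├── config.yml
    ├── credentials.yml
    ├── domain.yml
    ├── endpoints.yml
    ├── data/
    └── models/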

Running “docker-compose up” starts the container:

Rasa's loading of the model seems to take up a lot of disk I/O, stay tuned…

It finally took 19 minutes to load the model, and yet another 15 minutes for the server to be up and running…

Then check that the server responds with curl:
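A sketch of what that looks like; the root endpoint just reports the Rasa version, and the REST channel (enabled by default in the credentials.yml that rasa init generates) can take a test message:

# health check: should return something like "Hello from Rasa: 3.0.0"
curl localhost:5005
# send a test message through the REST channel
curl -s -X POST localhost:5005/webhooks/rest/webhook \
  -H "Content-Type: application/json" \
  -d '{"sender": "test_user", "message": "hello"}'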

And the disk utilization is no longer high at this point.

(A second run afterwards did not hit the same heavy disk load and ran much faster; I'm not sure why.)
