Deploying a Deep Learning Model to a Server
Created: 2019-06-22 06:24:00 · Updated: 2024-04-24 00:22:30 · 羽瀚尘
Tags: Deep Learning


Introduction

After going through endless trouble to finally train a model, wouldn't you like to publish it so that more people can benefit from it?

Given the way TensorFlow models are run, deploying one by hand can consume quite a lot of memory. Fortunately, the tensorflow-model-server package exists specifically to handle the deployment of TensorFlow models.

It rebuilds the computation graph from a model in pb format together with its variables, and exposes a REST API.
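To make this concrete, below is a minimal, illustrative sketch of exporting a trained Keras model into that layout. It is not the repository's actual training code, and the tiny model here is only a placeholder; with the TensorFlow 1.13 stack used later in this article, `tf.saved_model.simple_save` is one way to produce the pb file plus `variables/` directory:

```python
import tensorflow as tf
from tensorflow import keras

# Placeholder model; any trained Keras model can be exported the same way.
model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28, 1)),
    keras.layers.Dense(10, activation="softmax"),
])

# TensorFlow Serving expects <model_base_path>/<version>/, hence deploy/1/.
export_path = "deploy/1"
tf.saved_model.simple_save(
    keras.backend.get_session(),
    export_path,
    inputs={"input_image": model.input},
    outputs={t.name: t for t in model.outputs},
)
# deploy/1/ now contains saved_model.pb and a variables/ directory,
# which is exactly what tensorflow-model-server rebuilds the graph from.
```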

This article uses Docker for the training stage, Docker for serving, and virtualenv for talking to the server. The GitHub repository is here.

Key points

Dataset

If the fashion-mnist dataset cannot be downloaded, you can put an already-downloaded copy directly into ~/.keras/datasets/fashion-mnist.
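That directory is where Keras itself caches the dataset. A small sketch of loading and preprocessing it (the reshape/scaling mirrors the shapes printed by predict.py later in this article; the repository's scripts may differ in detail):

```python
from tensorflow import keras

# load_data() downloads the four .gz files into ~/.keras/datasets/fashion-mnist/
# on first use, so a manually placed copy in that directory is picked up as-is.
(train_images, train_labels), (test_images, test_labels) = \
    keras.datasets.fashion_mnist.load_data()

# Add a channel dimension and scale to [0, 1]; this yields the
# (60000, 28, 28, 1) / (10000, 28, 28, 1) float arrays shown in the predict.py output.
train_images = train_images.reshape(-1, 28, 28, 1) / 255.0
test_images = test_images.reshape(-1, 28, 28, 1) / 255.0
```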

Using the serving Docker container


The code below is excerpted from [1]:
```sh
TESTDATA="$(pwd)/serving/tensorflow_serving/servables/tensorflow/testdata"

# Start TensorFlow Serving container and open the REST API port
docker run -t --rm -p 8501:8501 \
    -v "$TESTDATA/saved_model_half_plus_two_cpu:/models/half_plus_two" \
    -e MODEL_NAME=half_plus_two \
    tensorflow/serving &
```
Here `TESTDATA` is the directory that holds the trained model. In this article the model is saved under a versioned directory, i.e. `deploy/1/`, and the REST API is likewise accessed through a versioned path, `/v1/models/fashion_mnist`. Alternatively, you can install `tensorflow-model-server` directly on the system; see [2].
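Because the export directory is versioned, individual model versions can also be addressed through the REST API. A small sketch using the `requests` package, assuming the serving container from step 2 (`./serve.sh`) is running and exposes port 8501:

```python
import requests

BASE = "http://localhost:8501/v1/models/fashion_mnist"

# Status of every loaded version (same information as the curl check in step 2).
print(requests.get(BASE).json())

# Status of the specific version that corresponds to the deploy/1/ directory.
print(requests.get(BASE + "/versions/1").json())
```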

Usage

Step 1: train your model


Run this command to start a docker container:
```sh
./run.sh
```

This will give you a bash shell inside the Docker container; our source code is located at /workspace (inside the container).

Run this command to train your model and save it:
```sh
cd /workspace/src
python train.py
```

Step 2: serve your model


Run this command to start a serving container:
```sh
./serve.sh
```

This will serve a TensorFlow model and expose the REST API on port 8501.

Check it with this command:
```sh
curl localhost:8501/v1/models/fashion_mnist
```

It will output:
```sh
{
  "model_version_status": [
    {
      "version": "1",
      "state": "AVAILABLE",
      "status": {
        "error_code": "OK",
        "error_message": ""
      }
    }
  ]
}
```

Then you can go on to check it with Fashion-MNIST data.

Step 3: access your model via the REST API


I use virtualenv to run src/predict.py. This is clumsy; another Docker container should be used instead. I will fix it soon.

Set up the virtual environment:

```sh
virtualenv -p /usr/bin/python3 tf1.13.1
source tf1.13.1/bin/activate
pip install -r requirements.txt
```

Run it!

```sh
cd src
python predict.py
```

Outputs should look like this:
```sh
train_images.shape: (60000, 28, 28, 1), of float64
test_images.shape: (10000, 28, 28, 1), of float64
Data: {"signature_name": "serving_default", "instances": ... [0.0], [0.0], [0.0], [0.0], [0.0], [0.0], [0.0]]]]}
The model thought this was a Ankle boot (class 9), and it was actually a Ankle boot (class 9)
The model thought this was a Pullover (class 2), and it was actually a Pullover (class 2)
The model thought this was a Trouser (class 1), and it was actually a Trouser (class 1)
```
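Under the hood, querying the model boils down to a POST against the model's `:predict` REST endpoint. The following is only a minimal sketch of what a script like predict.py does, assuming the server from step 2 is running on localhost:8501; the repository's actual code may differ in preprocessing and error handling:

```python
import json
import requests
from tensorflow import keras

class_names = ["T-shirt/top", "Trouser", "Pullover", "Dress", "Coat",
               "Sandal", "Shirt", "Sneaker", "Bag", "Ankle boot"]

# Use the first three Fashion-MNIST test images, shaped (28, 28, 1) and scaled to [0, 1].
(_, _), (test_images, test_labels) = keras.datasets.fashion_mnist.load_data()
test_images = test_images.reshape(-1, 28, 28, 1) / 255.0

# Build the JSON payload and call the versioned REST predict endpoint.
data = json.dumps({"signature_name": "serving_default",
                   "instances": test_images[0:3].tolist()})
resp = requests.post("http://localhost:8501/v1/models/fashion_mnist:predict",
                     data=data,
                     headers={"content-type": "application/json"})
predictions = resp.json()["predictions"]

# Each prediction is a list of 10 class probabilities; take the argmax.
for pred, label in zip(predictions, test_labels[0:3]):
    guess = pred.index(max(pred))
    print("The model thought this was a {} (class {}), and it was actually a {} (class {})"
          .format(class_names[guess], guess, class_names[label], label))
```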

References

[1] TensorFlow Serving
[2] TensorFlow Serving example from Google

TODO
- [ ] Move the predict.py run into Docker
- [ ] Provide a web interface
- [ ] Provide more model deployment examples