Some Python utilities for working with tensorflow-serving.
Prepare an environment with Python version >= 3.6.
From PyPI:
- Manually install the CPU or GPU version of tensorflow.
pip install serving-utils
From the GitHub repository:
- Manually install the CPU or GPU version of tensorflow.
git clone [email protected]:Yoctol/serving-utils.git
cd serving-utils
make install
Usage:
- Saver and Loader
import tensorflow as tf
from serving_utils.saver import Saver
from serving_utils.loader import Loader
saver = Saver(
    session=tf.Session(graph=your_graph),
    output_dir='/path/to/serving',
    signature_def_map={
        'predict': tf.saved_model.signature_def_utils.predict_signature_def(
            inputs={'input': tf.Tensor...},
            outputs={'output': tf.Tensor...},
        )
    },
    freeze=True,  # (default: True) Frozen graph will be saved if True.
)
saver.save()

loader = Loader(
    path='/path/to/serving',
    # version=1,  # if not specified, use the latest version
)
new_sess = tf.Session()
loader.load(new_sess)  # load the saved model into the new session
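In the snippet above, tf.Tensor... stands for real tensors from your graph. Below is a minimal end-to-end sketch, assuming a toy one-layer graph; the tensor names and the /tmp/serving_example output directory are illustrative choices, not part of the library.

import tensorflow as tf

from serving_utils.saver import Saver
from serving_utils.loader import Loader

# Build a toy graph: y = x @ w
graph = tf.Graph()
with graph.as_default():
    x = tf.placeholder(tf.float32, shape=[None, 10], name='x')
    w = tf.Variable(tf.ones([10, 1]), name='w')
    y = tf.matmul(x, w, name='y')
    init_op = tf.global_variables_initializer()

sess = tf.Session(graph=graph)
sess.run(init_op)

# Export the graph and its weights in tensorflow-serving format.
saver = Saver(
    session=sess,
    output_dir='/tmp/serving_example',
    signature_def_map={
        'predict': tf.saved_model.signature_def_utils.predict_signature_def(
            inputs={'input': x},
            outputs={'output': y},
        ),
    },
)
saver.save()

# Restore the saved model into a fresh session.
loader = Loader(path='/tmp/serving_example')
new_sess = tf.Session()
loader.load(new_sess)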
- Client
import numpy as np

from serving_utils import Client

client = Client(host="localhost", port=8500, n_trys=3)
client.predict(
    {'input': np.ones((1, 10))},
    output_names=['output'],
    model_signature_name='predict',
)

# or async
await client.async_predict(...)
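A short sketch of the async call, assuming async_predict accepts the same arguments as predict; the host, port, and tensor names below are illustrative.

import asyncio

import numpy as np

from serving_utils import Client


async def main():
    # n_trys and other options are left at their defaults here.
    client = Client(host="localhost", port=8500)
    prediction = await client.async_predict(
        {'input': np.ones((1, 10))},
        output_names=['output'],
        model_signature_name='predict',
    )
    print(prediction)


asyncio.get_event_loop().run_until_complete(main())

On Python 3.7+ you can use asyncio.run(main()) instead of the event-loop call.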
- Freeze graph
from serving_utils.freeze_graph import freeze_graph, create_session_from_graphdef

# Convert variables to constants, keeping only the ops needed for output_op_names.
frozen_graph_def = freeze_graph(session, output_op_names)
# Build a new session whose graph is the frozen GraphDef.
new_session = create_session_from_graphdef(frozen_graph_def)
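As a hedged example, continuing with the toy graph from the Saver sketch above (reusing its sess, x, and y), the frozen copy can be queried directly. This assumes the imported frozen graph keeps the original op names ('x', 'y'), which you should verify for your own setup.

from serving_utils.freeze_graph import freeze_graph, create_session_from_graphdef

# Freeze everything needed to compute the 'y' op from the toy graph above.
frozen_graph_def = freeze_graph(sess, ['y'])
new_session = create_session_from_graphdef(frozen_graph_def)

# 'x:0' / 'y:0' are the illustrative tensor names defined in the earlier sketch.
output = new_session.run(
    new_session.graph.get_tensor_by_name('y:0'),
    feed_dict={new_session.graph.get_tensor_by_name('x:0'): [[1.0] * 10]},
)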
Run the following commands to lint and test:
make lint
make test

For development, install the dev requirements and regenerate the gRPC protos:
make install-dev
python -m grpc_tools.protoc -I. --python_out=. --python_grpc_out=. --grpc_python_out=. serving_utils/protos/*.proto