
Installing the SyntaxNet Server

NOTE: this may still be incomplete (especially the preconditions); please report any omissions you find!

NOTE: this currently only works with the English model; please give feedback if you have found out how to run it with a different language!

  • install bazel version 0.2.2b (not 0.2.3!)
  • install Python package protobuf version 3.0.0b2
  • install Python packages asciitree, numpy
  • install gRPC
  • install the following Linux (Ubuntu) packages: build-essential curl libfreetype6-dev libpng12-dev libzmq3-dev pkg-config python-dev software-properties-common swig zip zlib1g-dev (copied from https://tensorflow.github.io/serving/setup); a sketch of how to install these prerequisites follows after this list
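How exactly these prerequisites are installed depends on your system. As a rough sketch for Ubuntu (the pip package names, in particular grpcio for gRPC, and the use of pip/apt-get are assumptions here, adapt as needed):

sudo apt-get install build-essential curl libfreetype6-dev libpng12-dev libzmq3-dev pkg-config python-dev software-properties-common swig zip zlib1g-dev
pip install protobuf==3.0.0b2
pip install asciitree numpy
pip install grpcio
## bazel 0.2.2b has to be installed separately, e.g. via the installer from the bazel releases page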

The actual installation is then based on https://github.com/dsindex/syntaxnet.git; the following steps are essentially identical to the instructions in https://github.com/dsindex/syntaxnet/blob/master/README_api.md:

git clone https://github.com/dsindex/syntaxnet.git work
## (the tested version was a58c72b6ed5c55dca52a3a339b8d3cbddbc548e0)
cd work
git clone --recurse-submodules https://github.com/tensorflow/serving
## (tested version was 30fffbd3f4d780246311b5cadca624d4389447b4)
cd serving/tf_models
git checkout a4b7bb9a5dd2c021edcd3d68d326255c734d0ef0
cd ../..
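## apply the patch shipped in the api/ directory of the dsindex repository to the tf_models checkout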
cd serving/tf_models
patch -p1 < ../../api/pr250-patch-a4b7bb9a.diff.txt
cd ../../
cd serving/tensorflow
./configure
## (Answer the questions asked by the configure script; if you select GPU support, answer the additional questions about it)
cd ../../
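## copy the modified workspace definition, the BUILD additions and the parsey_api sources from the api/ directory into the serving tree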
cp api/modified_workspace.bzl serving/tensorflow_serving/workspace.bzl
cat api/append_BUILD >> serving/tensorflow_serving/example/BUILD
cp api/parsey_api* serving/tensorflow_serving/example/
cd serving
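## build the parsey_api server (this may take a long time); --output_user_root keeps bazel's output under ./bazel_root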
bazel --output_user_root=bazel_root build --nocheck_visibility -c opt -s //tensorflow_serving/example:parsey_api --genrule_strategy=standalone --spawn_strategy=standalone --verbose_failures
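## symlink the syntaxnet package from tf_models so it is reachable as ./syntaxnet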
ln -s ./tf_models/syntaxnet/syntaxnet syntaxnet

Now you can run the server (from the serving directory):

./bazel-bin/tensorflow_serving/example/parsey_api --port=9000 ../api/parsey_model
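To check that the server is actually listening on port 9000, something like the following can be used (assuming the ss tool from iproute2 is available; netstat -tlnp works as well):

ss -tln | grep 9000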