Model Support

In addition to the rich set of models available as part of neon and the model zoo, the Nervana Cloud can flexibly integrate users’ custom code extensions and third-party libraries. This release also supports TensorFlow.

Model Zoo

The Nervana model zoo contains a selection of state-of-the-art neon model implementations that can be used as a starting point for a variety of tasks.

Some of the models may be pre-loaded for your tenant and are accessible via the -z flag for certain ncloud commands; for example, ncloud model list -z will list any such models.

If a particular zoo model is not present, you can either contact us to have it added, or manually import it like so (this works for any trained neon model):

$ ncloud model import -n alexnet

Note that importing models may take some time depending on file size and network bandwidth.

Custom Code

If you need to extend core neon (for instance, with a custom layer type or cost function), the Intel® Nervana™ Cloud needs to be made aware of these implementations.

At training time, this is done by passing the --custom-code-url repository flag to the ncloud model train command. This URL can point to any public or Nervana-accessible git repository (contact us if you need to arrange access to a private repository).

The same argument is also valid for interactive sessions (ncloud interact start) and for model deployments for inference (ncloud model deploy).
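For example, a training run that pulls in custom code from a repository might look like the following (the repository URL and script name here are hypothetical):

```
$ ncloud model train train.py --custom-code-url https://github.com/example/neon-extensions.git
```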

The format and content of this repository are up to you, but we key off certain files, as described below.

Note that you can also control which version of the repository is checked out via the optional --custom-code-commit flag; if this flag is not specified, the tip of master is used.
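For instance, to pin the checkout to a specific revision of the repository (the URL, script name, and commit hash below are placeholders):

```
$ ncloud model train train.py --custom-code-url https://github.com/example/neon-extensions.git --custom-code-commit 3f2a1b0
```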

Additional Compilation

If you define a Makefile in the root of your custom code repository, the default target will be executed as part of launching your job in the container.
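A minimal sketch of such a Makefile, assuming your repository contains a native extension built via a setup.py (the build step shown is hypothetical):

```
# Default target; executed automatically when the job launches in the container
all: build

# Compile any native extensions in place so they are importable alongside your code
build:
	python setup.py build_ext --inplace
```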

Additional Python Libraries

If you require additional third-party Python libraries to be present in the executing environment on the Cloud, these can be set up by creating a requirements.txt file in the root directory of your custom code repository. The format of this file is the one used by pip, as described in its documentation on requirements files.
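A requirements.txt might look like the following (the package names and version pins are illustrative):

```
scikit-image==0.13.0
h5py>=2.6
requests
```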