Below are some examples of how to use functions in the `network` namespace.

#### Inference
Want to expose your model to the network? Use `network.serve`:
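A minimal sketch (the toy linear model, the shapes, and the port below are illustrative; any function from Tensor to Tensor will do):

```javascript
import * as sm from '@shumai/shumai'

// toy stand-in for a real model: a single random linear layer
const weight = sm.randn([128, 4])
const model = (tensor) => tensor.matmul(weight)

// each key becomes an endpoint; this serves POST /model
sm.network.serve({
  model: (_user, tensor) => {
    return model(tensor)
  }
})
```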
A client can use `network.tfetch` (basically `fetch` but for Tensors):
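A sketch of the client side (the URL assumes the server above is reachable on localhost port 3000; adjust to your setup):

```javascript
import * as sm from '@shumai/shumai'

const input = sm.randn([128])

// POST the tensor to the endpoint and get a Tensor back
const output = await sm.network.tfetch('http://localhost:3000/model', input)
console.log(output.shape)
```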
#### Training

Want to train your model over the network? Just add an endpoint for a backward pass to `network.serve`:
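```javascript
sm.network.serve({
  run_model: (user, input) => {
    const out = model(input)
    // capture a backward pass
    user.opt = (jacobian) => {
      sm.optim.sgd(out.backward(jacobian), 1e-3)
    }
    return out
  },
  optimize_model: (user, jacobian) => {
    // run it when that same user gives us a jacobian
    user.opt(jacobian)
  }
})
```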
And the client can feed the jacobian with `network.tfetch`:
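A sketch of the client loop (the URL, shapes, and random jacobian are placeholders; in practice the jacobian is the gradient of your loss with respect to `out`):

```javascript
import * as sm from '@shumai/shumai'

const url = 'http://localhost:3000'

const input = sm.randn([128])
const out = await sm.network.tfetch(`${url}/run_model`, input)

// placeholder jacobian; normally d(loss)/d(out)
const jacobian = sm.randn(out.shape)
await sm.network.tfetch(`${url}/optimize_model`, jacobian)
```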
#### Wrappers

Shumai provides wrappers for the above setup code.
`network.serve_model` will create `/forward` and `/optimize` endpoints for you.
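For example (a sketch assuming `serve_model` takes the model and an optimizer, mirroring the manual setup above, and assuming Shumai's `requireGrad()` to mark trainable tensors):

```javascript
import * as sm from '@shumai/shumai'

// hypothetical model: one trainable linear layer
const weight = sm.randn([128, 4]).requireGrad()
const model = (input) => input.matmul(weight)

// creates the /forward and /optimize endpoints
sm.network.serve_model(model, sm.optim.sgd)
```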
And the client can attach with `network.remote_model`, which attaches a hook to `backward` for automatic gradients.
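A client sketch (the address and shapes are placeholders; `sm.loss.mse` is used here only to produce a gradient):

```javascript
import * as sm from '@shumai/shumai'

const model = sm.network.remote_model('http://localhost:3000')

const input = sm.randn([128])
const reference = sm.randn([4])

const output = await model(input)
const loss = sm.loss.mse(output, reference)

// the injected hook also ships the jacobian back to the server
await loss.backward()
```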
#### Composition

Want to run more than just a trivial remote trainer? Below is a distributed, model-parallel, pipelined server: we invoke multiple remote models and then serve a model of our own.
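A sketch of such a server, assuming two upstream models are already being served at the hypothetical addresses below; the locally trained `weight` sandwiched between the two remote stages is what makes this model-parallel rather than a pure proxy:

```javascript
import * as sm from '@shumai/shumai'

// two remote stages of the pipeline (hypothetical addresses)
const first = sm.network.remote_model('http://localhost:3001')
const second = sm.network.remote_model('http://localhost:3002')

// a locally trained weight between the remote stages
const weight = sm.randn([128, 128]).requireGrad()

const model = async (input) => {
  const hidden = await first(input)
  return await second(hidden.matmul(weight))
}

// expose the composed pipeline as a model server of its own
sm.network.serve_model(model, sm.optim.sgd)
```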
Same client as before :)
#### What about debugging?
All `network.serve*` methods automatically give us basic `/statistics` as JSON:
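The endpoint is plain JSON, so any HTTP client can read it (the URL and the exact fields are illustrative):

```javascript
const stats = await (await fetch('http://localhost:3000/statistics')).json()
console.log(stats) // e.g. per-endpoint hit counts and timings
```

but we can always add more:

One way is to pass our own `statistics` handler to `network.serve` alongside the other endpoints (a sketch; whether this merges with or replaces the built-in fields may depend on the version):

```javascript
sm.network.serve({
  /* ...endpoints from before... */
  statistics: () => {
    // anything JSON-serializable can go here
    return { weight_mean: weight.mean().toFloat32() }
  }
})
```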
including recursively:
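Since the composed server above talks to other served models, its handler can fold their `/statistics` into its own (a sketch; the address is a placeholder):

```javascript
sm.network.serve({
  /* ...endpoints from before... */
  statistics: async () => {
    // pull the upstream server's statistics and nest them under ours
    const upstream = await (await fetch('http://localhost:3001/statistics')).json()
    return {
      weight_mean: weight.mean().toFloat32(),
      upstream
    }
  }
})
```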