Below are some examples of how to use functions in the `network` namespace.

### Inference

Want to expose your model to the network? Use `network.serve`:
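A minimal sketch (the weight, the shapes, and the endpoint name `model` are illustrative; the server is assumed to listen on localhost:3000 by default):

```javascript
import * as sm from '@shumai/shumai'

const weight = sm.randn([128, 128])

// every key becomes an HTTP endpoint; this serves POST /model
sm.network.serve({
  model: (_user, input) => {
    return input.matmul(weight)
  }
})
```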
A client can use `network.tfetch` (basically `fetch` but for Tensors):
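A sketch of the client side, assuming the server above:

```javascript
import * as sm from '@shumai/shumai'

const input = sm.randn([128])

// posts the tensor to the endpoint and decodes the tensor response
const output = await sm.network.tfetch('localhost:3000/model', input)
console.log(output.shape)
```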
### Training

Want to train your model over the network? Just add an endpoint for a backward pass to `network.serve`:
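```javascript
sm.network.serve({
  run_model: (user, input) => {
    const out = model(input)
    // capture a backward pass
    user.opt = (jacobian) => {
      sm.optim.sgd(out.backward(jacobian), 1e-3)
    }
    return out
  },
  optimize_model: (user, jacobian) => {
    // run it when that same user gives us a jacobian
    user.opt(jacobian)
  }
})
```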
And the client can feed the jacobian with `network.tfetch`:
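A sketch of the client side (the URL and shapes are assumptions, and a real client would derive the jacobian from a loss rather than sampling it):

```javascript
import * as sm from '@shumai/shumai'

const url = 'localhost:3000'
const input = sm.randn([128])

// remote forward pass
const output = await sm.network.tfetch(`${url}/run_model`, input)

// stand-in for the real jacobian of a loss with respect to output
const jacobian = sm.randn([128])

// triggers the backward pass the server captured for this user
await sm.network.tfetch(`${url}/optimize_model`, jacobian)
```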
### Wrappers

Shumai provides wrappers for the above setup code. `network.serve_model` will create `/forward` and `/optimize` endpoints for you.
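For example (a sketch: the linear model is illustrative, and passing `sm.optim.sgd` as the optimizer is an assumption about the signature):

```javascript
import * as sm from '@shumai/shumai'

const weight = sm.randn([128, 128]).requireGrad()

function model(input) {
  return input.matmul(weight)
}

// creates /forward and /optimize endpoints around the model
sm.network.serve_model(model, sm.optim.sgd)
```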
And the client can attach with `network.remote_model`, which attaches a hook to `backward` for automatic gradients.
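On the client (a sketch: the mean-squared loss is built from basic tensor ops, and `backward` is assumed awaitable here since it now involves a network call):

```javascript
import * as sm from '@shumai/shumai'

const model = sm.network.remote_model('localhost:3000')

const input = sm.randn([128])
const target = sm.randn([128])

const output = await model(input)

// backward through a remote output also ships the jacobian
// to the server's /optimize endpoint
const diff = output.sub(target)
const loss = diff.mul(diff).mean()
await loss.backward()
```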
### Composition

Want to run more than just a trivial remote trainer? Below is a distributed, model-parallel and pipelined server: we invoke multiple remote models and then serve a model of our own.
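A sketch (ports and shapes are illustrative): two upstream servers host sub-models, and we pipeline through them around a locally trained weight before serving the composite like any other model.

```javascript
import * as sm from '@shumai/shumai'

// sub-models served elsewhere (e.g. with serve_model)
const A = sm.network.remote_model('localhost:3001')
const B = sm.network.remote_model('localhost:3002')

// a local piece of the model, optimized on this server
const weight = sm.randn([128, 128]).requireGrad()

async function model(input) {
  const hidden = await A(input)
  return B(hidden.matmul(weight))
}

sm.network.serve_model(model, sm.optim.sgd)
```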
Same client as before :)
### What about debugging?
All `network.serve*` methods automatically give us basic `/statistics` as JSON:
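The field names and values below are illustrative:

```
$ curl -s localhost:3000/statistics
{"forward":{"hits":1024,"seconds":0.091},"optimize":{"hits":1024,"seconds":0.064}}
```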
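but we can always add more (a sketch, assuming `serve_model` accepts extra endpoints that may return plain JSON-serializable objects):

```javascript
// the options object and extra-endpoint map are assumptions
sm.network.serve_model(model, sm.optim.sgd, { port: 3000 }, {
  statistics: () => {
    return { weight_mean: weight.mean().toFloat32() }
  }
})
```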
including recursively (a sketch: the composition server can fold its upstream models' `/statistics` into its own):
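```javascript
sm.network.serve_model(model, sm.optim.sgd, { port: 3000 }, {
  statistics: async () => {
    const stats = {}
    // pull /statistics from the upstream model servers as well
    for (const url of ['localhost:3001', 'localhost:3002']) {
      stats[url] = await (await fetch(`http://${url}/statistics`)).json()
    }
    return stats
  }
})
```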