Comments (3)
Hello, I've been pondering your question :)
Not sure about the best design, but I think this comes down to using higher-order transformations that require a 'pure' function on a Haiku module inside of a transformed module. When transforming a module, only the outer module function can be considered a pure function, which means the modules used internally are not. As it's currently defined, you should only call your function `g` with an `apply_fun` of a Haiku module.

To achieve this in your code snippet, you would either call `g` on your outer transformed `apply` (but this restricts you to only using the `custom_vjp` on the complete network), or you transform the inner module to get a pure function internally. To do an inner transform in Haiku you need to use `hk.experimental.lift` to correctly register the internal parameters that are created.
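For completeness, the first option (calling `g` on the outer transformed `apply`) might look roughly like this; a minimal sketch that reuses the `g` defined in the next snippet, with `forward_fn`, `rng`, and `x` as placeholder names:

```python
# Hypothetical sketch of option 1: wrap the *whole* transformed network in g.
# `g` is the custom_vjp wrapper defined in the snippet below; `forward_fn`,
# `rng`, and `x` are placeholders for your network function and inputs.
net = hk.without_apply_rng(hk.transform(forward_fn))
params = net.init(rng, x)

def loss_fn(params, x):
    # The entire network goes through the custom vjp at once.
    return jnp.sum(g(x, partial(net.apply, params)))
```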
The `lift` approach would look something like this:
```python
from functools import partial
from typing import Callable

import haiku as hk
import jax
import jax.numpy as jnp


@partial(jax.custom_vjp, nondiff_argnums=(1,))
def g(x: jnp.ndarray, fun: Callable):
    # TODO: write g to take params/rng as well
    return jax.lax.stop_gradient(fun(x))

def g_fwd(x, fun):
    return g(x, fun), x

def g_bwd(fun, res, grad):
    x = res
    # I think you might need some reshapes here to get the original input shapes to work
    return (fun(x),)

g.defvjp(g_fwd, g_bwd)

def build_net(output_size):
    def forward_fn(x: jnp.ndarray) -> jnp.ndarray:
        linear = hk.Linear(output_size, name='l1')
        x = linear(x)
        transformed_linear = hk.without_apply_rng(hk.transform(linear))
        inner_params = hk.experimental.lift(transformed_linear.init)(hk.next_rng_key(), x)
        # Apply g to the transformed (pure) function
        return g(x, partial(transformed_linear.apply, inner_params))
    return forward_fn

b_size, s_size, h_size = 3, 3, 3
x = jnp.ones((b_size, s_size, h_size))
rng = jax.random.PRNGKey(42)
net = hk.transform(build_net(h_size))
params = net.init(rng, x)

def loss_fn(params, rng, x):
    return jnp.sum(net.apply(params, rng, x))

print(jax.grad(loss_fn)(params, rng, x))
```
You will see an extra set of parameters in the `lifted` namespace. Do you think this would work for your use-case?
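For illustration, you can inspect the parameter structure like this; the exact key names shown below are my assumption and may differ between Haiku versions:

```python
# Expect an extra entry under the lifted namespace alongside the outer 'l1'.
print(jax.tree_util.tree_map(jnp.shape, params))
# Hypothetical output (key names are an assumption):
# {'l1': {'b': (3,), 'w': (3, 3)},
#  'lifted/l1': {'b': (3,), 'w': (3, 3)}}
```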
(Apologies for the extremely late response, I'm cleaning up some of our stale issues.)

> As an aside, is there anywhere I can read more to understand how transforming a hk module works?

About a year ago, this guide was added to our docs: https://dm-haiku.readthedocs.io/en/latest/notebooks/build_your_own_haiku.html. It goes through building a simple version of the Haiku internals from scratch, including how to implement `hk.transform`.
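To give a flavor of what that guide covers, here is a toy version of the idea behind `hk.transform` (a simplified sketch of my own, not Haiku's actual implementation): an impure-looking "module" function is turned into a pure `(init, apply)` pair by threading a parameter dictionary through a frame.

```python
import numpy as np

# A toy transform in the spirit of the "build your own Haiku" guide.
# This is a simplified sketch, not the real Haiku implementation.

class Frame:
    def __init__(self, params, creating):
        self.params = params      # name -> array
        self.creating = creating  # True during init, False during apply

current_frame = None

def get_parameter(name, shape, init_fn):
    # During init, create and store the parameter; during apply, look it up.
    if current_frame.creating:
        current_frame.params[name] = init_fn(shape)
    return current_frame.params[name]

def transform(f):
    def init(*args):
        global current_frame
        current_frame = Frame(params={}, creating=True)
        f(*args)  # run f only for its parameter-creation side effects
        return current_frame.params

    def apply(params, *args):
        global current_frame
        current_frame = Frame(params=params, creating=False)
        return f(*args)

    return init, apply

# Usage: an impure-looking function becomes a pure (init, apply) pair.
def forward(x):
    w = get_parameter('w', shape=(x.shape[-1],), init_fn=np.ones)
    return x * w

init, apply = transform(forward)
params = init(np.arange(3.0))         # {'w': array([1., 1., 1.])}
print(apply(params, np.arange(3.0)))  # [0. 1. 2.]
```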
I think the original question was answered, but feel free to follow up here if you have any other questions. I'm still trying to understand where this `lift` approach falls short, and how it could be better :)
Thanks @LenaMartens!

As an aside, is there anywhere I can read more to understand how transforming a hk module works? I felt that wrapping my JAX function to respect Haiku's internal state could have also worked.