theory/
: AD background theory, introducing the concepts of forward and reverse
mode as well as Jacobian-vector / vector-Jacobian products (see the short
sketch below). To go deeper, make sure to check out the excellent JAX
autodiff cookbook as well as @mattjj's talk on autograd.
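As a quick taste of what the notes cover, here is a minimal sketch of the two
products in jax (assuming jax is installed; the function `f` below is a
made-up illustration, not taken from the notes). Forward mode pushes a tangent
vector through the Jacobian (`J @ v`), reverse mode pulls a cotangent vector
back through it (`u @ J`).

```py
import jax
import jax.numpy as jnp

# f: R^3 -> R^2, a small made-up vector-valued function.
def f(x):
    return jnp.array([jnp.sin(x[0]) * x[1], x[0] ** 2 + x[2]])

x = jnp.array([1.0, 2.0, 3.0])

# Forward mode: push a tangent vector v through the Jacobian -> J @ v.
v = jnp.array([1.0, 0.0, 0.0])
y, jv = jax.jvp(f, (x,), (v,))

# Reverse mode: pull a cotangent vector u back through the Jacobian -> u @ J.
u = jnp.array([1.0, 0.0])
y_again, vjp_fun = jax.vjp(f, x)
(uj,) = vjp_fun(u)

# Both agree with the explicitly computed Jacobian.
J = jax.jacobian(f)(x)
assert jnp.allclose(jv, J @ v)
assert jnp.allclose(uj, u @ J)
```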
talk/
: Talk version of the theory notes. The talk was given at the Machine
Learning journal club of the Helmholtz AI (http://helmholtz.ai) local unit at
@hzdr, and at a @hzdr and @casus workshop on physics-informed neural networks,
both organized by Nico Hoffmann (@nih23).
examples/
: AD examples using autograd, jax and pytorch. The examples focus mostly on
how to define custom derivatives in jax (and autograd), which helps to
understand how Jacobian-vector products actually work (a small sketch follows
right below). More examples to come!
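To illustrate the kind of thing the examples do, here is a minimal custom
derivative in jax via `jax.custom_jvp`. The function and its rule are made up
for illustration and are not taken from the examples themselves.

```py
import jax
import jax.numpy as jnp

# A toy function with a hand-written derivative rule.
@jax.custom_jvp
def f(x):
    return jnp.log(1.0 + x)

@f.defjvp
def f_jvp(primals, tangents):
    (x,), (x_dot,) = primals, tangents
    y = f(x)
    # d/dx log(1 + x) = 1 / (1 + x); the JVP is linear in x_dot.
    y_dot = x_dot / (1.0 + x)
    return y, y_dot

# The custom rule feeds both forward mode ...
print(jax.jvp(f, (1.0,), (1.0,))[1])  # 0.5
# ... and reverse mode (grad transposes the linear JVP rule).
print(jax.grad(f)(1.0))               # 0.5
```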
Download the talk
and theory
PDF files from the Releases page or
the latest CI run. You can also click the badges above. The talk is
also available via figshare.