
MACHINE LEARNING

## Backprop as Functor: A compositional perspective on supervised learning

Authors/contributors

- Fong, Brendan (Author)
- Spivak, David I. (Author)
- Tuyéras, Rémy (Author)

Title

Backprop as Functor: A compositional perspective on supervised learning

Abstract

A supervised learning algorithm searches over a set of functions $A \to B$ parametrised by a space $P$ to find the best approximation to some ideal function $f\colon A \to B$. It does this by taking examples $(a,f(a)) \in A\times B$, and updating the parameter according to some rule. We define a category where these update rules may be composed, and show that gradient descent---with respect to a fixed step size and an error function satisfying a certain property---defines a monoidal functor from a category of parametrised functions to this category of update rules. This provides a structural perspective on backpropagation, as well as a broad generalisation of neural networks.
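The compositional structure the abstract describes can be sketched concretely. Below is a minimal, illustrative Python sketch, not the paper's own formalism or API: the names `Learner`, `compose`, and `scale_learner` are mine. The working assumption (drawn from the abstract) is that a learner bundles a parameter, an implementation $P \times A \to B$, an update rule $P \times A \times B \to P$, and a "request" map $P \times A \times B \to A$ that passes a corrected input backwards; composing two learners then chains implementations forwards and requests backwards, mirroring backpropagation.

```python
# Illustrative sketch of composable "learners" (names and signatures are
# assumptions based on the abstract, not the paper's definitions).
from dataclasses import dataclass
from typing import Callable

@dataclass
class Learner:
    param: object                  # a point of the parameter space P
    implement: Callable            # (p, a) -> b
    update: Callable               # (p, a, b) -> p'
    request: Callable              # (p, a, b) -> a', a corrected input

def compose(g: Learner, f: Learner) -> Learner:
    """Sequential composite g . f: implement forwards, learn backwards.
    g trains on f's output; f trains on the input g requests."""
    def implement(p, a):
        pf, pg = p
        return g.implement(pg, f.implement(pf, a))
    def update(p, a, b):
        pf, pg = p
        mid = f.implement(pf, a)
        return (f.update(pf, a, g.request(pg, mid, b)),
                g.update(pg, mid, b))
    def request(p, a, b):
        pf, pg = p
        mid = f.implement(pf, a)
        return f.request(pf, a, g.request(pg, mid, b))
    return Learner((f.param, g.param), implement, update, request)

def scale_learner(p0: float, eps: float = 0.1) -> Learner:
    """Gradient descent on b ~ p*a with squared error and fixed step eps."""
    imp = lambda p, a: p * a
    upd = lambda p, a, b: p - eps * 2 * (p * a - b) * a   # step along dE/dp
    req = lambda p, a, b: a - eps * 2 * (p * a - b) * p   # step along dE/da
    return Learner(p0, imp, upd, req)

# Train the composite of two scalar learners on pairs (a, 2*a).
model = compose(scale_learner(1.0), scale_learner(1.0))
p = model.param
for _ in range(200):
    for a in (1.0, 0.5):
        p = model.update(p, a, 2.0 * a)
pf, pg = p
print(round(pf * pg, 2))  # the product of the two parameters approaches 2
```

The point of the sketch is the composition rule, not the toy model: the composite never inspects its factors' internals, only their implement/update/request interfaces, which is the functorial picture the abstract gestures at.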

Publication

arXiv:1711.10455 [cs, math]

Date

2019-05-01

Short Title

Backprop as Functor

Accessed

2019-11-23T14:42:07Z

Extra

arXiv: 1711.10455

Notes

Comment: 13 pages + 4 page appendix

Citation

Fong, B., Spivak, D. I., & Tuyéras, R. (2019). Backprop as Functor: A compositional perspective on supervised learning. *arXiv:1711.10455 [cs, math]*. Retrieved from http://arxiv.org/abs/1711.10455
