Compositional Deep Learning in Futhark
We present a design pattern for composing deep learning networks in a typed, higher-order fashion. The exposed library functions are generically typed, and the composition structure allows networks to be trained (using back-propagation) and trained networks to be used for predicting new results (using forward-propagation). Individual layers in a network can take different forms, ranging from dense sigmoid layers to convolutional layers.
The paper discusses different typing techniques aimed at enforcing proper use and composition of networks.
The approach is implemented in Futhark, a data-parallel functional language and compiler targeting GPU architectures, and we demonstrate that Futhark’s elimination of higher-order functions and modules leads to efficient generated code.
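To give a flavour of the kind of typed, higher-order composition the abstract describes, the sketch below models a layer as a record pairing its parameters with a forward function, and composes two layers with a connect function. This is an illustrative sketch only: the names layer, connect, sigmoid, dense, and network are hypothetical and are not the paper's actual library API, and the sketch omits the backward (training) pass entirely.

-- Hypothetical sketch, not the paper's library API: a layer pairs
-- parameters of type p with a forward function from a to b.
type^ layer 'p 'a 'b = { params: p, forward: p -> a -> b }

-- Compose two layers; the combined parameters are a pair, and the
-- output type of the first layer must match the input type of the second.
let connect 'p 'q 'a 'b 'c (l1: layer p a b) (l2: layer q b c)
                         : layer (p, q) a c =
  { params = (l1.params, l2.params)
  , forward = \(w1, w2) x -> l2.forward w2 (l1.forward w1 x) }

let sigmoid (x: f32) : f32 = 1f32 / (1f32 + f32.exp (-x))

-- A toy dense layer with sigmoid activation, parameterised by its
-- weight matrix (biases omitted for brevity).
let dense [n][m] (ws: [m][n]f32) : layer ([m][n]f32) ([n]f32) ([m]f32) =
  { params = ws
  , forward = \w x -> map (\row -> sigmoid (f32.sum (map2 (*) row x))) w }

-- Example composition: a 784 -> 256 -> 10 network.
let network (w1: [256][784]f32) (w2: [10][256]f32) =
  connect (dense w1) (dense w2)

Because the input and output types of each layer appear in its type, composing layers whose shapes do not match is rejected by the type checker; the paper's typing techniques for enforcing proper use and composition of networks are in this spirit, though the actual library also threads the information needed for back-propagation through the composition.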
FHPNC session, Sun 18 Aug, 13:40 - 14:50 (Amsterdam/Berlin time)
13:40 (23 min, talk): Compositional Deep Learning in Futhark. Duc Minh Tran (DIKU, University of Copenhagen), Troels Henriksen (University of Copenhagen, Denmark), Martin Elsman (University of Copenhagen, Denmark). Link to publication.
14:03 (23 min, talk): Towards Hasktorch 1.0: Automated Generation of C++ Libtorch Bindings (extended abstract).
14:26 (23 min, talk): Hailstorm: A statically typed functional language for systems programming (extended abstract).