From 35C3 Wiki
|Description||The RWTH extensible training framework for universal recurrent neural networks|
|Persons working on||Albertzeyer|
|Tags||asr, deep learning, speech recognition, tensorflow|
RETURNN, the RWTH extensible training framework for universal recurrent neural networks, is a Theano/TensorFlow-based implementation of modern recurrent neural network architectures. It is optimized for fast and reliable training of recurrent neural networks in a multi-GPU environment.
It is currently used for speech recognition, handwriting recognition, machine translation and language modeling. It supports many different kinds of neural network models, both hybrid HMM/NN models (together with the RASR tool) and end-to-end approaches such as encoder-decoder-attention models.
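To give a flavor of how such models are described: RETURNN configs are plain Python files in which the network is declared as a dictionary of named layers. The sketch below is a minimal, hypothetical example of this style (the layer names, dimensions and options are illustrative assumptions, not taken from a real setup; consult the RETURNN documentation for the actual set of layer classes and options).

```python
# Hypothetical sketch of a RETURNN-style config file.
# All concrete values (feature dim, label count, layer sizes) are made up
# for illustration; a real setup would choose these for the task at hand.
task = "train"
num_inputs = 40      # e.g. 40-dimensional acoustic feature vectors
num_outputs = 5000   # e.g. number of output labels

# The network is a dict mapping layer names to layer definitions.
# A bidirectional LSTM layer pair feeding a softmax output with
# cross-entropy loss, as one might use in a hybrid HMM/NN setup.
network = {
    "lstm0_fwd": {"class": "rec", "unit": "lstm", "direction": 1,
                  "n_out": 512, "from": ["data"]},
    "lstm0_bwd": {"class": "rec", "unit": "lstm", "direction": -1,
                  "n_out": 512, "from": ["data"]},
    "output": {"class": "softmax", "loss": "ce",
               "from": ["lstm0_fwd", "lstm0_bwd"]},
}
```

Because the config is ordinary Python, larger setups can build the network dict programmatically, which is one reason the framework scales from simple hybrid models to encoder-decoder-attention architectures.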
The code is open source (GitHub link, documentation). It is developed at our university department (RWTH Aachen University). Get in touch if you are interested in any of these topics, or in deep learning / neural networks in general.