4 results found (primary language: Python 2, C++ 1, Jupyter Notebook 1)
Sequence-to-sequence framework built on PyTorch, with a focus on neural machine translation
Topics: pytorch, seq2seq, sockeye, transformer, translation, deep-learning, attention-model, encoder-decoder, machine-learning, attention-mechanism, machine-translation, transformer-network, deep-neural-networks, sequence-to-sequence, transformer-architecture, attention-is-all-you-need, neural-machine-translation, sequence-to-sequence-models
Created 2017-06-08 · 836 commits to main branch, last one 28 days ago
Implementations for a family of attention mechanisms, suitable for all kinds of natural language processing tasks and compatible with TensorFlow 2.0 and Keras.
Created 2019-06-19 · 53 commits to master branch, last one 2 years ago
Code for the paper "STConvS2S: Spatiotemporal Convolutional Sequence to Sequence Network for Weather Forecasting" (Neurocomputing, Elsevier)
Created 2019-11-20 · 14 commits to master branch, last one 3 years ago
An implementation of the Transformer ("Attention Is All You Need") in DyNet
Created 2017-11-01 · 240 commits to master branch, last one 6 years ago
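All four results center on attention-based sequence-to-sequence models. As an illustration of the core operation these repositories implement, here is a minimal sketch of the scaled dot-product attention from "Attention Is All You Need", written in plain NumPy (function and variable names are my own, not taken from any of the listed projects):

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Compute softmax(Q K^T / sqrt(d_k)) V over batched inputs."""
    d_k = q.shape[-1]
    # Similarity of each query to each key, scaled to keep logits moderate.
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_k)   # (batch, t_q, t_k)
    # Numerically stable softmax over the key axis.
    scores -= scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    # Weighted sum of value vectors for each query position.
    return weights @ v                                  # (batch, t_q, d_v)

rng = np.random.default_rng(0)
q = rng.standard_normal((1, 3, 8))   # 3 query positions
k = rng.standard_normal((1, 5, 8))   # 5 key/value positions
v = rng.standard_normal((1, 5, 8))
out = scaled_dot_product_attention(q, k, v)
print(out.shape)  # (1, 3, 8)
```

The listed frameworks wrap this same computation in multi-head, masked, and trainable-projection variants.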