5 results found

list of efficient attention modules
This repository has been archived
Created 2020-07-31
46 commits to master branch, last one 3 years ago
A Faster Pytorch Implementation of Multi-Head Self-Attention
Created 2020-07-28
6 commits to master branch, last one 2 years ago
Flexible Python library providing building blocks (layers) for reproducible Transformers research (Tensorflow ✅, Pytorch 🔜, and Jax 🔜)
Created 2021-07-10
504 commits to master branch, last one 10 months ago
Provides several well-known neural network models (DCGAN, VAE, ResNet, etc.)
Created 2018-08-04
71 commits to master branch, last one 3 years ago