11 results found

PyTorch implementations of several attention mechanisms for deep learning researchers.
Created 2020-03-21
89 commits to master branch, last one 2 years ago
125 forks · 492 stars · MPL-2.0 license
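Collections like the one above typically start from scaled dot-product attention; below is a minimal PyTorch sketch of that building block (function name and tensor shapes are illustrative, not code from the repository).

```python
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(query, key, value, mask=None):
    """Scaled dot-product attention: softmax(QK^T / sqrt(d_k)) V.

    query, key, value: (batch, seq_len, d_k); mask is broadcastable to
    (batch, seq_len, seq_len), with nonzero entries marking positions to keep.
    """
    d_k = query.size(-1)
    scores = torch.matmul(query, key.transpose(-2, -1)) / d_k ** 0.5
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = F.softmax(scores, dim=-1)           # attention distribution over keys
    return torch.matmul(weights, value), weights  # context vectors and weights
```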
Deep Xi: a deep learning approach to a priori SNR estimation, implemented in TensorFlow 2/Keras, for speech enhancement and robust ASR.
Created 2018-05-25
425 commits to master branch, last one 2 years ago
145 forks · 471 stars · BSD-3-Clause license
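For reference, the a priori SNR that Deep Xi estimates is the per-bin ratio of clean-speech to noise spectral power. A minimal PyTorch sketch of that quantity follows; the function name and tensor layout are illustrative and not taken from the Deep Xi codebase.

```python
import torch

def a_priori_snr_db(clean_stft_mag, noise_stft_mag, eps=1e-12):
    """A priori SNR xi = |S|^2 / |D|^2 per time-frequency bin, in dB.

    Illustrative only: Deep Xi itself *estimates* this quantity from the
    noisy mixture, since clean speech and noise are unknown at test time.
    """
    xi = clean_stft_mag.pow(2) / (noise_stft_mag.pow(2) + eps)
    return 10.0 * torch.log10(xi + eps)
```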
[VLDB'22] Anomaly Detection using Transformers, self-conditioning and adversarial training.
Created 2021-03-01
110 commits to main branch, last one 9 months ago
29 forks · 331 stars · MIT license
Exploring attention weights in transformer-based models with linguistic knowledge.
Created 2020-10-30
186 commits to master branch, last one 8 months ago
"Attention, Learn to Solve Routing Problems!"[Kool+, 2019], Capacitated Vehicle Routing Problem solver
Created 2020-06-24
89 commits to master branch, last one 3 years ago
This repository contains various types of attention mechanisms, such as Bahdanau attention, soft attention, additive attention, and hierarchical attention, implemented in PyTorch, TensorFlow, and Keras.
Created 2018-07-04
59 commits to master branch, last one 2 years ago
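As a reference point for the variants listed in the entry above, here is a minimal PyTorch sketch of additive (Bahdanau-style) attention; the layer names and dimensions are illustrative rather than the repository's own code.

```python
import torch
import torch.nn as nn

class AdditiveAttention(nn.Module):
    """Additive (Bahdanau-style) attention: score(s, h) = v^T tanh(W_q s + W_k h)."""

    def __init__(self, query_dim, key_dim, hidden_dim):
        super().__init__()
        self.w_query = nn.Linear(query_dim, hidden_dim, bias=False)
        self.w_key = nn.Linear(key_dim, hidden_dim, bias=False)
        self.v = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, query, keys):
        # query: (batch, query_dim), keys: (batch, seq_len, key_dim)
        scores = self.v(torch.tanh(self.w_query(query).unsqueeze(1) + self.w_key(keys)))
        weights = torch.softmax(scores.squeeze(-1), dim=-1)         # (batch, seq_len)
        context = torch.bmm(weights.unsqueeze(1), keys).squeeze(1)  # (batch, key_dim)
        return context, weights
```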
A faster PyTorch implementation of multi-head self-attention.
Created 2020-07-28
6 commits to master branch, last one 2 years ago
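For context, a plain PyTorch sketch of standard multi-head self-attention; this illustrates the mechanism the repository above targets, not its optimized implementation, and the module structure here is an assumption for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiHeadSelfAttention(nn.Module):
    """Standard multi-head self-attention with a fused Q/K/V projection."""

    def __init__(self, embed_dim, num_heads):
        super().__init__()
        assert embed_dim % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = embed_dim // num_heads
        self.qkv = nn.Linear(embed_dim, 3 * embed_dim)  # fused Q, K, V projection
        self.out = nn.Linear(embed_dim, embed_dim)

    def forward(self, x):
        # x: (batch, seq_len, embed_dim)
        b, n, d = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # reshape each to (batch, num_heads, seq_len, head_dim)
        q, k, v = (t.view(b, n, self.num_heads, self.head_dim).transpose(1, 2) for t in (q, k, v))
        scores = q @ k.transpose(-2, -1) / self.head_dim ** 0.5
        weights = F.softmax(scores, dim=-1)
        out = (weights @ v).transpose(1, 2).reshape(b, n, d)
        return self.out(out)
```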
Multi^2OIE: Multilingual Open Information Extraction Based on Multi-Head Attention with BERT (Findings of ACL: EMNLP 2020)
Created 2020-09-17
24 commits to master branch, last one about a year ago
8 forks · 36 stars · Apache-2.0 license
Self-Supervised Vision Transformers for multiplexed imaging datasets
Created 2023-01-16
32 commits to master branch, last one 16 days ago
Several types of attention modules written in PyTorch.
Created 2023-06-28
43 commits to main branch, last one about a month ago