11 results found
Primary languages:
- Python (8)
- Jupyter Notebook (1)
- MATLAB (1)
- Svelte (1)
PyTorch implementations of several attention mechanisms for deep learning researchers.
Created 2020-03-21; 89 commits to master branch, last one 2 years ago
Deep Xi: A deep learning approach to a priori SNR estimation implemented in TensorFlow 2/Keras. For speech enhancement and robust ASR.
Created 2018-05-25; 425 commits to master branch, last one 2 years ago
[VLDB'22] Anomaly Detection using Transformers, self-conditioning and adversarial training.
Created 2021-03-01; 110 commits to main branch, last one 9 months ago
Exploring attention weights in transformer-based models with linguistic knowledge.
Created 2020-10-30; 186 commits to master branch, last one 8 months ago
"Attention, Learn to Solve Routing Problems!" [Kool+, 2019], a Capacitated Vehicle Routing Problem solver.
Created 2020-06-24; 89 commits to master branch, last one 3 years ago
This repository contains various types of attention mechanisms, such as Bahdanau attention, soft attention, additive attention, and hierarchical attention, implemented in PyTorch, TensorFlow, and Keras.
Created 2018-07-04; 59 commits to master branch, last one 2 years ago
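The repository above catalogs classic attention variants. For orientation, a minimal sketch of Bahdanau-style additive attention in PyTorch might look like the following; the class name, dimensions, and parameter names are illustrative assumptions, not taken from the repository itself:

```python
import torch
import torch.nn as nn

class AdditiveAttention(nn.Module):
    """Bahdanau-style additive attention: score(q, k) = v^T tanh(W_q q + W_k k)."""

    def __init__(self, query_dim: int, key_dim: int, hidden_dim: int):
        super().__init__()
        self.W_q = nn.Linear(query_dim, hidden_dim, bias=False)
        self.W_k = nn.Linear(key_dim, hidden_dim, bias=False)
        self.v = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, query: torch.Tensor, keys: torch.Tensor):
        # query: (batch, query_dim); keys: (batch, seq_len, key_dim)
        scores = self.v(torch.tanh(self.W_q(query).unsqueeze(1) + self.W_k(keys)))
        weights = torch.softmax(scores.squeeze(-1), dim=-1)          # (batch, seq_len)
        context = torch.bmm(weights.unsqueeze(1), keys).squeeze(1)   # (batch, key_dim)
        return context, weights
```

The additive (concat) score uses a learned feed-forward layer rather than a dot product, which lets queries and keys have different dimensionalities.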
A Faster Pytorch Implementation of Multi-Head Self-Attention
Created 2020-07-28; 6 commits to master branch, last one 2 years ago
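As a reference point for the multi-head self-attention entry above, here is a minimal PyTorch sketch of the standard scaled dot-product formulation with a fused QKV projection; it is a generic illustration, not the repository's actual implementation:

```python
import torch
import torch.nn as nn

class MultiHeadSelfAttention(nn.Module):
    """Standard multi-head self-attention with a fused Q/K/V projection."""

    def __init__(self, embed_dim: int, num_heads: int):
        super().__init__()
        assert embed_dim % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = embed_dim // num_heads
        self.qkv = nn.Linear(embed_dim, 3 * embed_dim)  # one matmul for Q, K, V
        self.out = nn.Linear(embed_dim, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, embed_dim)
        b, s, d = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # split embed_dim into heads: (batch, num_heads, seq_len, head_dim)
        q = q.view(b, s, self.num_heads, self.head_dim).transpose(1, 2)
        k = k.view(b, s, self.num_heads, self.head_dim).transpose(1, 2)
        v = v.view(b, s, self.num_heads, self.head_dim).transpose(1, 2)
        scores = q @ k.transpose(-2, -1) / self.head_dim ** 0.5
        weights = scores.softmax(dim=-1)
        # merge heads back: (batch, seq_len, embed_dim)
        out = (weights @ v).transpose(1, 2).reshape(b, s, d)
        return self.out(out)
```

Fusing the three projections into one `nn.Linear` is a common speed optimization, since one large matmul is typically faster than three smaller ones.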
Multi^2OIE: Multilingual Open Information Extraction Based on Multi-Head Attention with BERT (Findings of ACL: EMNLP 2020)
Created 2020-09-17; 24 commits to master branch, last one about a year ago
Self-Supervised Vision Transformers for multiplexed imaging datasets
Created 2023-01-16; 32 commits to master branch, last one 16 days ago
This is the official repository of the original Point Transformer architecture.
Created 2021-08-24; 9 commits to main branch, last one 2 years ago
Several types of attention modules written in PyTorch.
Created 2023-06-28; 43 commits to main branch, last one about a month ago