Statistics for topic transformer
RepositoryStats tracks 518,325 GitHub repositories; of these, 887 are tagged with the transformer topic. The most common primary language for repositories using this topic is Python (648). Other languages include Jupyter Notebook (114), TypeScript (16), and C++ (15).
Stargazers over time for topic transformer
Most starred repositories for topic transformer
Trending repositories for topic transformer
Lumina-T2X is a unified framework for Text to Any Modality Generation
🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
A high-throughput and memory-efficient inference and serving engine for LLMs
🧑🏫 60 Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, sophia, ...), gan...
Official code and checkpoints for "Timer: Transformers for Time Series Analysis at Scale"
[CVPR 2024] Official implementation of the paper "Salience DETR: Enhancing Detection Transformer with Hierarchical Salience Filtering Refinement"
Code and data artifact for NeurIPS 2023 paper - "Monitor-Guided Decoding of Code LMs with Static Analysis of Repository Context". `multilspy` is an LSP client library in Python intended to be used to bu...
[ICML 2024] A novel, efficient approach combining convolutional operations with adaptive spectral analysis as a foundation model for different time series tasks
Efficient Infinite Context Transformers with Infini-attention Pytorch Implementation + QwenMoE Implementation + Training Script + 1M context keypass retrieval
[Mamba-Survey-2024] Paper list for State-Space-Model/Mamba and its applications
[ICML 2024] LLMCompiler: An LLM Compiler for Parallel Function Calling
Official implementation for "iTransformer: Inverted Transformers Are Effective for Time Series Forecasting" (ICLR 2024 Spotlight), https://openreview.net/forum?id=JePfAI8fah
🤖 A PyTorch library of curated Transformer models and their composable components
"Retinexformer: One-stage Retinex-based Transformer for Low-light Image Enhancement" (ICCV 2023) & (NTIRE 2024 Challenge)
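The repositories above all build on the same core primitive: scaled dot-product attention, softmax(QK^T / sqrt(d_k)) V. As a quick illustration of the topic, here is a minimal NumPy sketch of that operation; it is not taken from any listed repository, and the function name and toy shapes are illustrative only.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for 2-D query/key/value matrices."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # query-key similarity, scaled
    scores -= scores.max(axis=-1, keepdims=True)  # shift for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the keys
    return weights @ V                            # attention-weighted sum of values

# Toy example: 3 tokens, model dimension 4
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4)
```

Each output row is a convex combination of the rows of V, which is why the result has the same shape as the queries.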