Statistics for topic transformer
RepositoryStats tracks 638,604 GitHub repositories; 1,081 of these are tagged with the transformer topic. The most common primary language for repositories using this topic is Python (788). Other languages include Jupyter Notebook (139), TypeScript (19), and C++ (17).
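A language breakdown like the one above can be derived from repository metadata; a minimal sketch, assuming each record carries a `language` field as in the GitHub REST API's repository objects (the sample records below are hypothetical, not real API output):

```python
from collections import Counter

def language_breakdown(repos):
    """Count primary languages across repository records.

    Assumes each record is a dict with a "language" field, as in the
    GitHub REST API; repos with no detected language (None) are skipped.
    """
    return Counter(r["language"] for r in repos if r.get("language"))

# Hypothetical sample records for illustration only:
sample = [
    {"name": "vllm", "language": "Python"},
    {"name": "transformers", "language": "Python"},
    {"name": "some-ui", "language": "TypeScript"},
    {"name": "docs-only", "language": None},
]

print(language_breakdown(sample).most_common())
# → [('Python', 2), ('TypeScript', 1)]
```

`Counter.most_common()` sorts by count descending, which gives the "most common primary language first" ordering used in the statistics above.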
Stargazers over time for topic transformer
Most starred repositories for topic transformer
A high-throughput and memory-efficient inference and serving engine for LLMs
🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
Implement a ChatGPT-like LLM in PyTorch from scratch, step by step
SGLang is a fast serving framework for large language models and vision language models.
Train a language model to chat like you using your personal conversations from WhatsApp, Telegram, Signal, or other platforms.
How Do LLMs Acquire New Knowledge? A Knowledge Circuits Perspective on Continual Pre-Training
Space Group Informed Transformer for Crystalline Materials Generation
DarkGPT Chat Explorer is an interactive web application that allows users to engage in conversations with various GPT (Generative Pre-trained Transformer) models in real-time. This repository contains...
[ECCV 2024 Oral] Official implementation of the paper "Relation DETR: Exploring Explicit Position Relation Prior for Object Detection"
Trending repositories for topic transformer
This is the official repository for the paper TLOB: A Novel Transformer Model with Dual Attention for Stock Price Trend Prediction with Limit Order Book Data.
Community maintained hardware plugin for vLLM on Ascend
🧑‍🏫 60+ Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, sophia, ...), ga...
🏄 [ICLR 2025] OVTR: End-to-End Open-Vocabulary Multiple Object Tracking with Transformer
A collection of tricks and tools to speed up transformer models
🔍 An LLM-based Multi-agent Framework of Web Search Engine (like Perplexity.ai Pro and SearchGPT)
Open-source industrial-grade ASR models supporting Mandarin, Chinese dialects and English, achieving a new SOTA on public Mandarin ASR benchmarks, while also offering outstanding singing lyrics recogn...
[CVPR 2025] Video Depth Anything: Consistent Depth Estimation for Super-Long Videos
Bringing together outstanding artificial intelligence (AI) open source projects from around the world.
Official code, datasets and checkpoints for "Timer: Generative Pre-trained Transformers Are Large Time Series Models" (ICML 2024)
This repository offers a collection of recent time series research papers, including forecasting, anomaly detection and so on, with links to code and resources.