Statistics for topic pretrained-models
RepositoryStats tracks 564,918 GitHub repositories; 203 of them are tagged with the pretrained-models topic. The most common primary language for repositories using this topic is Python (154), followed by Jupyter Notebook (26).
Stargazers over time for topic pretrained-models
Most starred and trending repositories for topic pretrained-models
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
The official repo of Qwen (通义千问) chat & pretrained large language model proposed by Alibaba Cloud.
The largest collection of PyTorch image encoders / backbones. Including train, eval, inference, export scripts, and pretrained weights -- ResNet, ResNeXT, EfficientNet, NFNet, Vision Transformer (ViT)...
Chinese version of CLIP which achieves Chinese cross-modal retrieval and representation generation.
The official code for "TEMPO: Prompt-based Generative Pre-trained Transformer for Time Series Forecasting (ICLR 2024)". TEMPO is one of the very first open source Time Series Foundation Models for fo...
Fight detection from surveillance cameras by fine-tuning a pretrained PyTorch model.
A generalized framework for subspace tuning methods in parameter-efficient fine-tuning.
The official implementation of Achieving Cross Modal Generalization with Multimodal Unified Representation (NeurIPS '23)
The official code of "CSTA: CNN-based Spatiotemporal Attention for Video Summarization"
A comprehensive resource hub compiling all LLM papers accepted at the International Conference on Learning Representations (ICLR) in 2024.
Chronos: Pretrained (Language) Models for Probabilistic Time Series Forecasting
A state-of-the-art open visual language model | multimodal pretrained model
Superduper: Integrate AI models and machine learning workflows with your database to implement custom AI applications, without moving your data. Including streaming inference, scalable model hosting, ...
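Several of the repositories above revolve around the same basic workflow: take a pretrained backbone, freeze its weights, and train only a small task-specific head. A minimal sketch of that pattern in PyTorch (using a toy stand-in backbone rather than real pretrained weights, which would normally come from a library such as timm or transformers):

```python
import torch
from torch import nn

# Toy stand-in for a pretrained backbone (in practice: a ResNet, ViT, etc.
# loaded with real weights). Shapes here are illustrative assumptions.
backbone = nn.Sequential(nn.Flatten(), nn.Linear(16, 8), nn.ReLU())

# Freeze the "pretrained" parameters so gradients are not computed for them.
for p in backbone.parameters():
    p.requires_grad = False

# New task-specific head: the only part that gets trained.
head = nn.Linear(8, 2)
model = nn.Sequential(backbone, head)

opt = torch.optim.SGD(head.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(32, 16)          # fake batch of 32 examples
y = torch.randint(0, 2, (32,))   # fake binary labels

for _ in range(5):               # a few fine-tuning steps on the head only
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()

# Only the head's parameters remain trainable: 8*2 weights + 2 biases = 18.
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(trainable)  # → 18
```

The same freeze-then-train-a-head idea underlies heavier variants like the parameter-efficient and subspace-tuning methods listed above, which further restrict what is trained.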