Statistics for topic pretrained-models
RepositoryStats tracks 584,796 GitHub repositories; 206 of these are tagged with the pretrained-models topic. The most common primary language for repositories using this topic is Python (157), followed by Jupyter Notebook (25).
Stargazers over time for topic pretrained-models
Most starred repositories for topic pretrained-models
Trending repositories for topic pretrained-models
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
The official repo of Qwen (通义千问) chat & pretrained large language model proposed by Alibaba Cloud.
The largest collection of PyTorch image encoders / backbones. Including train, eval, inference, export scripts, and pretrained weights -- ResNet, ResNeXT, EfficientNet, NFNet, Vision Transformer (ViT)...
Chronos: Pretrained (Language) Models for Probabilistic Time Series Forecasting
The official code for "TEMPO: Prompt-based Generative Pre-trained Transformer for Time Series Forecasting (ICLR 2024)". TEMPO is one of the very first open source Time Series Foundation Models for fo...
Music Audio Representation Benchmark for Universal Evaluation
[CVPR 2024 Extension] 160K volumes (42M slices) datasets, new segmentation datasets, 31M-1.2B pre-trained models, various pre-training recipes, 50+ downstream tasks implementation
Awesome Chinese LLM: A curated list of Chinese Large Language Model 中文大语言模型数据集和模型资料汇总
Official release of InternLM2.5 base and chat models. 1M context support
A PyTorch-based Python library with UNet architecture and multiple backbones for Image Semantic Segmentation.
💥 Command line tool for automatic liver parenchyma and liver vessel segmentation in CT using a pretrained deep learning model
Neural building blocks for speaker diarization: speech activity detection, speaker change detection, overlapped speech detection, speaker embedding
A generalized framework for subspace tuning methods in parameter efficient fine-tuning.
Superduper: Build end-to-end AI applications and agent workflows on your existing data infrastructure and preferred tools - without migrating your data.
a state-of-the-art-level open visual language model | multimodal pre-trained model
GPT4V-level open-source multi-modal model based on Llama3-8B