3 results found
USP: Unified (a.k.a. Hybrid, 2D) Sequence Parallel Attention for Long Context Transformers Model Training and Inference
Created 2024-03-27
216 commits to main branch, last one 3 days ago
InternEvo is an open-sourced lightweight training framework that aims to support model pre-training without the need for extensive dependencies.
Created 2024-01-16
484 commits to develop branch, last one a day ago
The official CLIP training codebase of Inf-CL: "Breaking the Memory Barrier: Near Infinite Batch Size Scaling for Contrastive Loss". A highly memory-efficient CLIP training scheme.
Created 2024-10-16
29 commits to main branch, last one about a month ago
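For context on the memory barrier that Inf-CL targets, the sketch below shows the *vanilla* CLIP contrastive (InfoNCE) loss, not the Inf-CL implementation itself. The function name and variables are illustrative; the point is that the naive loss materializes a full B x B similarity matrix, so memory grows quadratically with the batch size.

```python
# Minimal sketch of the vanilla CLIP contrastive loss (NOT Inf-CL's method).
# It materializes a full B x B logits matrix, which is the memory bottleneck
# that near-infinite batch size scaling schemes aim to avoid.
import torch
import torch.nn.functional as F

def vanilla_clip_loss(image_emb: torch.Tensor,
                      text_emb: torch.Tensor,
                      temperature: float = 0.07) -> torch.Tensor:
    # image_emb, text_emb: (B, D) L2-normalized embeddings
    logits = image_emb @ text_emb.t() / temperature        # (B, B) matrix -> O(B^2) memory
    targets = torch.arange(logits.size(0), device=logits.device)
    loss_i = F.cross_entropy(logits, targets)              # image -> text direction
    loss_t = F.cross_entropy(logits.t(), targets)          # text -> image direction
    return (loss_i + loss_t) / 2

# At B = 65_536 the logits matrix alone holds ~4.3e9 fp32 values (~17 GB),
# which is why naively scaling the contrastive batch size hits a memory wall.
img = F.normalize(torch.randn(8, 512), dim=-1)
txt = F.normalize(torch.randn(8, 512), dim=-1)
print(vanilla_clip_loss(img, txt))
```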