3 results found
⛷️ LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training (EMNLP 2024)
Created 2023-07-24 · 212 commits to main branch, last one 5 months ago
Official Repo for "Programming Every Example: Lifting Pre-training Data Quality Like Experts at Scale"
Created 2024-09-09 · 8 commits to main branch, last one 2 months ago
Llama-3-SynE: A Significantly Enhanced Version of Llama-3 with Advanced Scientific Reasoning and Chinese Language Capabilities | Continual pre-training to improve Llama-3's scientific reasoning and Chinese language abilities
Created 2024-07-24 · 11 commits to main branch, last one 4 days ago