Statistics for topic natural-language-processing
RepositoryStats tracks 662,837 GitHub repositories; 1,507 of these are tagged with the natural-language-processing topic. The most common primary language for repositories using this topic is Python (841). Other languages include Jupyter Notebook (234), JavaScript (40), C++ (26), HTML (25), Java (21), TypeScript (21), Go (11), and C# (11).
Stargazers over time for topic natural-language-processing
Most starred repositories for topic natural-language-processing
Trending repositories for topic natural-language-processing
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models in text, vision, audio, and multimodal models, for both inference and training.
🚀 Efficient implementations of state-of-the-art linear attention models in Torch and Triton
🐫 CAMEL: The first and the best multi-agent framework. Finding the Scaling Law of Agents. https://www.camel-ai.org
A Minecraft MCP Server powered by the Mineflayer API. It lets AI assistants control a Minecraft character in real time to build structures, explore the world, and interact with the game environment.
A community-driven collection of RAG (Retrieval-Augmented Generation) frameworks, projects, and resources. Contribute and explore the evolving RAG ecosystem.
The official repository of the dots.llm1 base and instruct models proposed by rednote-hilab.
Learn how to design, develop, deploy and iterate on production-grade ML applications.
ACL 2025: SoftCoT: Soft Chain-of-Thought for Efficient Reasoning with LLMs; preprint: SoftCoT++: Test-Time Scaling with Soft Chain-of-Thought Reasoning.
[LREC-COLING 2024 (Oral), Interspeech 2024 (Oral), NAACL 2025, ACL 2025] A Series of Multilingual Multitask Medical Speech Processing
Cutting-edge AI solution for Home Assistant. Multi-LLM provider support to transform your smart home experience with intelligent, adaptive automation.
🏝️ OASIS: Open Agent Social Interaction Simulations with One Million Agents.
Synthetic data curation for post-training and structured data extraction
This series will take you on a journey from the fundamentals of NLP and Computer Vision to the cutting edge of Vision-Language Models.
The official repo of the Qwen (通义千问) chat and pretrained large language models proposed by Alibaba Cloud.
The official implementation of the paper "What Matters in Transformers? Not All Attention is Needed".
A generalized framework for subspace tuning methods in parameter-efficient fine-tuning.