Statistics for topic natural-language-processing
RepositoryStats tracks 579,129 GitHub repositories; 1,402 of these are tagged with the natural-language-processing topic. The most common primary language for repositories using this topic is Python (783). Other languages include Jupyter Notebook (212), JavaScript (38), C++ (26), HTML (23), Java (21), and TypeScript (18).
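As a rough cross-check, counts like these can be approximated with GitHub's public search API (the numbers drift daily, and unauthenticated requests are heavily rate-limited). The sketch below is illustrative only, not how RepositoryStats gathers its data:

# Illustrative sketch: query GitHub's search API for repositories tagged with
# the natural-language-processing topic and a given primary language.
import requests

resp = requests.get(
    "https://api.github.com/search/repositories",
    params={"q": "topic:natural-language-processing language:python", "per_page": 1},
    headers={"Accept": "application/vnd.github+json"},
    timeout=30,
)
resp.raise_for_status()
print("Python repos tagged natural-language-processing:", resp.json()["total_count"])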
Stargazers over time for topic natural-language-processing
Most starred repositories for topic natural-language-processing
Learn how to design, develop, deploy and iterate on production-grade ML applications.
🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
Interactive deep learning book with multi-framework code, math, and discussions. Adopted at 500 universities from 70 countries including Stanford, MIT, Harvard, and Cambridge.
Drench yourself in Deep Learning, Reinforcement Learning, Machine Learning, Computer Vision, and NLP by learning from these exciting lectures!!
The official implementation of the paper "Demystifying the Compression of Mixture-of-Experts Through a Unified Framework".
This repository contains an implementation of the LLaMA 2 (Large Language Model Meta AI) model, a Generative Pretrained Transformer (GPT) variant. The implementation focuses on the model architecture ...
Trending repositories for topic natural-language-processing
Code and data releases for the paper -- DelTA: An Online Document-Level Translation Agent Based on Multi-Level Memory
🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
Interactive deep learning book with multi-framework code, math, and discussions. Adopted at 500 universities from 70 countries including Stanford, MIT, Harvard, and Cambridge.
ML Nexus is an open-source collection of machine learning projects, covering topics like neural networks, computer vision, and NLP. Whether you're a beginner or expert, contribute, collaborate, and gr...
The official implementation of the paper "What Matters in Transformers? Not All Attention is Needed".
A community-driven collection of RAG (Retrieval-Augmented Generation) frameworks, projects, and resources. Contribute and explore the evolving RAG ecosystem.
ConceptVectors Benchmark and Code for the paper "Intrinsic Evaluation of Unlearning Using Parametric Knowledge Traces"
Fast and efficient unstructured data extraction. Written in Rust with bindings for many languages.
[ICML 2024] LLMCompiler: An LLM Compiler for Parallel Function Calling
Generalist and Lightweight Model for Named Entity Recognition (Extract any entity types from texts) @ NAACL 2024
Efficient implementations of state-of-the-art linear attention models in Pytorch and Triton
A compute framework for building Search, RAG, Recommendations and Analytics over complex (structured+unstructured) data, with ultra-modal vector embeddings.
🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
The official repo of Qwen (通义千问) chat & pretrained large language model proposed by Alibaba Cloud.
Open source libraries and APIs to build custom preprocessing pipelines for labeling, training, or production machine learning pipelines.
📺 Discover the latest machine learning / AI courses on YouTube.
Efficient implementations of state-of-the-art linear attention models in Pytorch and Triton
Awesome-llm-role-playing-with-persona: a curated list of resources for large language models for role-playing with assigned personas
[ICML 2024] LLMCompiler: An LLM Compiler for Parallel Function Calling
This repo contains evaluation code for the paper "MMMU: A Massive Multi-discipline Multimodal Understanding and Reasoning Benchmark for Expert AGI"