Trending repositories for topic bert
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
This repository contains demos I made with the Transformers library by HuggingFace.
:mag: LLM orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your ...
Hung-yi Lee's Deep Learning Tutorial (recommended by Prof. Hung-yi Lee 👍). PDF download: https://github.com/datawhalechina/leedl-tutorial/releases
Transformer models from BERT to GPT-4, environments from Hugging Face to OpenAI. Fine-tuning, training, and prompt engineering examples. A bonus section with ChatGPT, GPT-3.5-turbo, GPT-4, and DALL-E ...
Awesome Pretrained Chinese NLP Models: a collection of high-quality Chinese pre-trained models, large models, multimodal models, and large language models.
Leveraging BERT and c-TF-IDF to create easily interpretable topics.
👑 Easy-to-use and powerful NLP and LLM library with a 🤗 Awesome model zoo, supporting a wide range of NLP tasks from research to industrial applications, including 🗂 Text Classification, 🔍 Neural Sear...
BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.)
We want to create a repo to illustrate usage of Transformers in Chinese.
Transformer-related optimization, including BERT and GPT.
Pocket-Sized Multimodal AI for content understanding and generation across multilingual texts, images, and 🔜 video, up to 5x faster than OpenAI CLIP and LLaVA 🖼️ & 🖋️
Transformers 3rd Edition
A Unified Library for Parameter-Efficient and Modular Transfer Learning
💥 Fast State-of-the-Art Tokenizers optimized for Research and Production
Natural Language Processing Tutorial for Deep Learning Researchers
A complete overview of and insights into AI-text detection :seedling: using the powerful BERT (Bidirectional Encoder Representations from Transformers) to predict whether a text is AI-generated :sunflower: or Human-...
Chinese Mandarin Grapheme-to-Phoneme Converter: converts Chinese text to Zhuyin or Pinyin. (INTERSPEECH 2022)
[Survey] Masked Modeling for Self-supervised Representation Learning on Vision and Beyond (https://arxiv.org/abs/2401.00897)
Chinese translation of Natural Language Processing with Transformers, the most authoritative Transformers tutorial.
A neural named entity recognition and multi-type normalization tool for biomedical text mining
Happy Transformer makes it easy to fine-tune and perform inference with NLP Transformer models.
Awesome-LLM-Eval: a curated list of tools, datasets/benchmarks, demos, leaderboards, papers, docs, and models, mainly for evaluation of LLMs, aiming to explore the technical boundaries of generative AI.
Multi-label text classification: a multi-label classifier with BERT, seq2seq, and attention.
Introductory, advanced, and specialty deep learning courses, academic case studies, industrial practice cases, a deep learning knowledge encyclopedia, and an interview question bank: the courses, cases, and knowledge of deep learning and AI.
All the NLP you need, here. Currently contains PyTorch implementations of 15 NLP demos (much of the code is borrowed from other open-source projects; originally a personal playground, later open-sourced).
Highly commented implementations of Transformers in PyTorch
Code for the AAAI 2022 paper "Open Vocabulary Electroencephalography-To-Text Decoding and Zero-shot Sentiment Classification".
pytextclassifier is a toolkit for text classification, implementing LR, XGBoost, TextCNN, FastText, TextRNN, BERT, and other classification models; ready to use out of the box.
EMNLP 2023 Papers: Explore cutting-edge research from EMNLP 2023, the premier conference for advancing empirical methods in natural language processing. Stay updated on the latest in machine learning,...
LLM-PowerHouse: Unleash LLMs' potential through curated tutorials, best practices, and ready-to-use code for custom training and inferencing.
Pre-Training with Whole Word Masking for Chinese BERT (the Chinese BERT-wwm series of models).
🧠💬 Articles I wrote about machine learning, archived from MachineCurve.com.
Research and Materials on Hardware implementation of Transformer Model
Fast and memory-efficient library for WordPiece tokenization as it is used by BERT.
Large language modeling applied to T-cell receptor (TCR) sequences.
Technical and sentiment analysis to predict the stock market with machine learning models, based on historical time-series data and news-article sentiment collected via APIs and web scraping.
BERT classification model for processing texts longer than 512 tokens. Text is first divided into smaller chunks and after feeding them to BERT, intermediate results are pooled. The implementation all...
Easy and lightning fast training of 🤗 Transformers on Habana Gaudi processor (HPU)
A training and inference framework for open NER and RE models! A unified framework for training and inference of information-extraction models, including a rich set of open-source SOTA models.
Chinese-LLaMA 1 & 2 and Chinese-Falcon base models; the ChatFlow Chinese dialogue model; the Chinese OpenLLaMA model; NLP pre-training / instruction fine-tuning datasets.
🤖 A PyTorch library of curated Transformer models and their composable components
[ACL 2023] Solving Math Word Problems via Cooperative Reasoning induced Language Models
BERT4ETH: A Pre-trained Transformer for Ethereum Fraud Detection (WWW 2023)
[IEEE BigData 2022, 5th Workshop on Big Data for CyberSecurity (BigCyber-2022)] Dark patterns in e-commerce: a dataset and its baseline evaluations
We leverage 14 datasets as OOD test data and conduct evaluations on 8 NLU tasks over 21 widely used models. Our findings confirm that the OOD accuracy in NLP tasks needs to be paid more attention t...
Optimized BERT transformer inference on NVIDIA GPUs. https://arxiv.org/abs/2210.03052
A research repository for medical-domain GPT-2 fine-tuning, optimization, and model lightweighting (compared with GPT-4).
A Chinese NLP library based on BERT for sentiment analysis and general-purpose Chinese word segmentation.
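Several entries in the list above (🤗 Tokenizers, the fast WordPiece library, Chinese BERT-wwm) revolve around BERT's WordPiece tokenization. As a rough illustration of the underlying idea, here is a minimal sketch of greedy longest-match-first WordPiece segmentation; the function name and the tiny vocabulary are hypothetical, not taken from any of the listed libraries, which use far larger vocabularies and faster matching:

```python
def wordpiece_tokenize(word, vocab, unk_token="[UNK]"):
    """Split a single word into WordPiece subwords (greedy longest-match-first)."""
    tokens = []
    start = 0
    while start < len(word):
        end = len(word)
        cur = None
        # Take the longest vocabulary match starting at `start`;
        # non-initial pieces carry the "##" continuation prefix.
        while start < end:
            piece = word[start:end]
            if start > 0:
                piece = "##" + piece
            if piece in vocab:
                cur = piece
                break
            end -= 1
        if cur is None:
            return [unk_token]  # no piece matched: the whole word is unknown
        tokens.append(cur)
        start = end
    return tokens

# Hypothetical toy vocabulary (real BERT vocabularies have ~30k entries).
vocab = {"un", "##aff", "##able", "play", "##ing", "the"}
print(wordpiece_tokenize("unaffable", vocab))  # ['un', '##aff', '##able']
print(wordpiece_tokenize("playing", vocab))    # ['play', '##ing']
print(wordpiece_tokenize("xyz", vocab))        # ['[UNK]']
```

The quadratic inner loop is the naive formulation; the "fast and memory-efficient" WordPiece libraries above replace it with trie-based linear-time matching.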