Trending repositories for topic bert
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
This repository contains demos I made with the Transformers library by HuggingFace.
:mag: AI orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your d...
"Hung-yi Lee's Deep Learning Tutorial" (recommended by Prof. Hung-yi Lee 👍; the "Apple Book" 🍎). PDF download: https://github.com/datawhalechina/leedl-tutorial/releases
Transformers 3rd Edition
💥 Fast State-of-the-Art Tokenizers optimized for Research and Production
BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.)
A Heterogeneous Benchmark for Information Retrieval. Easy to use, evaluate your models across 15+ diverse IR datasets.
Leveraging BERT and c-TF-IDF to create easily interpretable topics.
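BERTopic's class-based TF-IDF can be sketched in plain Python. This is a didactic approximation of the published c-TF-IDF formula (term frequency per topic, scaled by a class-level IDF), not the library's actual code; the function name and input shape are assumptions:

```python
import math
from collections import Counter

def c_tf_idf(classes):
    """classes: dict mapping topic/class name -> list of words (all documents
    in the class concatenated). Returns dict: class -> {term: weight}."""
    counts = {c: Counter(words) for c, words in classes.items()}
    avg_words = sum(len(w) for w in classes.values()) / len(classes)
    # frequency of each term across all classes
    total = Counter()
    for tf in counts.values():
        total.update(tf)
    weights = {}
    for c, tf in counts.items():
        n = sum(tf.values())
        weights[c] = {t: (f / n) * math.log(1 + avg_words / total[t])
                      for t, f in tf.items()}
    return weights
```

Terms that are frequent inside one class but rare across the others get the highest weights, which is what makes the resulting topic descriptions interpretable.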
A repository illustrating usage of Transformers in Chinese.
👑 Easy-to-use and powerful NLP and LLM library with 🤗 Awesome model zoo, supporting a wide range of NLP tasks from research to industrial applications, including 🗂 Text Classification, 🔍 Neural Sear...
Awesome Pretrained Chinese NLP Models: a high-quality collection of Chinese pre-trained models, large models, multimodal models, and large language models.
Pocket-Sized Multimodal AI for content understanding and generation across multilingual texts, images, and 🔜 video, up to 5x faster than OpenAI CLIP and LLaVA 🖼️ & 🖋️
Papers in the natural language processing field (with reading notes), model reproductions, and data processing (code in both TensorFlow and PyTorch versions).
[CVPR 2021] Official PyTorch implementation for Transformer Interpretability Beyond Attention Visualization, a novel method to visualize classifications by Transformer based networks.
Rust native ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2,...)
Transformer-related optimizations, including BERT and GPT.
Generalist and Lightweight Model for Relation Extraction (Extract any relationship types from text)
EMNLP 2023 Papers: Explore cutting-edge research from EMNLP 2023, the premier conference for advancing empirical methods in natural language processing. Stay updated on the latest in machine learning,...
Technical and sentiment analysis to predict the stock market with machine learning models based on historical time series data and news article sentiment collected using APIs and web scraping.
Unattended Lightweight Text Classifiers with LLM Embeddings
Distillation of KoBERT from SKTBrain (Lightweight KoBERT)
The first large-scale natural language processing benchmark for the Indonesian language. We provide multiple downstream tasks, pre-trained IndoBERT models, and starter code! (AACL-IJCNLP 2020)
An NLP system for generating reading comprehension questions
Code and source for the paper "How to Fine-Tune BERT for Text Classification?"
PhoBERT: Pre-trained language models for Vietnamese (EMNLP-2020 Findings)
Transformer models from BERT to GPT-4, environments from Hugging Face to OpenAI. Fine-tuning, training, and prompt engineering examples. A bonus section with ChatGPT, GPT-3.5-turbo, GPT-4, and DALL-E ...
Natural Language Processing Tutorial for Deep Learning Researchers
Pre-Training with Whole Word Masking for Chinese BERT (the Chinese BERT-wwm model series).
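Whole word masking differs from BERT's original per-piece masking in that all WordPieces of a word are masked together rather than independently. A minimal sketch, assuming `##` marks continuation pieces (the helper name and probabilities are illustrative):

```python
import random

def whole_word_mask(pieces, mask_prob=0.15, mask_token="[MASK]", seed=0):
    """Group WordPieces into whole words ('##' marks a continuation piece),
    then mask entire words at once - the idea behind BERT-wwm."""
    rng = random.Random(seed)
    # group piece indices into words
    words, cur = [], []
    for i, p in enumerate(pieces):
        if p.startswith("##") and cur:
            cur.append(i)
        else:
            if cur:
                words.append(cur)
            cur = [i]
    if cur:
        words.append(cur)
    out = list(pieces)
    for word in words:
        if rng.random() < mask_prob:
            for i in word:
                out[i] = mask_token
    return out
```

The invariant is that either every piece of a word is masked or none is, which forces the model to predict whole words from context instead of completing partially visible ones.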
🧠💬 Articles I wrote about machine learning, archived from MachineCurve.com.
A complete overview of and insights into AI-text detection :seedling: using the powerful BERT (Bidirectional Encoder Representations from Transformers) to predict if a text is AI-generated :sunflower: or Human-...
Retrieve, Read and LinK: Fast and Accurate Entity Linking and Relation Extraction on an Academic Budget (ACL 2024)
Using BERT/RoBERTa + LSTM/GRU/BiLSTM/TextCNN for sentiment analysis on the IMDb dataset.
Entity-relation extraction using the Baidu competition dataset. A PyTorch implementation of MultiHeadJointEntityRelationExtraction covering BERT, ALBERT, and GRU, with adversarial training added; the model is deployed with Flask and a Neo4j graph database.
[Survey] Masked Modeling for Self-supervised Representation Learning on Vision and Beyond (https://arxiv.org/abs/2401.00897)
Easy and lightning fast training of 🤗 Transformers on Habana Gaudi processor (HPU)
CodeBERTScore: an automatic metric for code generation, based on BERTScore
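BERTScore-style metrics match each token to its most similar counterpart and average the similarities into precision, recall, and F1. A sketch under the assumption that `sim(a, b)` stands in for cosine similarity of contextual embeddings (the function name is hypothetical, not this repository's API):

```python
def bert_score_f1(cand, ref, sim):
    """Greedy-matching F1 in the style of BERTScore: each candidate token is
    matched to its most similar reference token (precision) and vice versa
    (recall); sim(a, b) stands in for embedding cosine similarity."""
    p = sum(max(sim(c, r) for r in ref) for c in cand) / len(cand)
    r = sum(max(sim(c, r) for c in cand) for r in ref) / len(ref)
    return 2 * p * r / (p + r)
```

Because matching happens in embedding space rather than on surface strings, semantically equivalent but differently spelled code or text can still score highly.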
The Official Repo for "Quick Start Guide to Large Language Models"
Introductory, advanced, and specialty deep learning courses, academic case studies, industry practice cases, a deep learning knowledge encyclopedia, and an interview question bank: the courses, cases, and knowledge of deep learning and AI.
[ICML'24 Oral] APT: Adaptive Pruning and Tuning Pretrained Language Models for Efficient Training and Inference
Use Large Language Models like OpenAI's GPT-3.5 for data annotation and model enhancement. This framework combines human expertise with LLMs, employs Iterative Active Learning for continuous improveme...
Preprint: Asymmetry in Low-Rank Adapters of Foundation Models
AI-Generated Text Detection: A BERT-powered solution for accurately identifying AI-generated text. Seamlessly integrated, highly accurate, and user-friendly.🚀
Research and Materials on Hardware implementation of Transformer Model
Basic implementation of the life2vec model with dummy data.
Fast and memory-efficient library for WordPiece tokenization as it is used by BERT.
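WordPiece tokenization itself is greedy longest-match-first: repeatedly take the longest vocabulary entry that prefixes the remaining word, with `##` marking continuation pieces. A didactic sketch, not this library's optimized implementation (the toy vocabulary in the usage example is an assumption):

```python
def wordpiece_tokenize(word, vocab, unk="[UNK]"):
    """Greedy longest-match-first WordPiece: non-initial pieces carry a
    '##' continuation prefix; words with no valid segmentation map to unk."""
    pieces, start = [], 0
    while start < len(word):
        end = len(word)
        cur = None
        while start < end:
            sub = word[start:end]
            if start > 0:
                sub = "##" + sub
            if sub in vocab:
                cur = sub  # longest piece in the vocab wins
                break
            end -= 1
        if cur is None:
            return [unk]
        pieces.append(cur)
        start = end
    return pieces
```

For example, with the vocabulary `{"un", "##aff", "##able"}`, the word `unaffable` splits into `["un", "##aff", "##able"]`.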
Dataset associated with "BOLD: Dataset and Metrics for Measuring Biases in Open-Ended Language Generation" paper
A BERT Chinese text classification model implemented in PyTorch.
Developed BERT, LSTM, TFIDF, and Word2Vec models to analyze social media data, extracting service aspects and sentiments from a custom dataset. Provided actionable insights to telecom operators for cu...
A Unified Library for Parameter-Efficient and Modular Transfer Learning
LLM-PowerHouse: Unleash LLMs' potential through curated tutorials, best practices, and ready-to-use code for custom training and inference.
📚 Text Classification with LoRA (Low-Rank Adaptation) of Language Models - Efficiently fine-tune large language models for text classification tasks using the Stanford Sentiment Treebank (SST-2) data...
Text similarity and semantic text embeddings: text-similarity, sentence-similarity, BERT, SimCSE, BERT-Whitening, Sentence-BERT, PromCSE, SBERT.
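Most of the sentence-embedding methods listed above score a pair of sentences by the cosine similarity of their vectors, which is a few lines of plain Python:

```python
import math

def cosine(u, v):
    """Cosine similarity: the standard score for comparing sentence
    embeddings (1.0 = same direction, 0.0 = orthogonal)."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)
```

In practice the vectors come from a model such as Sentence-BERT or SimCSE; the scoring step itself is model-agnostic.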
[IEEE BigData 2022, 5th Workshop on Big Data for CyberSecurity (BigCyber-2022)] Dark patterns in e-commerce: a dataset and its baseline evaluations
Awesome-LLM-Eval: a curated list of tools, datasets/benchmarks, demos, leaderboards, papers, docs, and models, mainly for evaluation of LLMs, aiming to explore the technical frontier of generative AI.
Vocabulary Trimming (VT) is a model compression technique, which reduces a multilingual LM vocabulary to a target language by deleting irrelevant tokens from its vocabulary. This repository contains a...
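The core of vocabulary trimming is to keep only special tokens and tokens observed in a target-language corpus, then reassign contiguous ids. A minimal sketch (the function signature and special-token list are assumptions, not this repository's API):

```python
def trim_vocab(vocab, corpus_tokens,
               keep=("[PAD]", "[UNK]", "[CLS]", "[SEP]", "[MASK]")):
    """vocab: dict token -> id. Keep special tokens plus tokens seen in the
    target-language corpus, preserving the original id order, and reassign
    contiguous ids - the core idea of vocabulary trimming."""
    used = set(keep) | set(corpus_tokens)
    kept = [t for t, _ in sorted(vocab.items(), key=lambda kv: kv[1])
            if t in used]
    return {t: i for i, t in enumerate(kept)}
```

A real implementation would also slice the corresponding rows out of the embedding matrix so the model shrinks along with the vocabulary.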
BERT4ETH: A Pre-trained Transformer for Ethereum Fraud Detection (WWW23)
Highly commented implementations of Transformers in PyTorch
code for AAAI2022 paper "Open Vocabulary Electroencephalography-To-Text Decoding and Zero-shot Sentiment Classification"
BERT classification model for processing texts longer than 512 tokens. Text is first divided into smaller chunks and after feeding them to BERT, intermediate results are pooled. The implementation all...
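The chunk-then-pool strategy can be sketched as overlapping windows over the token sequence, each encoded separately and then mean-pooled. Here `encode` is a stand-in for the BERT forward pass, and the window/stride sizes are illustrative choices:

```python
def chunk_and_pool(token_ids, encode, max_len=512, stride=256):
    """Split a long token sequence into overlapping windows that each fit
    BERT's 512-token limit, encode each window, and mean-pool the resulting
    vectors into one document representation."""
    stop = max(1, len(token_ids) - max_len + stride)
    chunks = [token_ids[i:i + max_len] for i in range(0, stop, stride)]
    vecs = [encode(c) for c in chunks]  # one vector per chunk
    dim = len(vecs[0])
    return [sum(v[d] for v in vecs) / len(vecs) for d in range(dim)]
```

The overlap (`max_len - stride` tokens) keeps context that would otherwise be cut at chunk boundaries; max-pooling or attention over chunk vectors are common alternatives to the mean.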