[NeurIPS 2023 Main Track] This is the repository for the paper titled "Don’t Stop Pretraining? Make Prompt-based Fine-tuning Powerful Learner"
Repository for My HuggingFace Natural Language Processing Projects
PyTorch implementation for "Training and Inference on Any-Order Autoregressive Models the Right Way", NeurIPS 2022 Oral, TPM 2023 Best Paper Honorable Mention
Implementation of Transformer Encoders / Masked Language Modeling Objective (a sketch of the MLM objective appears after this list)
11th-place solution to the NeurIPS 2024 "Predict New Medicines with BELKA" competition on Kaggle: https://www.kaggle.com/competitions/leash-BELKA
Measures biases in masked language models for PyTorch Transformers, with support for multiple social biases and evaluation measures.
Continue T5 MLM pre-training on verbalized ConceptNet and fine-tune for commonsense question-answering
Transformer for Automatic Speech Recognition
Code accompanying a WWW'22 publication.
Pre-training a Transformer from scratch.
Evaluation of zero-shot classification models on Turkish datasets.
Infusing external knowledge from INCIDecoder into BERT for chemical mapping.
Use BERTRAM to get single-token embeddings for idioms on the MAGPIE dataset.
🗨️ This repository contains a collection of notebooks and resources for various NLP tasks using different architectures and frameworks.
A Pre-trained Language Model for Semantic Similarity Measurement of Persian Informal Short Texts
Course materials for the Machine Learning for NLP course taught by Sameer Singh for the Cognitive Science summer school 2022.
Comparing Data-Driven Techniques for Enhancing Negation Sensitivity in MLM-Based Language Models
Customized Pretraining for NLG Tasks
Source code and materials for the Advanced Spelling Error Correction project.
Masked Language Modeling demo using XLM-RoBERTa + Gradio/FastAPI (a demo sketch follows below)
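Several entries above (the Transformer-encoder implementation, the from-scratch pre-training repository) center on the masked language modeling objective. The following is a minimal PyTorch sketch of that objective, assuming BERT-style hyperparameters (15% of tokens selected, with the 80/10/10 mask/random/keep split); the function names and the `-100` ignore-index convention are illustrative, not taken from any listed repository.

```python
import torch
import torch.nn.functional as F

def mask_tokens(input_ids, mask_token_id, vocab_size, mlm_prob=0.15):
    """Corrupt a batch of token ids for MLM and build matching labels."""
    inputs = input_ids.clone()
    labels = input_ids.clone()
    # Choose the positions the model must predict.
    masked = torch.bernoulli(torch.full(inputs.shape, mlm_prob)).bool()
    labels[~masked] = -100  # positions outside the mask are ignored by the loss
    # Of the chosen positions: 80% -> [MASK], 10% -> random token, 10% -> unchanged.
    to_mask = torch.bernoulli(torch.full(inputs.shape, 0.8)).bool() & masked
    inputs[to_mask] = mask_token_id
    to_random = torch.bernoulli(torch.full(inputs.shape, 0.5)).bool() & masked & ~to_mask
    inputs[to_random] = torch.randint(vocab_size, inputs.shape)[to_random]
    return inputs, labels

def mlm_loss(logits, labels):
    """Cross-entropy over masked positions only (labels of -100 are skipped)."""
    return F.cross_entropy(logits.view(-1, logits.size(-1)), labels.view(-1),
                           ignore_index=-100)
```

With any encoder that returns per-token logits, a training step is then `loss = mlm_loss(model(inputs), labels)` followed by the usual backward pass and optimizer step.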
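The last list entry pairs a fill-mask model with a web UI. Below is a minimal sketch of such a demo, assuming the HuggingFace `transformers` and `gradio` packages; the checkpoint (`xlm-roberta-base`) and interface layout are assumptions for illustration, not that repository's actual code.

```python
import gradio as gr
from transformers import pipeline

# XLM-RoBERTa's mask token is "<mask>"; checkpoint choice is an assumption.
fill = pipeline("fill-mask", model="xlm-roberta-base")

def predict(text: str):
    # Map each candidate token to its probability for the single <mask> in `text`.
    return {p["token_str"]: float(p["score"]) for p in fill(text)}

demo = gr.Interface(fn=predict, inputs="text", outputs="label",
                    examples=[["Paris is the <mask> of France."]])
demo.launch()
```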