SimCSE: Simple Contrastive Learning of Sentence Embeddings • Paper 2104.08821 • Published Apr 18, 2021
RoBERTa: A Robustly Optimized BERT Pretraining Approach • Paper 1907.11692 • Published Jul 26, 2019
```python
# Load model directly. Note: RobertaForCL is defined in the SimCSE codebase,
# not in transformers; loading the encoder with AutoModel works for inference.
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("gomgomcode/material_patent_roberta_simcse")
model = AutoModel.from_pretrained("gomgomcode/material_patent_roberta_simcse")
```
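A minimal usage sketch for comparing two sentences with the loaded encoder. Pooling the `[CLS]` hidden state follows the usual SimCSE inference convention, but this is an assumption about this particular checkpoint; the example sentences are hypothetical.

```python
import torch
from transformers import AutoTokenizer, AutoModel

name = "gomgomcode/material_patent_roberta_simcse"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModel.from_pretrained(name)
model.eval()

def embed(sentences):
    # Tokenize a batch and take the [CLS] token's hidden state as the embedding
    # (assumed pooling strategy for this SimCSE-tuned checkpoint).
    batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        out = model(**batch)
    return out.last_hidden_state[:, 0]

a, b = embed(["A lithium-ion cathode material.", "A battery electrode compound."])
similarity = torch.nn.functional.cosine_similarity(a, b, dim=0)
print(float(similarity))
```

Higher cosine similarity indicates closer sentence meaning under the learned embedding space.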