cross-encoder-RoBERTa-infoNCE

Paper · All Models · GitHub

This model is a cross-encoder based on FacebookAI/roberta-base. It was trained on MS MARCO with the InfoNCE loss, as part of a reproducibility paper on training cross-encoders: "Reproducing and Comparing Distillation Techniques for Cross-Encoders". See the paper for more details.


Model Description

This model is intended for re-ranking the top results returned by a first-stage retrieval system (such as BM25, a bi-encoder, or SPLADE).

  • Training Data: MS MARCO Passage
  • Language: English
  • Loss: InfoNCE

Training can easily be reproduced using the associated repository. The exact training configuration used for this model is also detailed in config.yaml.
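
For reference, InfoNCE here is the standard softmax cross-entropy over one positive passage and a set of negatives; the notation below is ours, not the paper's, so see the paper for the exact sampling setup:

$$\mathcal{L}_{\text{InfoNCE}} = -\log \frac{\exp\!\big(s(q, d^{+})\big)}{\exp\!\big(s(q, d^{+})\big) + \sum_{d^{-} \in \mathcal{N}_q} \exp\!\big(s(q, d^{-})\big)}$$

where $s(q, d)$ is the relevance logit the cross-encoder assigns to the pair $(q, d)$, $d^{+}$ is a relevant passage for query $q$, and $\mathcal{N}_q$ is the set of negative passages sampled for $q$.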

Usage

Quick Start:

from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

tokenizer = AutoTokenizer.from_pretrained("xpmir/cross-encoder-RoBERTa-infoNCE")
model = AutoModelForSequenceClassification.from_pretrained("xpmir/cross-encoder-RoBERTa-infoNCE")

# Tokenize the (query, passage) pair.
features = tokenizer(
    "What is experimaestro ?",
    "Experimaestro is a powerful framework for ML experiments management...",
    padding=True,
    truncation=True,
    return_tensors="pt",
)

# Score the pair; a higher logit means the passage is more relevant to the query.
model.eval()
with torch.no_grad():
    scores = model(**features).logits
    print(scores)
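
For the intended re-ranking use case, the same model can score several candidate passages against one query in a single batch and sort them by score. A minimal sketch, continuing from the Quick Start above (the query and passages are made-up examples, and we assume the classification head outputs a single relevance logit per pair):

# Re-rank a small set of candidate passages for one query.
query = "What is experimaestro ?"
passages = [
    "Experimaestro is a powerful framework for ML experiments management...",
    "RoBERTa is a robustly optimized BERT pretraining approach.",
    "MS MARCO is a large-scale passage ranking dataset.",
]

# Build one (query, passage) pair per candidate and tokenize them as a batch.
features = tokenizer(
    [query] * len(passages),
    passages,
    padding=True,
    truncation=True,
    return_tensors="pt",
)

model.eval()
with torch.no_grad():
    # Assumes num_labels=1: one relevance logit per pair.
    scores = model(**features).logits.squeeze(-1)

# Print passages from most to least relevant.
for rank, idx in enumerate(scores.argsort(descending=True).tolist(), start=1):
    print(f"{rank}. ({scores[idx].item():.2f}) {passages[idx]}")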

Evaluations

We provide evaluations of this cross-encoder re-ranking the top 1000 documents retrieved by naver/splade-v3-distilbert.

All reported values are percentages (metric × 100).

dataset            RR@10   nDCG@10
msmarco_dev        38.91   45.72
trec2019           95.35   73.74
trec2020           93.21   72.00
fever              77.82   78.27
arguana            21.78   32.36
climate_fever      26.02   19.42
dbpedia            75.34   44.45
fiqa               48.16   40.59
hotpotqa           86.88   70.72
nfcorpus           54.94   33.58
nq                 54.68   59.47
quora              75.73   78.56
scidocs            27.99   15.66
scifact            68.45   71.15
touche             59.24   34.76
trec_covid         91.07   71.85
robust04           70.84   48.48
lotte_writing      70.65   61.44
lotte_recreation   61.61   56.37
lotte_science      46.90   39.01
lotte_technology   56.38   47.20
lotte_lifestyle    74.03   64.38

Mean In Domain     75.82   63.82
BEIR 13            59.08   50.06
LoTTE (OOD)        63.40   52.81
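
Metrics like RR@10 and nDCG@10 can be recomputed from a TREC-format run file with the ir_measures package; this is one common choice, not necessarily the tooling the paper used. A minimal sketch, with hypothetical file names:

import ir_measures
from ir_measures import RR, nDCG

# Relevance judgments and the re-ranked run, both in standard TREC format.
qrels = ir_measures.read_trec_qrels("msmarco_dev.qrels")
run = ir_measures.read_trec_run("reranked.run")

# Mean RR@10 and nDCG@10 over all queries in the run.
print(ir_measures.calc_aggregate([RR @ 10, nDCG @ 10], qrels, run))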