How to use castorini/monot5-3b-msmarco with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text2text-generation", model="castorini/monot5-3b-msmarco")
```

```python
# Load the model directly (monoT5 is a T5 checkpoint, so use the seq2seq class)
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("castorini/monot5-3b-msmarco")
model = AutoModelForSeq2SeqLM.from_pretrained("castorini/monot5-3b-msmarco")
```
This model is a T5-3B reranker fine-tuned on the MS MARCO passage dataset for 100k steps (or 10 epochs).
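As described in the paper below, monoT5 scores a query-passage pair by feeding the model the prompt "Query: {query} Document: {document} Relevant:" and comparing the logits that the first generated token assigns to "true" versus "false". A minimal sketch of that scoring step (the helper names here are illustrative, not part of the released code; only the prompt template and true/false calibration come from the paper):

```python
import math

def monot5_prompt(query: str, doc: str) -> str:
    # Input template used by monoT5 for pointwise reranking.
    return f"Query: {query} Document: {doc} Relevant:"

def relevance_score(true_logit: float, false_logit: float) -> float:
    # Softmax over only the "true"/"false" logits of the first decoded
    # token; the probability assigned to "true" is the relevance score.
    e_true, e_false = math.exp(true_logit), math.exp(false_logit)
    return e_true / (e_true + e_false)

# With a loaded model, the two logits would be read from the first-step
# generation scores at the token ids for "true" and "false" (sketch only,
# not verified against a specific transformers version):
#   inputs = tokenizer(monot5_prompt(q, d), return_tensors="pt")
#   out = model.generate(**inputs, max_new_tokens=1,
#                        output_scores=True, return_dict_in_generate=True)
#   true_logit, false_logit = out.scores[0][0, true_id], out.scores[0][0, false_id]
```

Candidate passages for a query are then sorted by `relevance_score` in descending order.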
For more details on how to use it, see pygaggle.ai
Paper describing the model: Document Ranking with a Pretrained Sequence-to-Sequence Model