PyLate model based on distilbert/distilroberta-base
This is a PyLate model finetuned from distilbert/distilroberta-base on the msmarco-train dataset. It maps sentences & paragraphs to sequences of 128-dimensional dense vectors and can be used for semantic textual similarity using the MaxSim operator.
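For context, MaxSim scores a query against a document by matching each query token embedding to its most similar document token embedding and summing those maxima. A minimal PyTorch sketch (illustrative only, not the PyLate implementation):

import torch

def maxsim(query_embeddings: torch.Tensor, document_embeddings: torch.Tensor) -> torch.Tensor:
    """Late-interaction MaxSim score.

    Shapes: (num_query_tokens, dim) and (num_doc_tokens, dim); embeddings are
    assumed L2-normalized, so dot products are cosine similarities.
    """
    similarities = query_embeddings @ document_embeddings.T  # (q_tokens, d_tokens)
    return similarities.max(dim=1).values.sum()  # best document token per query token, summed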
Model Details
Model Description
- Model Type: PyLate model
- Base model: distilbert/distilroberta-base
- Document Length: 256 tokens
- Query Length: 32 tokens
- Output Dimensionality: 128 dimensions (one 128-dimensional vector per token)
- Similarity Function: MaxSim
- Training Dataset: msmarco-train
Model Sources
- Documentation: PyLate Documentation
- Repository: PyLate on GitHub
- Hugging Face: PyLate models on Hugging Face
Full Model Architecture
ColBERT(
(0): Transformer({'max_seq_length': 255, 'do_lower_case': False}) with Transformer model: RobertaModel
(1): Dense({'in_features': 768, 'out_features': 128, 'bias': False, 'activation_function': 'torch.nn.modules.linear.Identity'})
)
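As the Dense layer above indicates, every token is projected to a 128-dimensional vector, so encoding one text yields a matrix of shape (num_tokens, 128). A quick sanity check, sketched with the same API as the usage examples below:

from pylate import models

model = models.ColBERT(model_name_or_path="yosefw/colbert-distilroberta-base")

embeddings = model.encode(["a short test sentence"], is_query=False)
print(embeddings[0].shape)  # (num_tokens, 128): one 128-dimensional vector per token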
Usage
First install the PyLate library:
pip install -U pylate
Retrieval
PyLate provides a streamlined interface to index and retrieve documents using ColBERT models. The index leverages the Voyager HNSW index to efficiently handle document embeddings and enable fast retrieval.
Indexing documents
First, load the ColBERT model and initialize the Voyager index, then encode and index your documents:
from pylate import indexes, models, retrieve
# Step 1: Load the ColBERT model
model = models.ColBERT(
model_name_or_path="yosefw/colbert-distilroberta-base",
)
# Step 2: Initialize the Voyager index
index = indexes.Voyager(
index_folder="pylate-index",
index_name="index",
override=True, # This overwrites the existing index if any
)
# Step 3: Encode the documents
documents_ids = ["1", "2", "3"]
documents = ["document 1 text", "document 2 text", "document 3 text"]
documents_embeddings = model.encode(
documents,
batch_size=32,
is_query=False, # Ensure that it is set to False to indicate that these are documents, not queries
show_progress_bar=True,
)
# Step 4: Add document embeddings to the index by providing embeddings and corresponding ids
index.add_documents(
documents_ids=documents_ids,
documents_embeddings=documents_embeddings,
)
Note that you do not have to recreate the index and encode the documents every time. Once you have created an index and added the documents, you can re-use the index later by loading it:
# To load an index, simply instantiate it with the correct folder/name and without overriding it
index = indexes.Voyager(
index_folder="pylate-index",
index_name="index",
)
Retrieving top-k documents for queries
Once the documents are indexed, you can retrieve the top-k most relevant documents for a given set of queries. To do so, initialize the ColBERT retriever with the index you want to search, encode the queries, and then retrieve the top-k documents to get the IDs and relevance scores of the top matches:
# Step 1: Initialize the ColBERT retriever
retriever = retrieve.ColBERT(index=index)
# Step 2: Encode the queries
queries_embeddings = model.encode(
["query for document 3", "query for document 1"],
batch_size=32,
is_query=True, # Ensure that it is set to True to indicate that these are queries
show_progress_bar=True,
)
# Step 3: Retrieve top-k documents
scores = retriever.retrieve(
queries_embeddings=queries_embeddings,
k=10, # Retrieve the top 10 matches for each query
)
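Each query gets back a ranked list of hits. A quick sketch of reading the results, assuming the {"id": ..., "score": ...} hit format shown in the PyLate documentation:

# Iterate over the per-query result lists returned above
for query_index, hits in enumerate(scores):
    for hit in hits:  # hits are sorted by decreasing relevance
        print(query_index, hit["id"], hit["score"])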
Reranking
If you only want to use the ColBERT model to rerank the output of a first-stage retrieval pipeline without building an index, simply use the rank.rerank function and pass it the queries and documents to rerank:
from pylate import rank, models
queries = [
"query A",
"query B",
]
documents = [
["document A", "document B"],
["document 1", "document C", "document B"],
]
documents_ids = [
[1, 2],
[1, 3, 2],
]
model = models.ColBERT(
model_name_or_path="yosefw/colbert-distilroberta-base",
)
queries_embeddings = model.encode(
queries,
is_query=True,
)
documents_embeddings = model.encode(
documents,
is_query=False,
)
reranked_documents = rank.rerank(
documents_ids=documents_ids,
queries_embeddings=queries_embeddings,
documents_embeddings=documents_embeddings,
)
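rank.rerank returns the documents of each query reordered by relevance. A sketch of reading the output, assuming the same {"id": ..., "score": ...} hit format as retrieval:

# Print the reranked document ids for each query, best match first
for hits in reranked_documents:
    print([hit["id"] for hit in hits])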
Evaluation
Metrics
ColBERTTriplet
- Evaluated with pylate.evaluation.colbert_triplet.ColBERTTripletEvaluator
Metric | Value |
---|---|
accuracy | 0.7992 |
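Accuracy here is the fraction of (query, positive, negative) triplets in which the positive passage receives a higher MaxSim score than the negative. A minimal sketch of recomputing it; the anchors/positives/negatives argument names are assumed to follow the PyLate documentation, and the triplet below is hypothetical:

from pylate import evaluation, models

model = models.ColBERT(model_name_or_path="yosefw/colbert-distilroberta-base")

# Hypothetical one-triplet evaluation set; in practice use the held-out msmarco-train split
evaluator = evaluation.ColBERTTripletEvaluator(
    anchors=["age of cats first heat cycle"],
    positives=["Cats can have their first heat cycle between 4-6 months of age ..."],
    negatives=["How Early Can My Cat or Dog Get Pregnant? ..."],
)
print(evaluator(model))  # share of triplets where the positive outscores the negative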
Training Details
Training Dataset
msmarco-train
- Dataset: msmarco-train at 6853021
- Size: 500,000 training samples
- Columns: query_id, query, positive, negative_1, negative_2, negative_3, and negative_4
- Approximate statistics based on the first 1000 samples:
Column | query_id | query | positive | negative_1 | negative_2 | negative_3 | negative_4 |
---|---|---|---|---|---|---|---|
type | int | string | string | string | string | string | string |
details | 500 distinct ids, each ~0.20% of the sample | min: 5, mean: 8.97, max: 19 tokens | min: 17, mean: 31.88, max: 32 tokens | min: 20, mean: 31.97, max: 32 tokens | min: 20, mean: 31.95, max: 32 tokens | min: 24, mean: 31.98, max: 32 tokens | min: 24, mean: 31.96, max: 32 tokens |
- Samples (fields per sample, in order: query_id, query, positive, negative_1, negative_2, negative_3, negative_4):
query_id: 13501
age of cats first heat cycle
Cats can have their first heat cycle between 4-6 months of age and will go in to heat approximately 3 times a year. Each litter typically has between 4 and 6 kittens. Dogs go into heat at about 4-6 months of age and typically have two heat cycles per year. Each litter has, on average, between 4 and 10 puppies. A dog’s heat cycle lasts a total of 30 days. The vagina will swell during the first 10 days. There will be bloody discharge during the next 10 days.
1 Cats can become pregnant during their first heat cycle, and they do not discriminate when it comes to finding an available male — they will mate with their parents or siblings. Cats can go back into heat soon after giving birth, according to Dr. Debra Primovic, DVM.
How Early Can My Cat or Dog Get Pregnant? The practice of early age spay/neuter, which was endorsed by the American Veterinary Medical Association (AVMA) in 2006, generally refers to dogs or cats which are at least two pounds and/ or two months of age at the time they are altered.
1 It does add to pet overpopulation. 2 Cats can become pregnant during their first heat cycle, and they do not discriminate when it comes to finding an available male — they will mate with their parents or siblings. Cats can go back into heat soon after giving birth, according to Dr. Debra Primovic, DVM.
Have your cat spayed to prevent reproduction. You can spay as early as 8 weeks of age, but check with your veterinarian. If a cat is already in heat, some vets may wait until the cycle is over before performing the spay. Cats do not need to have a litter of kittens before they are spayed.
query_id: 13501
age of cats first heat cycle
Cats can have their first heat cycle between 4-6 months of age and will go in to heat approximately 3 times a year. Each litter typically has between 4 and 6 kittens. Dogs go into heat at about 4-6 months of age and typically have two heat cycles per year. Each litter has, on average, between 4 and 10 puppies. A dog’s heat cycle lasts a total of 30 days. The vagina will swell during the first 10 days. There will be bloody discharge during the next 10 days.
Also any accidental escape can result in a pregnancy, unlike a dog that must already be “in heat,” to be at risk of pregnancy when she gets loose. The age at which dogs begin to come into estrus varies with size and breed, however many dogs can become pregnant at five months.
How young can my dog or cat get pregnant? This question is especially important for those who share their lives with cats. Cats can actually become pregnant as young as four months of age, having a litter when they are six months old.
No documented problems make early age sterilization, or sterilization before the first heat cycle, ill advised, especially when contrasted with the significant dangers of mammary tumors, pyometra or the tragedy of homelessness.
Have your cat spayed to prevent reproduction. You can spay as early as 8 weeks of age, but check with your veterinarian. If a cat is already in heat, some vets may wait until the cycle is over before performing the spay. Remember: Cats do not need to have a litter of kittens before they are spayed.
query_id: 13502
age of cecily tynan
Cecily Tynan was born on the 19th of March 1969, which was a Wednesday. Cecily Tynan will be turning 49 in only 332 days from today. Cecily Tynan is 48 years old. To be more precise (and nerdy), the current age as of right now is 17521 days or (even more geeky) 420504 hours. That's a lot of hours!
Inducted into the Black Filmmakers Hall of Fame in 1977. Pictured on a $3.25 postage stamp issued by the island of Nevis on 1 January 2014. Was misreported as being a decade younger than she actually was until The New York Times found out her real age in 2013.
Mini Bio (1) Cicely Tyson was born in Harlem, New York City, where she was raised by her devoutly religious parents, from the Caribbean island of Nevis. Her mother, Theodosia, was a domestic, and her father, William Tyson, was a carpenter and painter.
0 Affair 2 Married 0 Children. An American television reporter who is currently working for WPVI-TV. She is also widely known as the host of Saturday evening public affairs program named Primetime Weekend. She achieved degree in journalism and politics in 1991 from Washington and Lee University.
Facts of Cecily Tynan. Cecily Tynan has been one of the top level TV reported for long period of time and she has managed to do so with the help of her immense talent and passion for her job. She was born in the year 1969 on 19th of March and this makes her 45 years of age right now.
- Loss: pylate.losses.contrastive.Contrastive
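Roughly, this contrastive objective is a softmax cross-entropy over MaxSim scores that pushes the positive passage above the four negatives (a sketch of the assumed objective, not a transcription of the PyLate implementation):

$$\mathcal{L} = -\log \frac{\exp\left(s(q, d^{+})\right)}{\exp\left(s(q, d^{+})\right) + \sum_{i=1}^{4} \exp\left(s(q, d^{-}_{i})\right)}$$

where s(q, d) is the MaxSim score between query q and document d.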
Evaluation Dataset
msmarco-train
- Dataset: msmarco-train at 6853021
- Size: 10,000 evaluation samples
- Columns: query_id, query, positive, negative_1, negative_2, negative_3, and negative_4
- Approximate statistics based on the first 1000 samples:
Column | query_id | query | positive | negative_1 | negative_2 | negative_3 | negative_4 |
---|---|---|---|---|---|---|---|
type | int | string | string | string | string | string | string |
details | 500 distinct ids, each ~0.20% of the sample | min: 5, mean: 10.9, max: 27 tokens | min: 17, mean: 31.89, max: 32 tokens | min: 26, mean: 31.96, max: 32 tokens | min: 17, mean: 31.93, max: 32 tokens | min: 15, mean: 31.91, max: 32 tokens | min: 17, mean: 31.9, max: 32 tokens |
- Samples (fields per sample, in order: query_id, query, positive, negative_1, negative_2, negative_3, negative_4):
query_id: 3
Another name for the primary visual cortex is
The primary (parts of the cortex that receive sensory inputs from the thalamus) visual cortex is also known as V1, V isual area one, and the striate cortex.The extrastriate areas consist of visual areas two (V2), three (V3), four (V4), and five (V5).he primary visual cortex is the best-studied visual area in the brain. In all mammals studied, it is located in the posterior pole of the occipital cortex (the occipital cortex is responsible for processing visual stimuli).
The visual cortex is made up of Brodmann area 17 (the primary visual cortex), and Brodmann areas 18 and 19, the extrastriate cortical areas.he primary visual cortex is the best-studied visual area in the brain. In all mammals studied, it is located in the posterior pole of the occipital cortex (the occipital cortex is responsible for processing visual stimuli).
The visual cortex of the brain is the part of the cerebral cortex responsible for processing visual information. This article addresses the ventral/dorsal model of the visual cortex. Another model for the perceptual/conceptual neuropsychological model of the visual cortex was studied by Raftopolous.he primary visual cortex is the best-studied visual area in the brain. In all mammals studied, it is located in the posterior pole of the occipital cortex (the occipital cortex is responsible for processing visual stimuli).
The primary visual cortex is the best-studied visual area in the brain. In all mammals studied, it is located in the posterior pole of the occipital cortex (the occipital cortex is responsible for processing visual stimuli).he primary visual cortex is the best-studied visual area in the brain. In all mammals studied, it is located in the posterior pole of the occipital cortex (the occipital cortex is responsible for processing visual stimuli).
Damage to the primary visual cortex, which is located on the surface of the posterior occipital lobe, can cause blindness due to the holes in the visual map on the surface of the visual cortex that resulted from the lesions. significant functional aspect of the occipital lobe is that it contains the primary visual cortex. Retinal sensors convey stimuli through the optic tracts to the lateral geniculate bodies, where optic radiations continue to the visual cortex.
query_id: 3
Another name for the primary visual cortex is
The primary (parts of the cortex that receive sensory inputs from the thalamus) visual cortex is also known as V1, V isual area one, and the striate cortex.The extrastriate areas consist of visual areas two (V2), three (V3), four (V4), and five (V5).he primary visual cortex is the best-studied visual area in the brain. In all mammals studied, it is located in the posterior pole of the occipital cortex (the occipital cortex is responsible for processing visual stimuli).
Called also first visual area. visual cortex the area of the occipital lobe of the cerebral cortex concerned with vision; the striate cortex is also called the first visual area, and the adjacent second and third visual areas serve as its association areas.rea 17 (which is also called the striate cortex or area because the line of Gennari is grossly visible on its surface) is the primary visual cortex, receiving the visual radiation from the lateral geniculate body of the thalamus.
The primary visual cortex, V1, is the koniocortex (sensory type) located in and around the calcarine fissure in the occipital lobe. It is the one that receives information directly from the lateral geniculate nucleus .To this have been added later as many as thirty interconnected (secondary or tertiary) visual areas.esearch on the primary visual cortex can involve recording action potentials from electrodes within the brain of cats, ferrets, mice, or monkeys, or through recording intrinsic optical signals from animals or fMRI signals from human and monkey V1.
However, only in the primary visual cortex (V1) can this band be seen with the naked eye as the line of Gennari (#6282, #8987). In fact, the term striate (striped) cortex is another name for primary visual cortex.Primary visual cortex is also referred to as either calcarine cortex or Brodmann’s area 17.djacent to primary visual cortex (V1) are the visual association cortical areas, including V2 and V3 (= area 18, or 18 & 19, depending on the author) (#4350). Lesions of V1, V2 and V3 produce identical visual field defects. Beyond V3, visual information is processed along two functionally different pathways.
Primary visual cortex (V1) Edit. The primary visual cortex is the best studied visual area in the brain. Like that of all mammals studied, it is located in the posterior pole of the occipital cortex (the occipital cortex is responsible for processing visual stimuli).It is the simplest, earliest cortical visual area.esearch on the primary visual cortex can involve recording action potentials from electrodes within the brain of cats, ferrets, mice, or monkeys, or through recording intrinsic optical signals from animals or fMRI signals from human and monkey V1.
query_id: 4
Defining alcoholism as a disease is associated with Jellinek
The formation of AA – Alcoholics Anonymous – in the 1930s and the publication of noted psychiatrist and Director of the Center of Alcohol Studies at Yale Medical School E. M. Jellinek’s famous book defining the concept of alcoholism as a medical disease facilitated moving alcoholism into a different light.s alcoholism is an addiction, it is considered a disease of the brain. The brain has been physically altered by extended exposure to alcohol, causing it to function differently and therefore creating addictive behavior.
Nonetheless, it was Jellinek's Stages of the Alcoholism that led to diagnosing alcoholism as a disease and eventually to the medical acceptance of alcoholism as a disease. Astoundingly, the inception of the disease theory and treatment for substance abuse is based on fraud.n a recent Gallup poll, 90 percent of people surveyed believe that alcoholism is a disease. Most argue that because the American Medical Association (AMA) has proclaimed alcoholism a disease, the idea is without reproach. But, the fact is that the AMA made this determination in the absence of empirical evidence.
1 JELLINEK PHASES: THE PROGRESSIVE SYMPTOMS OF ALCOHOLISM The behavioral characteristics of the alcoholic are progressive as is the person's tolerance to alcohol and as is the course of the disease itself.ypes of alcoholism: Jellinek's species The pattern described above refers to the stages of alcohol addiction. Jellinek continued his study of alcoholism, focusing on alcohol problems in other countries.
1 Jellinek, E. M., Phases in the Drinking History of Alcoholics: Analysis of a Survey Conducted by the Official Organ of Alcoholics Anonymous, Quarterly Journal of Studies on Alcohol, Vol.7, (1946), pp. 2 1–88. 3 Jellinek, E. M., The Disease Concept of Alcoholism, Hillhouse, (New Haven), 1960.uring the 1920s, he conducted research in Sierra Leone and at Tela, Honduras. In the 1930s he returned to the U.S.A. and worked at the Worcester State Hospital, Worcester, Massachusetts, from whence he was commissioned to conduct a study for the Research Council on Problems of Alcohol.
ALCOHOLISM: A DISEASE. In 1956 the American Medical Association decided that alcoholism is a disease, however more than 30 years later this is still debated in certain circles.Besides the medical opinion, there are many others (e.g., legal, sociological, religious) which derive from any number of social pressures.LCOHOLISM: A DISEASE. In 1956 the American Medical Association decided that alcoholism is a disease, however more than 30 years later this is still debated in certain circles.
- Loss: pylate.losses.contrastive.Contrastive
Training Hyperparameters
Non-Default Hyperparameters
- eval_strategy: epoch
- per_device_train_batch_size: 64
- per_device_eval_batch_size: 64
- learning_rate: 1e-05
- lr_scheduler_type: cosine
- fp16: True
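For reference, a minimal sketch of a training run that would reproduce these non-default settings (the dataset identifier is a placeholder, and the actual training script is not included in this card):

from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from pylate import losses, models

model = models.ColBERT(model_name_or_path="distilbert/distilroberta-base")
train_dataset = load_dataset("msmarco-train", split="train")  # placeholder id; use the actual dataset repo

args = SentenceTransformerTrainingArguments(
    output_dir="colbert-distilroberta-base",
    eval_strategy="epoch",
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    learning_rate=1e-5,
    lr_scheduler_type="cosine",
    num_train_epochs=3,
    fp16=True,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=losses.Contrastive(model=model),
)
trainer.train()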
All Hyperparameters
Click to expand
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: epoch
- prediction_loss_only: True
- per_device_train_batch_size: 64
- per_device_eval_batch_size: 64
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 1e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1.0
- num_train_epochs: 3
- max_steps: -1
- lr_scheduler_type: cosine
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.0
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: False
- fp16: True
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- dispatch_batches: None
- split_batches: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: None
- batch_sampler: batch_sampler
- multi_dataset_batch_sampler: proportional
Training Logs
Epoch | Step | Training Loss | Validation Loss | accuracy |
---|---|---|---|---|
1.0 | 7813 | 1.0809 | - | - |
0 | 0 | - | - | 0.7947 |
1.0 | 7813 | - | 1.2037 | - |
2.0 | 15626 | 0.8982 | - | - |
0 | 0 | - | - | 0.7982 |
2.0 | 15626 | - | 1.1963 | - |
0 | 0 | - | - | 0.7992 |
Framework Versions
- Python: 3.11.12
- Sentence Transformers: 4.0.2
- PyLate: 1.2.0
- Transformers: 4.48.2
- PyTorch: 2.6.0+cu124
- Accelerate: 1.7.0
- Datasets: 3.6.0
- Tokenizers: 0.21.1
Citation
BibTeX
Sentence Transformers
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084"
}
PyLate
@misc{PyLate,
title={PyLate: Flexible Training and Retrieval for Late Interaction Models},
author={Chaffin, Antoine and Sourty, Raphaël},
url={https://github.com/lightonai/pylate},
year={2024}
}