SentenceTransformer based on intfloat/multilingual-e5-small
This is a sentence-transformers model finetuned from intfloat/multilingual-e5-small. It maps sentences and paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. Judging by the training data below, this particular finetune is geared toward matching variant spellings, typos, and word orderings of full personal names.
Model Details
Model Description
- Model Type: Sentence Transformer
- Base model: intfloat/multilingual-e5-small
- Maximum Sequence Length: 512 tokens
- Output Dimensionality: 384 dimensions
- Similarity Function: Cosine Similarity
Model Sources
- Documentation: Sentence Transformers Documentation
- Repository: Sentence Transformers on GitHub
- Hugging Face: Sentence Transformers on Hugging Face
Full Model Architecture
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
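One practical consequence of the trailing Normalize() module: embeddings come out L2-normalized, so the cosine similarity listed above reduces to a plain dot product. A minimal numpy sketch (the function name is illustrative, not part of the library API):

import numpy as np

def cosine(emb_a: np.ndarray, emb_b: np.ndarray) -> float:
    # The model's final Normalize() module L2-normalizes every embedding,
    # so the dot product of two embeddings already equals their cosine similarity.
    return float(np.dot(emb_a, emb_b))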
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
pip install -U sentence-transformers
Then you can load this model and run inference.
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("CloudlessSky/fullname_encoder_v1")
# Run inference
sentences = [
'ромазанов хусин алеевич',
'роиазанов хусир алеевич',
'морозов тимофей васильевич',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 384)
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# torch.Size([3, 3])
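Since this encoder is trained on full-name pairs, a common pattern is ranking candidate names against a query name. A short continuation of the snippet above (the names are illustrative):

# Rank candidate names against a query name
query_emb = model.encode(["ромазанов хусин алеевич"])  # shape (1, 384)
candidates = ["роиазанов хусир алеевич", "морозов тимофей васильевич"]
cand_embs = model.encode(candidates)                   # shape (2, 384)

scores = model.similarity(query_emb, cand_embs)        # shape [1, 2]
best = int(scores.argmax())
print(candidates[best], float(scores[0, best]))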
Training Details
Training Dataset
Unnamed Dataset
- Size: 2,000,000 training samples
- Columns: sentence1, sentence2, and label
- Approximate statistics based on the first 1000 samples:
|  | sentence1 | sentence2 | label |
|:---|:---|:---|:---|
| type | string | string | int |
| details | min: 6 tokens<br>mean: 10.9 tokens<br>max: 19 tokens | min: 6 tokens<br>mean: 11.86 tokens<br>max: 27 tokens | 0: ~48.70%<br>1: ~51.30% |
- Samples:
| sentence1 | sentence2 | label |
|:---|:---|:---|
| лебедев александр арсентьевич | лебедев александр арсеньевич | 0 |
| кирюхин сергей никитович | мухин сергей никитович | 0 |
| додонов иван сидорович | сидоров иван спиридонович | 0 |
- Loss: ContrastiveLoss with these parameters:

  {
      "distance_metric": "SiameseDistanceMetric.COSINE_DISTANCE",
      "margin": 0.5,
      "size_average": true
  }
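For reference, a minimal sketch of how a loss with these parameters can be constructed via the sentence-transformers API (variable names are illustrative, not taken from the original training script):

from sentence_transformers import SentenceTransformer, losses
from sentence_transformers.losses import SiameseDistanceMetric

model = SentenceTransformer("intfloat/multilingual-e5-small")

# Contrastive loss pulls matching pairs (label 1) together and pushes
# non-matching pairs (label 0) at least `margin` apart in cosine distance.
train_loss = losses.ContrastiveLoss(
    model=model,
    distance_metric=SiameseDistanceMetric.COSINE_DISTANCE,
    margin=0.5,
    size_average=True,
)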
Evaluation Dataset
Unnamed Dataset
- Size: 663,132 evaluation samples
- Columns: sentence1, sentence2, and label
- Approximate statistics based on the first 1000 samples:
|  | sentence1 | sentence2 | label |
|:---|:---|:---|:---|
| type | string | string | int |
| details | min: 5 tokens<br>mean: 10.92 tokens<br>max: 22 tokens | min: 2 tokens<br>mean: 11.77 tokens<br>max: 24 tokens | 0: ~50.20%<br>1: ~49.80% |
- Samples:
| sentence1 | sentence2 | label |
|:---|:---|:---|
| иванисько ульян иванович | ульян иванисько иванович | 1 |
| топычканов иван александрович | кабанов иван александрович | 0 |
| джавадов камал джавад оглы | джавадов джавад камал оглы | 1 |
- Loss: ContrastiveLoss with these parameters:

  {
      "distance_metric": "SiameseDistanceMetric.COSINE_DISTANCE",
      "margin": 0.5,
      "size_average": true
  }
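The binary labels (1 appears to mark two renderings of the same name, 0 a different person) make this evaluation set a natural fit for a BinaryClassificationEvaluator. A minimal sketch, using the sample pairs above as stand-ins for the full 663,132-pair split:

from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import BinaryClassificationEvaluator

model = SentenceTransformer("CloudlessSky/fullname_encoder_v1")

# Stand-ins for the sentence1 / sentence2 / label columns of the eval split
eval_sentences1 = ["иванисько ульян иванович", "топычканов иван александрович"]
eval_sentences2 = ["ульян иванисько иванович", "кабанов иван александрович"]
eval_labels = [1, 0]  # 1 = same person, 0 = different person

evaluator = BinaryClassificationEvaluator(
    sentences1=eval_sentences1,
    sentences2=eval_sentences2,
    labels=eval_labels,
    name="fullname-eval",
)
metrics = evaluator(model)  # accuracy, F1, precision, recall at the best threshold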
Training Hyperparameters
Non-Default Hyperparameters
- eval_strategy: steps
- per_device_train_batch_size: 64
- per_device_eval_batch_size: 128
- num_train_epochs: 8
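For context, a minimal training sketch that reproduces these non-default values with the SentenceTransformerTrainer API; the datasets below are toy stand-ins for the real 2,000,000-pair train and 663,132-pair eval splits, and the output path is an assumption:

from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import ContrastiveLoss

model = SentenceTransformer("intfloat/multilingual-e5-small")

# Toy stand-ins for the real train/eval splits described above
train_dataset = Dataset.from_dict({
    "sentence1": ["кирюхин сергей никитович"],
    "sentence2": ["мухин сергей никитович"],
    "label": [0],
})
eval_dataset = Dataset.from_dict({
    "sentence1": ["джавадов камал джавад оглы"],
    "sentence2": ["джавадов джавад камал оглы"],
    "label": [1],
})

args = SentenceTransformerTrainingArguments(
    output_dir="fullname_encoder_v1",  # illustrative output path
    eval_strategy="steps",
    per_device_train_batch_size=64,
    per_device_eval_batch_size=128,
    num_train_epochs=8,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    loss=ContrastiveLoss(model, margin=0.5),
)
trainer.train()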
All Hyperparameters
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: steps
- prediction_loss_only: True
- per_device_train_batch_size: 64
- per_device_eval_batch_size: 128
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 5e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1.0
- num_train_epochs: 8
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.0
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: False
- fp16: False
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- tp_size: 0
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: None
- batch_sampler: batch_sampler
- multi_dataset_batch_sampler: proportional
Training Logs
Epoch | Step | Training Loss | Validation Loss |
---|---|---|---|
0.016 | 500 | 0.0143 | - |
0.032 | 1000 | 0.0119 | - |
0.048 | 1500 | 0.0112 | - |
0.064 | 2000 | 0.0108 | - |
0.08 | 2500 | 0.0105 | - |
0.096 | 3000 | 0.0098 | - |
0.112 | 3500 | 0.0101 | - |
0.128 | 4000 | 0.0096 | - |
0.144 | 4500 | 0.0096 | - |
0.16 | 5000 | 0.0093 | - |
0.176 | 5500 | 0.0093 | - |
0.192 | 6000 | 0.0089 | - |
0.208 | 6500 | 0.0087 | - |
0.224 | 7000 | 0.0086 | - |
0.24 | 7500 | 0.0084 | - |
0.256 | 8000 | 0.0083 | - |
0.272 | 8500 | 0.0082 | - |
0.288 | 9000 | 0.008 | - |
0.304 | 9500 | 0.0079 | - |
0.32 | 10000 | 0.008 | 0.0055 |
0.336 | 10500 | 0.0077 | - |
0.352 | 11000 | 0.0077 | - |
0.368 | 11500 | 0.0076 | - |
0.384 | 12000 | 0.0073 | - |
0.4 | 12500 | 0.0074 | - |
0.416 | 13000 | 0.0074 | - |
0.432 | 13500 | 0.0074 | - |
0.448 | 14000 | 0.0075 | - |
0.464 | 14500 | 0.0072 | - |
0.48 | 15000 | 0.007 | - |
0.496 | 15500 | 0.007 | - |
0.512 | 16000 | 0.0069 | - |
0.528 | 16500 | 0.0071 | - |
0.544 | 17000 | 0.0067 | - |
0.56 | 17500 | 0.007 | - |
0.576 | 18000 | 0.0068 | - |
0.592 | 18500 | 0.0068 | - |
0.608 | 19000 | 0.0069 | - |
0.624 | 19500 | 0.0067 | - |
0.64 | 20000 | 0.0067 | 0.0044 |
0.656 | 20500 | 0.0065 | - |
0.672 | 21000 | 0.0064 | - |
0.688 | 21500 | 0.0065 | - |
0.704 | 22000 | 0.0065 | - |
0.72 | 22500 | 0.0064 | - |
0.736 | 23000 | 0.0064 | - |
0.752 | 23500 | 0.0063 | - |
0.768 | 24000 | 0.0064 | - |
0.784 | 24500 | 0.0063 | - |
0.8 | 25000 | 0.0063 | - |
0.816 | 25500 | 0.0062 | - |
0.832 | 26000 | 0.0063 | - |
0.848 | 26500 | 0.0062 | - |
0.864 | 27000 | 0.006 | - |
0.88 | 27500 | 0.006 | - |
0.896 | 28000 | 0.006 | - |
0.912 | 28500 | 0.0061 | - |
0.928 | 29000 | 0.0061 | - |
0.944 | 29500 | 0.0059 | - |
0.96 | 30000 | 0.006 | 0.0039 |
0.976 | 30500 | 0.0059 | - |
0.992 | 31000 | 0.0059 | - |
1.008 | 31500 | 0.0057 | - |
1.024 | 32000 | 0.0056 | - |
1.04 | 32500 | 0.0056 | - |
1.056 | 33000 | 0.0056 | - |
1.072 | 33500 | 0.0056 | - |
1.088 | 34000 | 0.0056 | - |
1.104 | 34500 | 0.0054 | - |
1.12 | 35000 | 0.0056 | - |
1.1360 | 35500 | 0.0055 | - |
1.152 | 36000 | 0.0053 | - |
1.168 | 36500 | 0.0055 | - |
1.184 | 37000 | 0.0054 | - |
1.2 | 37500 | 0.0056 | - |
1.216 | 38000 | 0.0054 | - |
1.232 | 38500 | 0.0053 | - |
1.248 | 39000 | 0.0055 | - |
1.264 | 39500 | 0.0054 | - |
1.28 | 40000 | 0.0055 | 0.0037 |
1.296 | 40500 | 0.0053 | - |
1.312 | 41000 | 0.0052 | - |
1.328 | 41500 | 0.0052 | - |
1.3440 | 42000 | 0.0054 | - |
1.3600 | 42500 | 0.0055 | - |
1.376 | 43000 | 0.0053 | - |
1.392 | 43500 | 0.0054 | - |
1.408 | 44000 | 0.0053 | - |
1.424 | 44500 | 0.0053 | - |
1.44 | 45000 | 0.0053 | - |
1.456 | 45500 | 0.0053 | - |
1.472 | 46000 | 0.0051 | - |
1.488 | 46500 | 0.0053 | - |
1.504 | 47000 | 0.0052 | - |
1.52 | 47500 | 0.0052 | - |
1.536 | 48000 | 0.0052 | - |
1.552 | 48500 | 0.005 | - |
1.568 | 49000 | 0.005 | - |
1.584 | 49500 | 0.0052 | - |
1.6 | 50000 | 0.0053 | 0.0036 |
1.616 | 50500 | 0.0052 | - |
1.6320 | 51000 | 0.0052 | - |
1.6480 | 51500 | 0.005 | - |
1.6640 | 52000 | 0.0051 | - |
1.6800 | 52500 | 0.005 | - |
1.696 | 53000 | 0.0051 | - |
1.712 | 53500 | 0.0051 | - |
1.728 | 54000 | 0.005 | - |
1.744 | 54500 | 0.0049 | - |
1.76 | 55000 | 0.0049 | - |
1.776 | 55500 | 0.0049 | - |
1.792 | 56000 | 0.0051 | - |
1.808 | 56500 | 0.0049 | - |
1.8240 | 57000 | 0.0049 | - |
1.8400 | 57500 | 0.0051 | - |
1.8560 | 58000 | 0.0049 | - |
1.8720 | 58500 | 0.005 | - |
1.888 | 59000 | 0.0049 | - |
1.904 | 59500 | 0.0049 | - |
1.92 | 60000 | 0.0048 | 0.0034 |
1.936 | 60500 | 0.005 | - |
1.952 | 61000 | 0.0048 | - |
1.968 | 61500 | 0.0048 | - |
1.984 | 62000 | 0.0049 | - |
2.0 | 62500 | 0.0049 | - |
2.016 | 63000 | 0.0046 | - |
2.032 | 63500 | 0.0045 | - |
2.048 | 64000 | 0.0045 | - |
2.064 | 64500 | 0.0046 | - |
2.08 | 65000 | 0.0044 | - |
2.096 | 65500 | 0.0046 | - |
2.112 | 66000 | 0.0045 | - |
2.128 | 66500 | 0.0046 | - |
2.144 | 67000 | 0.0045 | - |
2.16 | 67500 | 0.0044 | - |
2.176 | 68000 | 0.0045 | - |
2.192 | 68500 | 0.0046 | - |
2.208 | 69000 | 0.0046 | - |
2.224 | 69500 | 0.0045 | - |
2.24 | 70000 | 0.0046 | 0.0033 |
2.2560 | 70500 | 0.0045 | - |
2.2720 | 71000 | 0.0045 | - |
2.288 | 71500 | 0.0045 | - |
2.304 | 72000 | 0.0045 | - |
2.32 | 72500 | 0.0045 | - |
2.336 | 73000 | 0.0045 | - |
2.352 | 73500 | 0.0046 | - |
2.368 | 74000 | 0.0045 | - |
2.384 | 74500 | 0.0045 | - |
2.4 | 75000 | 0.0044 | - |
2.416 | 75500 | 0.0044 | - |
2.432 | 76000 | 0.0045 | - |
2.448 | 76500 | 0.0045 | - |
2.464 | 77000 | 0.0045 | - |
2.48 | 77500 | 0.0045 | - |
2.496 | 78000 | 0.0044 | - |
2.512 | 78500 | 0.0044 | - |
2.528 | 79000 | 0.0044 | - |
2.544 | 79500 | 0.0046 | - |
2.56 | 80000 | 0.0045 | 0.0032 |
2.576 | 80500 | 0.0045 | - |
2.592 | 81000 | 0.0044 | - |
2.608 | 81500 | 0.0043 | - |
2.624 | 82000 | 0.0045 | - |
2.64 | 82500 | 0.0043 | - |
2.656 | 83000 | 0.0044 | - |
2.672 | 83500 | 0.0043 | - |
2.6880 | 84000 | 0.0043 | - |
2.7040 | 84500 | 0.0043 | - |
2.7200 | 85000 | 0.0044 | - |
2.7360 | 85500 | 0.0044 | - |
2.752 | 86000 | 0.0044 | - |
2.768 | 86500 | 0.0044 | - |
2.784 | 87000 | 0.0043 | - |
2.8 | 87500 | 0.0043 | - |
2.816 | 88000 | 0.0042 | - |
2.832 | 88500 | 0.0044 | - |
2.848 | 89000 | 0.0044 | - |
2.864 | 89500 | 0.0044 | - |
2.88 | 90000 | 0.0043 | 0.0031 |
2.896 | 90500 | 0.0043 | - |
2.912 | 91000 | 0.0044 | - |
2.928 | 91500 | 0.0043 | - |
2.944 | 92000 | 0.0043 | - |
2.96 | 92500 | 0.0042 | - |
2.976 | 93000 | 0.0042 | - |
2.992 | 93500 | 0.0042 | - |
3.008 | 94000 | 0.0041 | - |
3.024 | 94500 | 0.0038 | - |
3.04 | 95000 | 0.004 | - |
3.056 | 95500 | 0.0038 | - |
3.072 | 96000 | 0.0039 | - |
3.088 | 96500 | 0.0039 | - |
3.104 | 97000 | 0.0039 | - |
3.12 | 97500 | 0.0039 | - |
3.136 | 98000 | 0.0039 | - |
3.152 | 98500 | 0.0038 | - |
3.168 | 99000 | 0.004 | - |
3.184 | 99500 | 0.004 | - |
3.2 | 100000 | 0.004 | 0.0031 |
3.216 | 100500 | 0.0039 | - |
3.232 | 101000 | 0.0038 | - |
3.248 | 101500 | 0.004 | - |
3.2640 | 102000 | 0.0039 | - |
3.2800 | 102500 | 0.0041 | - |
3.296 | 103000 | 0.004 | - |
3.312 | 103500 | 0.0039 | - |
3.328 | 104000 | 0.0039 | - |
3.344 | 104500 | 0.004 | - |
3.36 | 105000 | 0.004 | - |
3.376 | 105500 | 0.004 | - |
3.392 | 106000 | 0.0041 | - |
3.408 | 106500 | 0.004 | - |
3.424 | 107000 | 0.0039 | - |
3.44 | 107500 | 0.0039 | - |
3.456 | 108000 | 0.004 | - |
3.472 | 108500 | 0.0039 | - |
3.488 | 109000 | 0.0038 | - |
3.504 | 109500 | 0.0039 | - |
3.52 | 110000 | 0.0039 | 0.0030 |
3.536 | 110500 | 0.0041 | - |
3.552 | 111000 | 0.0039 | - |
3.568 | 111500 | 0.0041 | - |
3.584 | 112000 | 0.0038 | - |
3.6 | 112500 | 0.0038 | - |
3.616 | 113000 | 0.0039 | - |
3.632 | 113500 | 0.0038 | - |
3.648 | 114000 | 0.0039 | - |
3.664 | 114500 | 0.0038 | - |
3.68 | 115000 | 0.0038 | - |
3.6960 | 115500 | 0.004 | - |
3.7120 | 116000 | 0.0038 | - |
3.7280 | 116500 | 0.0039 | - |
3.7440 | 117000 | 0.0039 | - |
3.76 | 117500 | 0.0038 | - |
3.776 | 118000 | 0.0039 | - |
3.792 | 118500 | 0.0039 | - |
3.808 | 119000 | 0.0038 | - |
3.824 | 119500 | 0.0039 | - |
3.84 | 120000 | 0.0039 | 0.0029 |
3.856 | 120500 | 0.0039 | - |
3.872 | 121000 | 0.0039 | - |
3.888 | 121500 | 0.0037 | - |
3.904 | 122000 | 0.0038 | - |
3.92 | 122500 | 0.0038 | - |
3.936 | 123000 | 0.0038 | - |
3.952 | 123500 | 0.0039 | - |
3.968 | 124000 | 0.0038 | - |
3.984 | 124500 | 0.0039 | - |
4.0 | 125000 | 0.0039 | - |
4.016 | 125500 | 0.0034 | - |
4.032 | 126000 | 0.0035 | - |
4.048 | 126500 | 0.0036 | - |
4.064 | 127000 | 0.0035 | - |
4.08 | 127500 | 0.0035 | - |
4.096 | 128000 | 0.0035 | - |
4.112 | 128500 | 0.0035 | - |
4.128 | 129000 | 0.0036 | - |
4.144 | 129500 | 0.0035 | - |
4.16 | 130000 | 0.0035 | 0.0029 |
4.176 | 130500 | 0.0035 | - |
4.192 | 131000 | 0.0035 | - |
4.208 | 131500 | 0.0035 | - |
4.224 | 132000 | 0.0036 | - |
4.24 | 132500 | 0.0036 | - |
4.256 | 133000 | 0.0036 | - |
4.272 | 133500 | 0.0035 | - |
4.288 | 134000 | 0.0034 | - |
4.304 | 134500 | 0.0036 | - |
4.32 | 135000 | 0.0035 | - |
4.336 | 135500 | 0.0036 | - |
4.352 | 136000 | 0.0036 | - |
4.368 | 136500 | 0.0035 | - |
4.384 | 137000 | 0.0036 | - |
4.4 | 137500 | 0.0035 | - |
4.416 | 138000 | 0.0034 | - |
4.432 | 138500 | 0.0034 | - |
4.448 | 139000 | 0.0034 | - |
4.464 | 139500 | 0.0035 | - |
4.48 | 140000 | 0.0035 | 0.0029 |
4.496 | 140500 | 0.0034 | - |
4.5120 | 141000 | 0.0035 | - |
4.5280 | 141500 | 0.0035 | - |
4.5440 | 142000 | 0.0036 | - |
4.5600 | 142500 | 0.0035 | - |
4.576 | 143000 | 0.0034 | - |
4.592 | 143500 | 0.0034 | - |
4.608 | 144000 | 0.0035 | - |
4.624 | 144500 | 0.0035 | - |
4.64 | 145000 | 0.0036 | - |
4.656 | 145500 | 0.0036 | - |
4.672 | 146000 | 0.0035 | - |
4.688 | 146500 | 0.0035 | - |
4.704 | 147000 | 0.0033 | - |
4.72 | 147500 | 0.0035 | - |
4.736 | 148000 | 0.0035 | - |
4.752 | 148500 | 0.0036 | - |
4.768 | 149000 | 0.0036 | - |
4.784 | 149500 | 0.0035 | - |
4.8 | 150000 | 0.0035 | 0.0028 |
4.816 | 150500 | 0.0035 | - |
4.832 | 151000 | 0.0035 | - |
4.848 | 151500 | 0.0035 | - |
4.864 | 152000 | 0.0036 | - |
4.88 | 152500 | 0.0036 | - |
4.896 | 153000 | 0.0035 | - |
4.912 | 153500 | 0.0035 | - |
4.928 | 154000 | 0.0035 | - |
4.944 | 154500 | 0.0035 | - |
4.96 | 155000 | 0.0035 | - |
4.976 | 155500 | 0.0035 | - |
4.992 | 156000 | 0.0034 | - |
5.008 | 156500 | 0.0033 | - |
5.024 | 157000 | 0.0032 | - |
5.04 | 157500 | 0.0032 | - |
5.056 | 158000 | 0.0033 | - |
5.072 | 158500 | 0.0032 | - |
5.088 | 159000 | 0.0032 | - |
5.104 | 159500 | 0.0031 | - |
5.12 | 160000 | 0.0032 | 0.0028 |
5.136 | 160500 | 0.0032 | - |
5.152 | 161000 | 0.0032 | - |
5.168 | 161500 | 0.0033 | - |
5.184 | 162000 | 0.0033 | - |
5.2 | 162500 | 0.0031 | - |
5.216 | 163000 | 0.0033 | - |
5.232 | 163500 | 0.0032 | - |
5.248 | 164000 | 0.0032 | - |
5.264 | 164500 | 0.0032 | - |
5.28 | 165000 | 0.0033 | - |
5.296 | 165500 | 0.0033 | - |
5.312 | 166000 | 0.0031 | - |
5.328 | 166500 | 0.0032 | - |
5.344 | 167000 | 0.0032 | - |
5.36 | 167500 | 0.0033 | - |
5.376 | 168000 | 0.0033 | - |
5.392 | 168500 | 0.0032 | - |
5.408 | 169000 | 0.0032 | - |
5.424 | 169500 | 0.0032 | - |
5.44 | 170000 | 0.0032 | 0.0027 |
5.456 | 170500 | 0.0031 | - |
5.4720 | 171000 | 0.0031 | - |
5.4880 | 171500 | 0.0032 | - |
5.504 | 172000 | 0.0031 | - |
5.52 | 172500 | 0.0031 | - |
5.536 | 173000 | 0.0032 | - |
5.552 | 173500 | 0.0031 | - |
5.568 | 174000 | 0.0032 | - |
5.584 | 174500 | 0.0032 | - |
5.6 | 175000 | 0.0032 | - |
5.616 | 175500 | 0.0032 | - |
5.632 | 176000 | 0.0032 | - |
5.648 | 176500 | 0.0032 | - |
5.664 | 177000 | 0.0032 | - |
5.68 | 177500 | 0.0032 | - |
5.696 | 178000 | 0.0032 | - |
5.712 | 178500 | 0.0033 | - |
5.728 | 179000 | 0.0032 | - |
5.744 | 179500 | 0.0031 | - |
5.76 | 180000 | 0.0033 | 0.0027 |
5.776 | 180500 | 0.0033 | - |
5.792 | 181000 | 0.003 | - |
5.808 | 181500 | 0.0032 | - |
5.824 | 182000 | 0.0032 | - |
5.84 | 182500 | 0.0032 | - |
5.856 | 183000 | 0.0032 | - |
5.872 | 183500 | 0.0033 | - |
5.888 | 184000 | 0.0032 | - |
5.904 | 184500 | 0.0032 | - |
5.92 | 185000 | 0.0032 | - |
5.936 | 185500 | 0.0031 | - |
5.952 | 186000 | 0.0031 | - |
5.968 | 186500 | 0.0031 | - |
5.984 | 187000 | 0.0033 | - |
6.0 | 187500 | 0.0031 | - |
6.016 | 188000 | 0.0028 | - |
6.032 | 188500 | 0.0029 | - |
6.048 | 189000 | 0.003 | - |
6.064 | 189500 | 0.003 | - |
6.08 | 190000 | 0.0029 | 0.0027 |
6.096 | 190500 | 0.0029 | - |
6.112 | 191000 | 0.0029 | - |
6.128 | 191500 | 0.003 | - |
6.144 | 192000 | 0.0029 | - |
6.16 | 192500 | 0.003 | - |
6.176 | 193000 | 0.003 | - |
6.192 | 193500 | 0.0029 | - |
6.208 | 194000 | 0.0029 | - |
6.224 | 194500 | 0.0029 | - |
6.24 | 195000 | 0.003 | - |
6.256 | 195500 | 0.0029 | - |
6.272 | 196000 | 0.0029 | - |
6.288 | 196500 | 0.0029 | - |
6.304 | 197000 | 0.0029 | - |
6.32 | 197500 | 0.003 | - |
6.336 | 198000 | 0.0029 | - |
6.352 | 198500 | 0.0029 | - |
6.368 | 199000 | 0.003 | - |
6.384 | 199500 | 0.0029 | - |
6.4 | 200000 | 0.0029 | 0.0028 |
6.416 | 200500 | 0.0029 | - |
6.432 | 201000 | 0.0029 | - |
6.448 | 201500 | 0.0031 | - |
6.464 | 202000 | 0.0029 | - |
6.48 | 202500 | 0.003 | - |
6.496 | 203000 | 0.003 | - |
6.5120 | 203500 | 0.003 | - |
6.5280 | 204000 | 0.0029 | - |
6.5440 | 204500 | 0.003 | - |
6.5600 | 205000 | 0.0029 | - |
6.576 | 205500 | 0.0028 | - |
6.592 | 206000 | 0.003 | - |
6.608 | 206500 | 0.0029 | - |
6.624 | 207000 | 0.003 | - |
6.64 | 207500 | 0.003 | - |
6.656 | 208000 | 0.003 | - |
6.672 | 208500 | 0.0029 | - |
6.688 | 209000 | 0.003 | - |
6.704 | 209500 | 0.003 | - |
6.72 | 210000 | 0.0029 | 0.0027 |
6.736 | 210500 | 0.0029 | - |
6.752 | 211000 | 0.0029 | - |
6.768 | 211500 | 0.0029 | - |
6.784 | 212000 | 0.0029 | - |
6.8 | 212500 | 0.0029 | - |
6.816 | 213000 | 0.003 | - |
6.832 | 213500 | 0.0028 | - |
6.848 | 214000 | 0.003 | - |
6.864 | 214500 | 0.0029 | - |
6.88 | 215000 | 0.0029 | - |
6.896 | 215500 | 0.0029 | - |
6.912 | 216000 | 0.0029 | - |
6.928 | 216500 | 0.0029 | - |
6.944 | 217000 | 0.0028 | - |
6.96 | 217500 | 0.003 | - |
6.976 | 218000 | 0.003 | - |
6.992 | 218500 | 0.0029 | - |
7.008 | 219000 | 0.0028 | - |
7.024 | 219500 | 0.0028 | - |
7.04 | 220000 | 0.0028 | 0.0027 |
7.056 | 220500 | 0.0027 | - |
7.072 | 221000 | 0.0027 | - |
7.088 | 221500 | 0.0027 | - |
7.104 | 222000 | 0.0026 | - |
7.12 | 222500 | 0.0028 | - |
7.136 | 223000 | 0.0027 | - |
7.152 | 223500 | 0.0028 | - |
7.168 | 224000 | 0.0027 | - |
7.184 | 224500 | 0.0027 | - |
7.2 | 225000 | 0.0028 | - |
7.216 | 225500 | 0.0027 | - |
7.232 | 226000 | 0.0028 | - |
7.248 | 226500 | 0.0027 | - |
7.264 | 227000 | 0.0027 | - |
7.28 | 227500 | 0.0027 | - |
7.296 | 228000 | 0.0027 | - |
7.312 | 228500 | 0.0028 | - |
7.328 | 229000 | 0.0027 | - |
7.344 | 229500 | 0.0028 | - |
7.36 | 230000 | 0.0028 | 0.0027 |
7.376 | 230500 | 0.0028 | - |
7.392 | 231000 | 0.0028 | - |
7.408 | 231500 | 0.0028 | - |
7.424 | 232000 | 0.0027 | - |
7.44 | 232500 | 0.0027 | - |
7.456 | 233000 | 0.0028 | - |
7.4720 | 233500 | 0.0028 | - |
7.4880 | 234000 | 0.0028 | - |
7.504 | 234500 | 0.0028 | - |
7.52 | 235000 | 0.0028 | - |
7.536 | 235500 | 0.0027 | - |
7.552 | 236000 | 0.0027 | - |
7.568 | 236500 | 0.0028 | - |
7.584 | 237000 | 0.0028 | - |
7.6 | 237500 | 0.0027 | - |
7.616 | 238000 | 0.0028 | - |
7.632 | 238500 | 0.0026 | - |
7.648 | 239000 | 0.0027 | - |
7.664 | 239500 | 0.0027 | - |
7.68 | 240000 | 0.0028 | 0.0027 |
7.696 | 240500 | 0.0028 | - |
7.712 | 241000 | 0.0027 | - |
7.728 | 241500 | 0.0028 | - |
7.744 | 242000 | 0.0027 | - |
7.76 | 242500 | 0.0027 | - |
7.776 | 243000 | 0.0027 | - |
7.792 | 243500 | 0.0028 | - |
7.808 | 244000 | 0.0027 | - |
7.824 | 244500 | 0.0027 | - |
7.84 | 245000 | 0.0027 | - |
7.856 | 245500 | 0.0029 | - |
7.872 | 246000 | 0.0028 | - |
7.888 | 246500 | 0.0027 | - |
7.904 | 247000 | 0.0026 | - |
7.92 | 247500 | 0.0027 | - |
7.936 | 248000 | 0.0027 | - |
7.952 | 248500 | 0.0027 | - |
7.968 | 249000 | 0.0028 | - |
7.984 | 249500 | 0.0027 | - |
8.0 | 250000 | 0.0028 | 0.0027 |
Framework Versions
- Python: 3.10.16
- Sentence Transformers: 4.1.0
- Transformers: 4.51.3
- PyTorch: 2.6.0+cu118
- Accelerate: 1.6.0
- Datasets: 3.5.0
- Tokenizers: 0.21.1
Citation
BibTeX
Sentence Transformers
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
ContrastiveLoss
@inproceedings{hadsell2006dimensionality,
author={Hadsell, R. and Chopra, S. and LeCun, Y.},
booktitle={2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'06)},
title={Dimensionality Reduction by Learning an Invariant Mapping},
year={2006},
volume={2},
number={},
pages={1735-1742},
doi={10.1109/CVPR.2006.100}
}