---
library_name: transformers
license: apache-2.0
base_model: Helsinki-NLP/opus-mt-ja-en
tags:
  - generated_from_trainer
metrics:
  - bleu
model-index:
  - name: ruby-rails-fine-tuned-ja-en
    results: []
---

# ruby-rails-fine-tuned-ja-en

This model is a fine-tuned version of [Helsinki-NLP/opus-mt-ja-en](https://huggingface.co/Helsinki-NLP/opus-mt-ja-en). It achieves the following results on the evaluation set:

- Loss: 1.6714
- Bleu: 0.3300
- Chrf: 64.8493

## Model description

This model is fine-tuned for Japanese-to-English translation of Ruby and Ruby on Rails documentation.
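
A minimal usage sketch with the Transformers `pipeline` API, assuming the model is published under the hub id `morinoko-inari/ruby-rails-fine-tuned-ja-en` (the input sentence is purely illustrative):

```python
from transformers import pipeline

# Assumed hub id; adjust if the model lives under a different repo name.
model_id = "morinoko-inari/ruby-rails-fine-tuned-ja-en"

translator = pipeline("translation", model=model_id)

# Illustrative Japanese sentence about Rails migrations.
text = "マイグレーションを実行するには bin/rails db:migrate を使います。"
result = translator(text, max_length=128)
print(result[0]["translation_text"])
```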

## Intended uses & limitations

The training dataset (see below) is still very small, so only minor improvements over the base model should be expected.

## Training and evaluation data

Trained on a custom dataset built from Ruby and Ruby on Rails documentation: https://huggingface.co/datasets/morinoko-inari/ruby-rails-ja-en
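
A sketch of loading the dataset with the Datasets library; the split and field names below are assumptions and may differ from the actual dataset configuration:

```python
from datasets import load_dataset

# Load the Ruby/Rails JA-EN parallel corpus from the Hub.
dataset = load_dataset("morinoko-inari/ruby-rails-ja-en")

# Inspect the available splits and one example (field names may vary).
print(dataset)
print(dataset["train"][0])
```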

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 3
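
The hyperparameters above correspond roughly to the following `Seq2SeqTrainingArguments`; this is a sketch rather than the exact training script (the output path and evaluation settings are assumptions):

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="ruby-rails-fine-tuned-ja-en",  # assumed output path
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch",
    lr_scheduler_type="linear",
    num_train_epochs=3,
    eval_strategy="epoch",        # assumed: per-epoch evaluation, matching the results table
    predict_with_generate=True,   # needed to compute BLEU/chrF during evaluation
)
```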

### Training results

| Training Loss | Epoch | Step | Validation Loss | Bleu   | Chrf    |
|:-------------:|:-----:|:----:|:---------------:|:------:|:-------:|
| No log        | 1.0   | 23   | 1.7593          | 0.3283 | 64.8523 |
| No log        | 2.0   | 46   | 1.6915          | 0.3260 | 64.6705 |
| No log        | 3.0   | 69   | 1.6714          | 0.3300 | 64.8493 |

### Framework versions

- Transformers 4.51.3
- Pytorch 2.6.0
- Datasets 3.5.0
- Tokenizers 0.21.1