ct2-transformers-converter --model ./nllb-200-distilled-600M --output_dir ./nllb-200-distilled-600M-ct2-float16 --quantization float16 --force
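Once converted, the output directory can be loaded with the `ctranslate2` Python API. A minimal sketch, assuming the output path above, the original `facebook/nllb-200-distilled-600M` tokenizer, and NLLB's language codes (`eng_Latn`, `fra_Latn` here are arbitrary examples); it skips gracefully if the converted model or the dependencies are not present:

```python
# Sketch: translating with the converted CTranslate2 model.
# Assumptions: `pip install ctranslate2 transformers sentencepiece`,
# and the converter command above has produced MODEL_DIR.
import os

try:
    import ctranslate2
    import transformers
    HAVE_DEPS = True
except ImportError:
    HAVE_DEPS = False

MODEL_DIR = "./nllb-200-distilled-600M-ct2-float16"


def drop_language_prefix(hypothesis):
    """NLLB decoding starts with the target-language token; drop it."""
    return hypothesis[1:]


def translate(text, src_lang="eng_Latn", tgt_lang="fra_Latn"):
    # float16 needs a GPU; on CPU, CTranslate2 falls back with a warning.
    translator = ctranslate2.Translator(MODEL_DIR, compute_type="float16")
    tokenizer = transformers.AutoTokenizer.from_pretrained(
        "facebook/nllb-200-distilled-600M", src_lang=src_lang
    )
    tokens = tokenizer.convert_ids_to_tokens(tokenizer.encode(text))
    result = translator.translate_batch([tokens], target_prefix=[[tgt_lang]])
    out = drop_language_prefix(result[0].hypotheses[0])
    return tokenizer.decode(tokenizer.convert_tokens_to_ids(out))


if HAVE_DEPS and os.path.isdir(MODEL_DIR):
    print(translate("Hello, world!"))
else:
    print("Converted model or dependencies not found; run the converter first.")
```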
Downloads last month: 11
This model isn't deployed by any Inference Provider.
Model tree for Derur/nllb-200-distilled-600M-ct2-float16
Base model: facebook/nllb-200-distilled-600M