1.5B model please
Please could I use the 1.5B model? It would ease development: I could work faster on a lower-VRAM computer with the smaller model, then deploy with a larger model.
Hi! We have no plans to release the 1.5B model as it hasn't been well optimized. Since we've found that the BAGEL-1.5B model has difficulties with image manipulation, we allocated more resources to optimize the 7B model.
That is completely fine. I do not mind that the model is not well optimized.
It is understood that your 7B model is much better.
Please consider releasing it to assist developers.
--
I want to develop software that uses the 7B model.
To work with the 7B model I must rent a large, powerful GPU.
To work with the 1.5B model during development, I can use my home computer, then deploy with a rented GPU, saving money and time. It makes life a lot easier. I hope my use case makes sense!
It's like Flux: we have to use GGUF versions.
Can you release quants or something that runs in 32 GB, 24 GB, or 16 GB of VRAM so consumers can use this?
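For rough sizing, weight memory scales with parameter count times bytes per parameter. A quick back-of-the-envelope sketch (weights only; activations, KV cache, and framework overhead add several GB on top, so these are lower bounds, not exact requirements):

```python
def weight_vram_gib(params_billions: float, bits_per_param: float) -> float:
    """Approximate GiB needed to hold model weights alone."""
    total_bytes = params_billions * 1e9 * bits_per_param / 8
    return total_bytes / 1024**3

# A 7B model at common precisions:
for bits, label in [(16, "fp16"), (8, "int8"), (4, "4-bit")]:
    print(f"{label}: ~{weight_vram_gib(7, bits):.1f} GiB")
```

By this estimate, 7B weights need roughly 13 GiB at fp16 but only about 3.3 GiB at 4-bit, which is why quantized releases would open the model up to 16 GB and 24 GB consumer cards.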
Requests for GGUFs and quants should go in a separate thread.
I would like to run this myself on my computer with 24 GB of VRAM.
+1 on the 1.5B model! It would make development much easier, and poorer performance is OK.