VAE Layer for the Research Gated Latent Reasoning Loop (tentative name)

Please refer to our code: https://github.com/elliot-zzh/from-transparent-to-opaque. The project is under construction, and we will publish the paper once we are ready.

This is the pretrained VAE layer for the research Gated Latent Reasoning Loop (tentative name).

There are three VAEs, each trained for and applied to a different base model.

The VAE consists of two linear layers: a compressor, which projects the model's hidden states into a lower-dimensional latent space, and an uncompressor, which projects latent vectors back to the hidden dimension.
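As a rough illustration of that two-layer structure, here is a minimal numpy sketch of a linear VAE with a compressor and an uncompressor. The dimensions, layer names, and the choice to have the compressor emit both a mean and a log-variance are assumptions for illustration, not the repository's actual implementation (1536 is the hidden size of Qwen2.5-1.5B; the latent size of 256 is made up).

```python
import numpy as np

class LinearVAE:
    """Illustrative sketch: a VAE whose encoder and decoder are single linear layers."""

    def __init__(self, hidden_dim, latent_dim, seed=0):
        rng = np.random.default_rng(seed)
        # Compressor: one linear layer mapping hidden states to latent mean and log-variance.
        self.W_enc = rng.normal(0.0, 0.02, (hidden_dim, 2 * latent_dim))
        # Uncompressor: one linear layer mapping sampled latents back to the hidden size.
        self.W_dec = rng.normal(0.0, 0.02, (latent_dim, hidden_dim))
        self.latent_dim = latent_dim
        self.rng = rng

    def encode(self, h):
        stats = h @ self.W_enc
        return stats[..., : self.latent_dim], stats[..., self.latent_dim :]

    def reparameterize(self, mu, logvar):
        # Standard VAE reparameterization trick: z = mu + sigma * eps.
        eps = self.rng.standard_normal(mu.shape)
        return mu + np.exp(0.5 * logvar) * eps

    def decode(self, z):
        return z @ self.W_dec

# Compress a batch of 4 hidden states and reconstruct them.
vae = LinearVAE(hidden_dim=1536, latent_dim=256)
h = np.zeros((4, 1536))
mu, logvar = vae.encode(h)
z = vae.reparameterize(mu, logvar)
out = vae.decode(z)
print(out.shape)  # → (4, 1536)
```

The compressor here doubles as the VAE's recognition network (producing the posterior's mean and log-variance), while the uncompressor is the generative network; in training, a reconstruction loss plus a KL term on (mu, logvar) would be optimized.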


Model tree for ethangoh7086cmd/gated-latent-reasoning-loop-vae

Base model: Qwen/Qwen2.5-1.5B (this model is a finetune)

Dataset used to train ethangoh7086cmd/gated-latent-reasoning-loop-vae