S'MoRE: Structural Mixture of Residual Experts for LLM Fine-tuning • Paper 2504.06426 • Published Apr 8, 2025
Article: What is MoE 2.0? Update Your Knowledge about Mixture-of-Experts • By Kseniase and 1 other • Published Apr 27, 2025
LLM-Rec: Personalized Recommendation via Prompting Large Language Models • Paper 2307.15780 • Published Jul 24, 2023