S'MoRE: Structural Mixture of Residual Experts for LLM Fine-tuning • Paper • 2504.06426 • Published Apr 8, 2025
What is MoE 2.0? Update Your Knowledge about Mixture-of-experts • Article • By Kseniase and 1 other
LLM-Rec: Personalized Recommendation via Prompting Large Language Models • Paper • 2307.15780 • Published Jul 24, 2023
GraphSAINT: Graph Sampling Based Inductive Learning Method • Paper • 1907.04931 • Published Jul 10, 2019