S'MoRE: Structural Mixture of Residual Experts for LLM Fine-tuning • Paper • 2504.06426 • Published Apr 8, 2025
Article: What is MoE 2.0? Update Your Knowledge about Mixture-of-experts • Apr 27, 2025
LLM-Rec: Personalized Recommendation via Prompting Large Language Models • Paper • 2307.15780 • Published Jul 24, 2023
Decoupling the Depth and Scope of Graph Neural Networks • Paper • 2201.07858 • Published Jan 19, 2022
GraphSAINT: Graph Sampling Based Inductive Learning Method • Paper • 1907.04931 • Published Jul 10, 2019