Neural Thickets: Diverse Task Experts Are Dense Around Pretrained Weights Paper • 2603.12228 • Published Mar 12 • 12
Meta-Reinforcement Learning with Self-Reflection for Agentic Search Paper • 2603.11327 • Published Mar 11 • 10
Training Language Models via Neural Cellular Automata Paper • 2603.10055 • Published Mar 9 • 8
Attention Sinks Are Provably Necessary in Softmax Transformers: Evidence from Trigger-Conditional Tasks Paper • 2603.11487 • Published Mar 12 • 2