Showing 1–11 of 11 results
Date / Name
Jun 11, 2025 / DIVE into MoE: Diversity-Enhanced Reconstruction of Large Language Models from Dense into Mixture-of-Experts
Jul 10, 2024 / Invisible sweat sensor: ultrathin membrane mimics skin for stress monitoring
Dec 11, 2025 / Blink: Dynamic Visual Token Resolution for Enhanced Multimodal Understanding
Sep 26, 2025 / Elastic MoE: Unlocking the Inference-Time Scalability of Mixture-of-Experts
Jan 6, 2026 / MemRL: Self-Evolving Agents via Runtime Reinforcement Learning on Episodic Memory
Oct 16, 2024 / Meta-Chunking: Learning Text Segmentation and Semantic Completion via Logical Perception
Mar 5, 2026 / Mixture of Universal Experts: Scaling Virtual Width via Depth-Width Transformation
Sep 15, 2025 / CBP-Tuning: Efficient Local Customization for Black-box Large Language Models
Oct 16, 2024 / FTII-Bench: A Comprehensive Multimodal Benchmark for Flow Text with Image Insertion
May 28, 2025 / Xinyu AI Search: Enhanced Relevance and Comprehensive Results with Rich Answer Presentations
Jul 9, 2025 / A derivative-free Levenberg-Marquardt method for sparse nonlinear least squares problems