School of Mathematical Sciences

Probability and Statistics Seminar

Source: School of Mathematical Sciences    Posted: 2025-09-17

Title: ReHLine: Regularized Composite ReLU-ReHU Loss Minimization with Linear Computation and Linear Convergence

Speaker: Ben Dai, Assistant Professor (The Chinese University of Hong Kong)

Time: Friday, September 19, 2025, 11:00 am - 12:00 pm

Venue: Hainayuan (海纳苑) 2102

Abstract: Empirical risk minimization (ERM) is a crucial framework that offers a general approach to handling a broad range of machine learning tasks. In this paper, we propose a novel algorithm, called ReHLine, for minimizing a set of regularized ERMs with convex piecewise linear-quadratic loss functions and optional linear constraints. The proposed algorithm can effectively handle diverse combinations of loss functions, regularizations, and constraints, making it particularly well-suited for complex domain-specific problems. Examples of such problems include FairSVM, elastic net regularized quantile regression, Huber minimization, etc. In addition, ReHLine enjoys a provable linear convergence rate and exhibits a per-iteration computational complexity that scales linearly with the sample size. The algorithm is implemented with both Python and R interfaces, and its performance is benchmarked on various tasks and datasets. Our experimental results demonstrate that ReHLine significantly surpasses generic optimization solvers in terms of computational efficiency on large-scale datasets. Moreover, it also outperforms specialized solvers such as liblinear in SVM, hqreg in Huber minimization, and lightning (SAGA, SAG, SDCA, SVRG) in smooth SVM, exhibiting exceptional flexibility and efficiency.
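
The composite ReLU-ReHU loss family mentioned in the abstract can be made concrete with a small sketch. The NumPy snippet below is an illustration only, not the ReHLine package's API (the helper names relu and rehu are mine); it implements the two building blocks, with the ReHU definition as given in the ReHLine paper, and shows how the SVM hinge loss and the Huber loss decompose into them.

```python
import numpy as np

def relu(z):
    # ReLU(z) = max(z, 0): the building block of piecewise-linear losses such as the hinge.
    return np.maximum(z, 0.0)

def rehu(z, tau):
    # ReHU_tau(z) = 0 for z <= 0, z^2/2 for 0 < z <= tau, tau*(z - tau/2) for z > tau.
    z = np.asarray(z, dtype=float)
    return np.where(z <= 0.0, 0.0,
                    np.where(z <= tau, 0.5 * z ** 2, tau * (z - 0.5 * tau)))

# Hinge loss (SVM): a single ReLU term applied to an affine transform of the score.
y, score = 1.0, 0.3
hinge = relu(1.0 - y * score)            # 0.7

# Huber loss: a sum of two ReHU terms, ReHU_tau(r) + ReHU_tau(-r).
r, tau = 2.5, 1.0
huber = rehu(r, tau) + rehu(-r, tau)     # tau * (|r| - tau/2) = 2.0

print(hinge, huber)
```

The pinball loss of quantile regression decomposes the same way, as tau * ReLU(r) + (1 - tau) * ReLU(-r), which is what allows a single solver of this form to cover problems such as FairSVM, elastic net regularized quantile regression, and Huber minimization.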


About the speaker: Ben Dai is an Assistant Professor in the Department of Statistics at The Chinese University of Hong Kong. His main research interests include statistical consistency, theory-driven machine learning methods, theoretical foundations of machine learning, black-box significance testing, statistical computing, and software development.

