
Abstract

This paper proposes a stochastic proximal point method for solving a stochastic convex composite optimization problem. High-probability results in stochastic optimization typically hinge on restrictive assumptions on the stochastic gradient noise, such as sub-Gaussian tails. Assuming only weak conditions, such as bounded variance of the stochastic gradient, this paper establishes a low sample complexity for obtaining a high-probability guarantee on the convergence of the proposed method. A further notable aspect of this work is the development of a subroutine for solving the proximal subproblem, which also serves as a novel variance reduction technique.
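To make the abstract's setup concrete, the sketch below shows one plausible form of a stochastic proximal point iteration: each outer step approximately solves a regularized subproblem built from minibatch gradient estimates, and an inner loop plays the role of the subproblem subroutine. All names, step sizes, and the choice of plain gradient descent for the inner solver are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def stochastic_prox_point(grad_sample, x0, gamma=0.5, outer_iters=50,
                          inner_iters=20, batch_size=8, inner_lr=0.1, rng=None):
    """Hypothetical sketch of a stochastic proximal point method (SPPM).

    Each outer step approximately solves the proximal subproblem
        min_x  f_hat(x) + (1 / (2 * gamma)) * ||x - x_k||^2,
    where f_hat is a minibatch estimate of the objective; the inner
    loop is the "subroutine" (here, plain gradient descent -- an
    assumed choice, not the paper's construction).
    """
    rng = np.random.default_rng(rng)
    x = np.asarray(x0, dtype=float)
    for _ in range(outer_iters):
        anchor = x.copy()  # proximal center x_k for this outer step
        for _ in range(inner_iters):
            # Minibatch gradient of f_hat plus the proximal term's gradient.
            g = np.mean([grad_sample(x, rng) for _ in range(batch_size)], axis=0)
            g += (x - anchor) / gamma
            x -= inner_lr * g
    return x

# Toy instance: minimize E[0.5 * ||x - (b + noise)||^2]; the minimizer is x = b.
b = np.array([1.0, -2.0])

def grad_sample(x, rng):
    return x - (b + 0.1 * rng.standard_normal(b.shape))

x_star = stochastic_prox_point(grad_sample, x0=np.zeros(2), rng=0)
```

Averaging a minibatch inside each subproblem damps the gradient noise, which is loosely the variance-reduction effect the abstract attributes to its (more sophisticated) subroutine.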

Details

Title
Variance Reduction and Low Sample Complexity in Stochastic Optimization via Proximal Point Method
Publication title
arXiv.org; Ithaca
Publication year
2024
Publication date
Feb 14, 2024
Section
Computer Science; Mathematics; Statistics
Publisher
Cornell University Library, arXiv.org
Source
arXiv.org
Place of publication
Ithaca
Country of publication
United States
University/institution
Cornell University Library, arXiv.org
e-ISSN
2331-8422
Source type
Working Paper
Language of publication
English
Document type
Working Paper
Publication history
Online publication date
2024-02-15
Milestone dates
2024-02-14 (Submission v1)
First posting date
15 Feb 2024
ProQuest document ID
2926944603
Document URL
https://www.proquest.com/working-papers/variance-reduction-low-sample-complexity/docview/2926944603/se-2?accountid=208611
Copyright
© 2024. This work is published under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
Last updated
2024-02-16
Database
ProQuest One Academic