The 8088
arXiv cs.LG AI Research Apr 21

Lower Bounds and Proximally Anchored SGD for Non-Convex Minimization Under Unbounded Variance

★★☆☆☆ significance 2/5

The paper analyzes the complexity of non-convex stochastic optimization under the Blum-Gladyshev condition, which permits unbounded gradient variance. It establishes new information-theoretic lower bounds for this setting and introduces the Proximally Anchored Stochastic Approximation (PASTA) framework to mitigate variance explosion.
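The paper's exact update rule isn't reproduced here; as a rough illustration of the "proximal anchoring" idea, the sketch below assumes each step adds a proximal pull toward a periodically refreshed anchor point, which damps large stochastic gradients. The function names, step sizes, and anchor schedule are hypothetical, not taken from the paper.

```python
import numpy as np

def pasta_sgd(grad_fn, x0, steps=1000, eta=0.01, lam=1.0, anchor_every=50, rng=None):
    """Hypothetical sketch of a proximally anchored SGD loop.

    Each step solves
        x_{k+1} = argmin_x <g_k, x> + ||x - x_k||^2 / (2*eta) + (lam/2) * ||x - a||^2,
    which has the closed form used below. The proximal term toward the anchor
    `a` bounds how far a single noisy gradient can move the iterate -- the
    intuition behind mitigating unbounded variance.
    """
    rng = rng or np.random.default_rng(0)
    x = np.asarray(x0, dtype=float)
    anchor = x.copy()
    for k in range(steps):
        g = grad_fn(x, rng)                      # stochastic gradient oracle
        x = (x - eta * g + eta * lam * anchor) / (1.0 + eta * lam)
        if (k + 1) % anchor_every == 0:          # periodically re-anchor
            anchor = x.copy()
    return x

# Toy usage: gradients of f(x) = ||x||^2 / 2 corrupted by Student-t noise
# with 2 degrees of freedom, whose variance is infinite.
noisy_grad = lambda x, rng: x + rng.standard_t(df=2, size=x.shape)
x_final = pasta_sgd(noisy_grad, x0=np.ones(5))
```

The anchor acts as a slowly moving trust point: between refreshes, every iterate is shrunk toward it, so heavy-tailed gradient noise cannot push the trajectory arbitrarily far in one step.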

Why it matters: Establishing theoretical limits for optimization under unbounded variance addresses fundamental stability concerns in training large-scale non-convex models.
Read the original at arXiv cs.LG

Tags

#optimization #sgd #non-convex #stochastic-gradient-descent #complexity
