# nonconvex-optimization

Here are 34 public repositories matching this topic...

Uno
gradient-descent-sgd-solver-course

Stochastic Gradient Descent (SGD) is an optimization algorithm that updates model parameters iteratively using small, random subsets (mini-batches) of the data rather than the entire dataset. It significantly speeds up training on large datasets, though the random sampling introduces noise that can, in some cases, cause heavy fluctuations in the loss. Topics: deep-learning, neural-networks, solver.

  • Updated Mar 17, 2026
  • Jupyter Notebook
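The description above can be sketched in code. Below is a minimal, illustrative mini-batch SGD loop for linear least squares, written for this page and not taken from the repository itself; all names and hyperparameters (`lr`, `batch_size`, `epochs`) are assumptions.

```python
import numpy as np

def sgd_linear_regression(X, y, lr=0.01, batch_size=8, epochs=100, seed=0):
    """Fit w minimizing mean squared error via mini-batch SGD (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        # Shuffle once per epoch, then walk through the data in mini-batches.
        idx = rng.permutation(n)
        for start in range(0, n, batch_size):
            batch = idx[start:start + batch_size]
            Xb, yb = X[batch], y[batch]
            # Gradient of MSE estimated on this mini-batch only -- this is
            # the noisy-but-cheap estimate that makes SGD fast on large data.
            grad = (2.0 / len(batch)) * Xb.T @ (Xb @ w - yb)
            w -= lr * grad
    return w

# Usage: recover known weights from slightly noisy synthetic data.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=200)
w = sgd_linear_regression(X, y)
```

Because each gradient is estimated from a small random batch, successive updates fluctuate around the true descent direction; a smaller learning rate or larger batch size damps that noise at the cost of slower progress.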
