The title here is wrong. The title and headings in the article itself are right: ONLINE gradient descent.

It's specifically not stochastic. From the article:

Online gradient descent

Finally, we have enough experience to implement online gradient descent. To keep things simple, we will use a very vanilla version:

- Constant learning rate, as opposed to a schedule.

- Single epoch, we only do one pass on the data.

- Not stochastic: the rows are not shuffled. ⇠ ⇠ ⇠ ⇠

- Squared loss, which is the standard loss for regression.

- No gradient clipping.

- No weight regularisation.

- No intercept term.
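
For concreteness, here's a minimal Python sketch of that vanilla variant (not the article's actual code; the function and variable names are illustrative):

    import numpy as np

    def online_gradient_descent(X, y, learning_rate=0.01):
        # One pass over the rows, in order (no shuffling), constant learning
        # rate, squared loss, no intercept, no clipping, no regularisation.
        weights = np.zeros(X.shape[1])
        for x_i, y_i in zip(X, y):
            y_pred = weights @ x_i
            gradient = 2 * (y_pred - y_i) * x_i  # d/dw of (y_pred - y_i)^2
            weights -= learning_rate * gradient
        return weights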




Hehe, I was wondering if someone would catch that. Rest assured, I know the difference between online and stochastic gradient descent. I admit I used "stochastic" on Hacker News because I thought it would generate more engagement.


Then just call it "Non-stochastic Gradient Descent"? You can't editorialize titles, per the HN guidelines:

https://news.ycombinator.com/newsguidelines.html


Thanks, I wasn't aware.


My pleasure. You can still edit the title, by the way ;-)


What are some adversarial cases for gradient descent? And what sort of provenance information (e.g. DVC.org or W3C PROV) should be tracked for a production ML workflow?

Gradient descent: https://en.wikipedia.org/wiki/Gradient_descent

Stochastic gradient descent: https://en.wikipedia.org/wiki/Stochastic_gradient_descent

Online machine learning: https://en.wikipedia.org/wiki/Online_machine_learning

adversarial gradient descent site:github.com inurl:awesome : https://www.google.com/search?q=awesome+adversarial+gradient...

https://github.com/EthicalML/awesome-production-machine-lear...

Robust machine learning: https://en.wikipedia.org/wiki/Robustness_(computer_science)#...

Robust gradient descent
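
One concrete adversarial case, tying back to the article's "no gradient clipping" bullet: with squared loss, a single poisoned row with extreme feature values can blow the weights up in one update, while clipping the gradient bounds the damage. A rough Python sketch (illustrative only, not from any of the links above):

    import numpy as np

    def sgd_step(weights, x, y, lr=0.01, clip=None):
        # One squared-loss update; optionally clip the gradient's L2 norm.
        grad = 2 * (weights @ x - y) * x
        if clip is not None and np.linalg.norm(grad) > clip:
            grad = grad * (clip / np.linalg.norm(grad))
        return weights - lr * grad

    w = sgd_step(np.zeros(2), np.array([1.0, 1.0]), 1.0)     # ordinary row
    w_bad = sgd_step(w, np.array([1e6, 1e6]), 0.0)           # poisoned row: huge jump
    w_ok = sgd_step(w, np.array([1e6, 1e6]), 0.0, clip=1.0)  # clipped: step bounded by lr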


We built model & data provenance into our open source ML library, though it's admittedly not the W3C PROV standard. There were a few gaps in it until we built an automated reproducibility system on top of it, but now it's pretty solid for all the algorithms we implement. Unfortunately, some of the things we wrap (notably TensorFlow) aren't reproducible enough due to some unfixed bugs. There's an overview of the provenance system in this reprise of my JavaOne talk: https://www.youtube.com/watch?v=GXOMjq2OS_c. The library is on GitHub: https://github.com/oracle/tribuo.



