graduate student descent

From Wiktionary, the free dictionary



Modeled on gradient descent. Attributed to David A. McAllester.[1][2]


graduate student descent (uncountable)

  1. (machine learning, humorous) The process of choosing hyperparameters manually and in an ad-hoc manner, typical of work assigned to a graduate student.
    • 2011 April 6, Kat Scott, Twitter[3], archived from the original on 2022-07-04:
      Phrase of the day, "Graduate Student Descent." Optimization through graduate student tweaking.
    • 2019, Chip Huyen, “Design a machine learning system”, in Chip Huyen [personal website][4], retrieved 2021-09-27:
      […] people without real-world experience often ignore systematic approaches to hyperparameter tuning in favor of manual, gut-feeling approach. The most popular method is arguably 'Graduate Student Descent (GSD) […]
    • 2019 April 17, Oguzhan Gencoglu with Mark van Gils, Esin Guldogan, Chamin Morikawa, Mehmet Süzen, Mathias Gruber, Jussi Leinonen, and Heikki Huttunen, “HARK Side of Deep Learning - From Grad Student Descent to Automated Machine Learning”, in arXiv[5], page 2:
      Instead of hypothesis-forming based on theory, extensive research on previous studies and/or reflection against the existing domain knowledge, grad student descent […] is applied.
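The quotations above contrast ad-hoc manual tweaking with systematic hyperparameter search. A minimal sketch of that contrast, using a hypothetical validation-loss surface (the function, hyperparameter names, and candidate values are all illustrative, not from any cited source):

```python
import itertools

# Hypothetical validation loss over two hyperparameters, with its
# minimum at learning_rate=0.01, batch_size=64.
def validation_loss(learning_rate, batch_size):
    return (learning_rate - 0.01) ** 2 + ((batch_size - 64) / 64) ** 2

# "Graduate student descent": a handful of hand-picked trials,
# chosen by gut feeling rather than any systematic procedure.
manual_trials = [(0.1, 32), (0.05, 32), (0.05, 128)]
best_manual = min(manual_trials, key=lambda hp: validation_loss(*hp))

# A systematic alternative: exhaustive grid search over the
# Cartesian product of candidate values.
grid = itertools.product([0.001, 0.01, 0.05, 0.1], [16, 32, 64, 128])
best_grid = min(grid, key=lambda hp: validation_loss(*hp))

print("manual best:", best_manual)
print("grid best:  ", best_grid)  # (0.01, 64), the grid point at the minimum
```

Here the grid search covers the minimum while the hand-picked trials happen to miss it; in practice the manual approach depends entirely on the tuner's intuition.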


  1. ^ Nicolas Pinto (2011 December 16) “High-Performance Computing Needs Machine Learning... And Vice Versa”, in SlideShare[1]: "This is graduate student descent" - David McAllester
  2. ^ Tim Vieira (2015 April 29) Twitter[2], retrieved 2021-09-27: @bmorphism @ryan_p_adams I got "Graduate Descent" from David McAllester in 2011.