Gradient descent algorithms such as accelerated gradient descent and stochastic gradient descent are widely employed to solve optimization problems in statistics and machine learning. This talk will present a new asymptotic analysis of these algorithms via continuous-time ordinary or stochastic differential equations. I will illustrate how this analysis yields a novel unified framework for jointly studying two kinds of asymptotics: the dynamic behavior of the algorithms as the number of iterations grows, and the large-sample behavior of the optimization solutions (i.e., statistical decision rules such as estimators and classifiers) that the algorithms are applied to compute. I will also discuss the implications of these results for deep learning.
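
For orientation, the following is a standard sketch of the continuous-time limits in question, assuming a smooth objective f and step size \eta; the exact scalings and equations developed in the talk may differ.

% Gradient descent on a smooth objective f with step size \eta:
%   x_{k+1} = x_k - \eta \nabla f(x_k).
% Letting \eta -> 0 with t = k\eta held fixed, the iterates track the gradient flow ODE
\[
  \dot{X}(t) = -\nabla f\bigl(X(t)\bigr), \qquad X(0) = x_0.
\]
% Nesterov's accelerated gradient descent is similarly modeled by a
% second-order ODE (Su, Boyd, and Candes, 2016):
\[
  \ddot{X}(t) + \frac{3}{t}\,\dot{X}(t) + \nabla f\bigl(X(t)\bigr) = 0,
\]
% and stochastic gradient descent, with gradient-noise covariance \Sigma(x),
% is modeled to first order by the SDE
\[
  \mathrm{d}X(t) = -\nabla f\bigl(X(t)\bigr)\,\mathrm{d}t
                 + \sqrt{\eta\,\Sigma\bigl(X(t)\bigr)}\,\mathrm{d}W(t),
\]
% where W(t) is a standard Brownian motion.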