Optimization for Data Science

Convex sets; Convex functions; Optimization problems; Gradient descent method; Subgradient method; Newton's method; Linear and quadratic programming; Duality; Karush-Kuhn-Tucker optimality conditions; Interior point methods; Coordinate descent; Conjugate gradient; Trust region methods; Stochastic gradient descent. Applications to problems in statistics and machine learning.
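
As a minimal sketch of the first of these iterative methods, the Python snippet below illustrates gradient descent on a differentiable objective; the names gradient_descent and grad_f, the step size, and the quadratic example are illustrative assumptions, not prescribed course material.

    import numpy as np

    def gradient_descent(grad_f, x0, step=0.1, tol=1e-8, max_iter=1000):
        # Iterate x_{k+1} = x_k - step * grad_f(x_k) until the gradient is small.
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            g = grad_f(x)
            if np.linalg.norm(g) < tol:
                break
            x = x - step * g
        return x

    # Example: minimize f(x) = ||x - 1||^2, whose gradient is 2(x - 1).
    x_star = gradient_descent(lambda x: 2 * (x - np.ones(2)), x0=np.zeros(2))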

Basic Information

Workload
60 hours
Requirements
Linear Algebra, Probability Theory, and Multivariable Calculus

Mandatory Bibliography:

  • Boyd, Stephen, and Lieven Vandenberghe. Convex Optimization. Cambridge, UK: Cambridge University Press, 2004.
  • Izmailov, Alexey, and Mikhail Solodov. Optimization, Volume 1: Optimality Conditions, Elements of Convex Analysis and Duality. IMPA, 2009.
  • Izmailov, Alexey, and Mikhail Solodov. Optimization, Volume 2: Computational Methods. IMPA, 2014.

Complementary Bibliography:

  • Kecman, Vojislav. Learning and Soft Computing: Support Vector Machines, Neural Networks, and Fuzzy Logic Models. MIT Press, 2001.
  • Hastie, Trevor, Robert Tibshirani, and Jerome Friedman. The Elements of Statistical Learning. Springer, 2009.
  • Nocedal, Jorge, and Stephen J. Wright. Numerical Optimization. 2nd ed. Springer, 2006.
  • Pedregal, Pablo. Introduction to Optimization. Springer, 2004.
  • Bubeck, Sébastien. "Convex Optimization: Algorithms and Complexity." Foundations and Trends in Machine Learning 8.3-4 (2015): 231-357.