Introduction to neural network and machine translation

2023/2024 academic year
Taught in English (ENG)
8 credits

Instructors

Course Syllabus

Abstract

The course introduces basic concepts of neural networks, deep learning and machine translation.
Learning Objectives

  • The purpose of the course is to develop students' ability to use neural networks in their research and applied projects.
Expected Learning Outcomes

  • Is able to use word embedding models (see the sketch after this list)
  • Is able to use supervised learning
  • Understands the advantages and disadvantages of neural networks
  • Can create and use convolutional neural networks
  • Can create and use recurrent neural networks
  • Can create and use attention-based neural networks
  • Can pretrain and fine-tune neural networks and their components
  • Understands the principles of large language models and knows how to use them to solve applied problems
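
The first outcome above can be illustrated in a few lines of code. The sketch below is a minimal example, assuming NumPy; the toy four-dimensional vectors are invented for illustration, whereas a real word2vec model learns vectors of a few hundred dimensions from a corpus. It checks the classic "king - man + woman ≈ queen" analogy with cosine similarity:

    import numpy as np

    # Toy word vectors, invented for illustration only.
    embeddings = {
        "king":  np.array([0.80, 0.65, 0.10, 0.20]),
        "queen": np.array([0.78, 0.60, 0.90, 0.22]),
        "man":   np.array([0.70, 0.10, 0.05, 0.10]),
        "woman": np.array([0.68, 0.05, 0.85, 0.12]),
    }

    def cosine(u, v):
        # Cosine similarity: direction agreement, independent of vector length.
        return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

    # The classic analogy: king - man + woman should land near queen.
    target = embeddings["king"] - embeddings["man"] + embeddings["woman"]
    for word, vec in embeddings.items():
        print(f"{word:>6}: {cosine(target, vec):.3f}")  # queen scores highest
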
Course Contents

  • Word embedding, word2vec model
  • Supervised learning, logistic regression, multilayer perceptron
  • Overfitting problem, regularization
  • Convolutional neural networks
  • Recurrent neural networks, Seq2seq modeling
  • Attention-based models, Transformers (a minimal sketch follows this list)
  • Pretraining and fine-tuning, BERT, GPT
  • Large language models, Prompt engineering, Chain-of-thought
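
The attention topic above is compact enough to sketch directly. Below is a minimal NumPy version of scaled dot-product attention, the core operation of the Transformer. NumPy is used only to keep the example self-contained (the syllabus does not name a framework), and the shapes are arbitrary:

    import numpy as np

    def softmax(x, axis=-1):
        # Numerically stable softmax along the given axis.
        x = x - x.max(axis=axis, keepdims=True)
        e = np.exp(x)
        return e / e.sum(axis=axis, keepdims=True)

    def attention(Q, K, V):
        # Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V.
        d_k = Q.shape[-1]
        weights = softmax(Q @ K.T / np.sqrt(d_k), axis=-1)
        return weights @ V, weights

    rng = np.random.default_rng(0)
    Q = rng.normal(size=(3, 8))   # 3 query positions, d_k = 8
    K = rng.normal(size=(4, 8))   # 4 key/value positions
    V = rng.normal(size=(4, 8))
    out, w = attention(Q, K, V)
    print(out.shape)              # (3, 8): one output vector per query
    print(w.sum(axis=1))          # each row of attention weights sums to 1.0
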
Assessment Elements

  • non-blocking Lab 1
  • non-blocking Lab 2
  • non-blocking Lab 3
  • non-blocking Lab 4
  • non-blocking Project
  • non-blocking Quizzes
    Test tasks that are given at every lecture except the first.
Interim Assessment

  • 2023/2024 3rd module
    Score = 0.15*Lab 1 + 0.15*Lab 2 + 0.2*Lab 3 + 0.3*Lab 4 + 0.1*Project + 0.1*Quizzes
    Final score = 0.8*Score + 0.2*Bonus
    Bonus: a bonus point is awarded when a student voluntarily goes beyond the scope of the discipline, for example by applying a method from the individual project to other data, or by adapting the basic model from Lab 4 with ideas from recent research on neural networks. Work submitted for a bonus point must be agreed with the teacher in advance. A student receives the maximum score of 10 after successfully completing at least two works that merit a bonus.
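
As a worked example of the grading formula (Python here is an assumption; the syllabus does not prescribe a tool, and all scores below are hypothetical):

    # Hypothetical 10-point scores, chosen only to illustrate the arithmetic.
    labs = {"Lab 1": 8, "Lab 2": 7, "Lab 3": 9, "Lab 4": 6}
    project, quizzes, bonus = 10, 8, 5

    score = (0.15 * labs["Lab 1"] + 0.15 * labs["Lab 2"]
             + 0.2 * labs["Lab 3"] + 0.3 * labs["Lab 4"]
             + 0.1 * project + 0.1 * quizzes)
    final = 0.8 * score + 0.2 * bonus
    print(f"Score = {score:.2f}, Final = {final:.2f}")  # Score = 7.65, Final = 7.12
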
Bibliography

Recommended Core Bibliography

  • Kelleher, J. D. (2019). Deep Learning. Cambridge: The MIT Press. Retrieved from http://search.ebscohost.com/login.aspx?direct=true&site=eds-live&db=edsebk&AN=2234376

Recommended Additional Bibliography

  • Goodfellow, I., Bengio, Y., & Courville, A. (2016). Deep Learning. The MIT Press. URL: http://www.deeplearningbook.org
  • Nikolenko S., Kadurin A., Arkhangelskaya E. Глубокое обучение (Deep Learning). St. Petersburg: Piter, 2020. («Библиотека программиста» series). ISBN 978-5-4461-1537-2. URL: https://ibooks.ru/bookshelf/377026/reading