
Introduction to neural network and machine translation

2019/2020 academic year
Taught in English (ENG)
6 credits

Instructor

Дурандин Олег Владимирович

Course Syllabus

Abstract

The course introduces students to the basic concepts of data analysis and machine learning and to the application of data mining and machine learning methods to practical problems in the professional field.
Learning Objectives

  • The purpose of the course is to develop research skills, including problem analysis, setting goals and objectives, understanding the object and subject of study, choosing appropriate research methods, and evaluating the quality of the research.
Expected Learning Outcomes

  • Understands the basic concepts of machine learning
  • Understands the main regularization techniques
  • Understands the principles of building neural networks
  • Understands the principles of training neural networks
  • Understands the features of convolutional networks
  • Is able to work with Python libraries
Course Contents

  • The simplest methods of machine learning
    Applications of machine learning. The classification problem. The k-NN method. Selection of model parameters by cross-validation. The linear classifier. (See the k-NN sketch after this list.)
  • The loss function. Regularization. Optimization.
    The logistic loss function. Quadratic, exponential and piecewise-linear loss functions. (See the loss-function sketch after this list.)
  • The backpropagation method. Neural networks.
    Stochastic gradient descent. The simplest perceptron. Activation functions.
  • Training of neural networks. Normalization methods.
  • Convolutional neural network architectures.
    The concept of convolution. Pooling. Fully connected layer.
  • Libraries for training and running neural networks.
    The Keras and TensorFlow libraries. (A minimal Keras example tying the building blocks above together follows this list.)
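
To make the first topic concrete, here is a minimal sketch of k-NN classification with the number of neighbours selected by cross-validation. The use of scikit-learn and the Iris toy dataset are illustrative assumptions, not requirements of the syllabus.

```python
# Illustrative sketch: scikit-learn and the Iris dataset are not prescribed by the syllabus.
# k-NN classification with the number of neighbours chosen by 5-fold cross-validation.
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)                 # small toy dataset

search = GridSearchCV(
    KNeighborsClassifier(),
    param_grid={"n_neighbors": [1, 3, 5, 7, 9]},  # candidate model parameters
    cv=5,                                         # 5-fold cross-validation
)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```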
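
The loss functions named in the second topic can be written down directly. The sketch below uses the common convention of labels in {-1, +1} and a margin m = y·f(x); these are standard textbook forms, not formulas taken from the course materials.

```python
# Standard forms of the loss functions listed above, assuming labels y in {-1, +1}
# and the margin m = y * f(x). Illustrative only.
import numpy as np

def logistic_loss(m):
    return np.log1p(np.exp(-m))        # log(1 + exp(-m))

def quadratic_loss(m):
    return (1.0 - m) ** 2              # squared deviation from the target margin 1

def exponential_loss(m):
    return np.exp(-m)

def hinge_loss(m):                     # the piecewise-linear (hinge) loss
    return np.maximum(0.0, 1.0 - m)

margins = np.linspace(-2.0, 2.0, 5)
for f in (logistic_loss, quadratic_loss, exponential_loss, hinge_loss):
    print(f.__name__, np.round(f(margins), 3))
```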
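
Finally, a minimal Keras/TensorFlow model ties several of the listed blocks together: convolution, pooling, a fully connected layer, activation functions, and training by backpropagation with stochastic gradient descent. The MNIST dataset, the layer sizes, and the single training epoch are illustrative choices, not course requirements.

```python
# Minimal Keras/TensorFlow sketch: convolution -> pooling -> fully connected layer,
# trained by backpropagation with SGD. Dataset and layer sizes are assumptions.
import tensorflow as tf

(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None] / 255.0                      # add channel axis, scale to [0, 1]

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D(),                       # pooling layer
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),      # fully connected output layer
])
model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.01),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=1, batch_size=128)     # stochastic gradient descent
```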

Assessment Elements

  • Control work (non-blocking)
  • Laboratory work 1 (non-blocking)
  • Laboratory work 2 (non-blocking)
  • Exam (non-blocking)
    The final assessment in the 2019/2020 academic year took place in Module 3.
Interim Assessment

  • Interim assessment (Module 3)
    0.3 * Control work + 0.4 * Exam + 0.15 * Laboratory work 1 + 0.15 * Laboratory work 2
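
As a worked example of this weighting, the sketch below computes the final grade from hypothetical component scores; the 10-point scale and the scores themselves are assumptions, not taken from the syllabus.

```python
# Weighted interim grade as specified above:
# 0.3 * Control work + 0.4 * Exam + 0.15 * Laboratory work 1 + 0.15 * Laboratory work 2.
# The component scores below are hypothetical and only illustrate the arithmetic.
weights = {"control_work": 0.30, "exam": 0.40, "lab_1": 0.15, "lab_2": 0.15}
scores  = {"control_work": 8,    "exam": 7,    "lab_1": 9,    "lab_2": 6}   # assumed 10-point scale

final_grade = sum(weights[k] * scores[k] for k in weights)   # 0.3*8 + 0.4*7 + 0.15*9 + 0.15*6
print(round(final_grade, 2))   # 7.45
print(round(final_grade))      # rounds to 7
```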

Bibliography

Recommended Core Bibliography

  • Kelleher, J. D. (2019). Deep Learning. Cambridge: The MIT Press. Retrieved from http://search.ebscohost.com/login.aspx?direct=true&site=eds-live&db=edsebk&AN=2234376

Recommended Additional Bibliography

  • Gulli, A., & Pal, S. (2018). Библиотека Keras – инструмент глубокого обучения. Реализация нейронных сетей с помощью библиотек Theano и TensorFlow. Moscow: DMK Press. 294 p. ISBN 978-5-97060-573-8. Electronic text, EBS Lan. Retrieved from https://e.lanbook.com/book/111438