
Introduction to neural network and machine translation

2020/2021
Academic year
ENG
The course is taught in English
8
Credits

Instructor

Дурандин Олег Владимирович

Course Syllabus

Abstract

The course introduces students to the basic concepts of data analysis and machine learning and to the application of data mining and machine learning to practical problems in the professional field.
Learning Objectives

  • The purpose of the course is to develop research skills, including problem analysis, setting goals and objectives, understanding the object and subject of study, choosing research methods, and evaluating the quality of the results.
Expected Learning Outcomes

  • Has an idea of the basic concepts
  • Has an idea of regularization techniques
  • Has an idea of the principles of building neural networks
  • Has an idea of the principles of neural network training
  • Has an idea of the features of convolutional networks
  • Is able to work with Python libraries
Course Contents

  • The simplest methods of machine learning
    Applications of machine learning. The classification problem. The k-NN method. Selection of model parameters by cross-validation. Linear classifiers.
  • The loss function. Regularization. Optimization.
    Logistic loss function. Quadratic, exponential and piecewise-linear loss function.
  • Method of back propagation. Neural networks.
    Stochastic gradient descent. The simplest perceptron. Activation function.
  • Training of neural networks. Normalization methods.
    Training of neural networks. Normalization methods.
  • Convolutional neural network architectures.
    The concept of convolution. Pooling. Fully connected layer.
  • Libraries for training and running neural networks.
    The "Keras" and "TensorFlow" libraries.
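The first topic introduces the k-NN method. As a minimal illustration of the idea (a plain-Python sketch with toy data, not course material; the function name and dataset are invented for this example):

```python
from collections import Counter

def knn_predict(train, labels, x, k=3):
    """Classify x by majority vote among the k nearest training points,
    using squared Euclidean distance over feature tuples."""
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(p, x)), y)
        for p, y in zip(train, labels)
    )
    votes = Counter(y for _, y in dists[:k])
    return votes.most_common(1)[0][0]

# Toy data: two well-separated clusters on a line.
train = [(0.0,), (0.1,), (0.2,), (1.0,), (1.1,), (1.2,)]
labels = ["a", "a", "a", "b", "b", "b"]
print(knn_predict(train, labels, (0.15,)))  # "a"
print(knn_predict(train, labels, (1.05,)))  # "b"
```

In the course itself, the hyperparameter k would be chosen by cross-validation rather than fixed in advance.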
Assessment Elements

  • non-blocking Control work
  • non-blocking Laboratory work 1
  • non-blocking Laboratory work 2
  • non-blocking Exam
    The final assessment in the 2019/2020 academic year took place in module 3.
Interim Assessment

  • Interim assessment (3 module)
    0.3 * Control work + 0.4 * Exam + 0.15 * Laboratory work 1 + 0.15 * Laboratory work 2
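The weighted formula above can be checked with a short sketch (the function name and sample scores are illustrative, not part of the syllabus):

```python
def interim_grade(control, exam, lab1, lab2):
    """Weighted sum of assessment elements per the syllabus formula;
    the weights 0.3 + 0.4 + 0.15 + 0.15 sum to 1.0."""
    return round(0.3 * control + 0.4 * exam + 0.15 * lab1 + 0.15 * lab2, 2)

print(interim_grade(8, 7, 9, 10))  # 8.05
```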
Bibliography

Recommended Core Bibliography

  • Kelleher, J. D. (2019). Deep Learning. Cambridge: The MIT Press. Retrieved from http://search.ebscohost.com/login.aspx?direct=true&site=eds-live&db=edsebk&AN=2234376

Recommended Additional Bibliography

  • Antonio Gulli, Sujit Pal. The Keras Library: A Deep Learning Tool. Implementing Neural Networks with the Theano and TensorFlow Libraries. DMK Press, 2018. 294 p. ISBN 978-5-97060-573-8. Electronic text // EBS Lanbook. URL: https://e.lanbook.com/book/111438