A short course on neural networks (Tsinghua)

This short course introduces the use of artificial neural networks in machine learning. The course is aimed at engineers and natural-science students. The focus is on supervised learning with multi-layer perceptron networks, because this method has recently become very popular in science and technology. I describe the network layout, explain how to train such networks with stochastic gradient descent, and survey recent developments in the field (deep learning). I conclude with a discussion of current questions and applications. This course is based on Chapters 5 to 7 of Artificial Neural Networks (link below). I also offer short projects and homework problems that illustrate the learning goals. These will be made available in the online system OpenTA.
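To illustrate the two central ideas of the course, the sketch below trains a single perceptron (a sigmoid neuron) with stochastic gradient descent on the Boolean AND problem. This is not course material: the course recommends MATLAB, and all names, parameters, and the choice of a quadratic loss here are illustrative assumptions.

```python
# Minimal sketch (illustrative, not course material): a single sigmoid
# perceptron trained with stochastic gradient descent on Boolean AND.
import numpy as np

rng = np.random.default_rng(0)

# Inputs and targets for Boolean AND, targets in {0, 1}
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([0, 0, 0, 1], dtype=float)

w = rng.normal(scale=0.5, size=2)  # weights (random initialisation)
b = 0.0                            # bias (threshold)
eta = 0.5                          # learning rate (illustrative choice)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

for epoch in range(2000):
    # "Stochastic": update after each pattern, in random order
    for i in rng.permutation(len(X)):
        y = sigmoid(X[i] @ w + b)            # network output
        delta = (y - t[i]) * y * (1.0 - y)   # output error, quadratic loss
        w -= eta * delta * X[i]              # gradient-descent step
        b -= eta * delta

predictions = (sigmoid(X @ w + b) > 0.5).astype(int)
print(predictions)  # the learned network reproduces the AND truth table
```

Because AND is linearly separable, a single perceptron suffices; the deep-learning part of the course concerns networks with hidden layers, which can also represent non-separable functions such as XOR.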


Registration

Note. Preliminary registration through the OpenTA site is now closed. If you have not registered yet but would still like to participate, please send an email to Bernhard.Mehlig.at.physics.gu.se with the subject line "Registration ANN".


Prerequisites

Basic linear algebra, analysis, and programming. Test your skills with the quiz on the OpenTA site, the online system that we will use in this course. The system was developed by Stellan Östlund and Hampus Linander.

Please register on the OpenTA site before the first lecture.

Contents and schedule

Eight lectures:

Fri  Jan 18    9:30-10:15 and 10:25-11:10  (Perceptrons)
              14:30-15:15 and 15:25-16:10  (Stochastic gradient descent)
Sat  Jan 19    9:30-10:15 and 10:25-11:10  (Deep learning)
              14:30-15:15 and 15:25-16:10  (Outlook and applications)

Literature

B. Mehlig, Artificial Neural Networks, version of 28 December 2018.

Further literature

J. Hertz, A. Krogh & R. Palmer, Introduction to the Theory of Neural Computation, Addison-Wesley.
S. Haykin, Neural Networks: A Comprehensive Foundation, Prentice Hall, New Jersey.
I. Goodfellow, Y. Bengio & A. Courville, Deep Learning, MIT Press.
Y. LeCun, Y. Bengio & G. Hinton, Deep learning, Nature 521 (2015) 436–444.
M. Nielsen, Neural Networks and Deep Learning.

Homework

The same rules as for written exams apply: copying material from any source without giving a reference is not allowed. All students must write their own computer programs and submit their own solutions. All submissions, including program code, are handled via OpenTA. MATLAB 2017b is recommended.

Deadlines are sharp. Late submissions are not accepted.