Build Your Own Neural Networks
Course Description
In this course, we will build a toy deep learning framework from scratch (tensorslow😅); through this hands-on experience, students will come to understand the essence of deep learning. This course is a project seminar for senior undergraduate students and emphasizes learning by doing.
Instructor: Meng Li
Time: Fridays, 10:00 - 15:00 (Oct 18, 25; Nov 08, 15; Dec 13)
Room: 2.14.0.35
Course Management System: Moodle
Syllabus
- Day 1 (2024-10-18):
- 10:15 - 12:00. From Perceptron to Feed-Forward Neural Networks
- 13:00 - 15:00. Backpropagation and Learning Tricks
- Day 2 (2024-10-25):
- 10:15 - 12:00. Automatic Differentiation
- 13:00 - 15:00. Expand and Optimize Code
- Day 3 (2024-11-08):
- 10:15 - 12:00. Calculate Higher-order Derivatives
- 13:00 - 15:00. Create Neural Networks
- Day 4 (2024-11-15):
- 10:15 - 12:00. Recurrent Neural Networks (RNNs)
- 13:00 - 15:00. Project Discussion
- Day 5 (2024-12-13):
- 10:15 - 12:00. Project Presentation
- 13:00 - 15:00. Project Presentation
Requirement
Prerequisites. You are expected to be proficient in Python programming.
Format. In the first three blocks, I will introduce the basics of neural networks and show how to build a toy deep learning framework step by step. In the fourth block, I will introduce the theory of RNNs, and you are expected to build RNNs from scratch individually (in Python, using the tensorslow framework developed in this course). In the final block, each student will present their project and explain their code. I will ask random questions to check whether you really understand your code.
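To give a flavor of what "building a framework from scratch" involves, here is a minimal sketch of a reverse-mode automatic differentiation core, the kind of component such a toy framework typically starts from. The names (`Value`, `backward`) and design are illustrative assumptions, not tensorslow's actual API.

```python
# Minimal sketch of a reverse-mode autodiff scalar. Illustrative only;
# the class and method names are NOT tensorslow's actual API.
class Value:
    def __init__(self, data, parents=()):
        self.data = data          # scalar value
        self.grad = 0.0           # accumulated gradient d(output)/d(self)
        self._parents = parents   # nodes this value was computed from
        self._grad_fn = None      # propagates self.grad back to the parents

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def grad_fn():
            self.grad += out.grad   # d(a+b)/da = 1
            other.grad += out.grad  # d(a+b)/db = 1
        out._grad_fn = grad_fn
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def grad_fn():
            self.grad += other.data * out.grad  # d(a*b)/da = b
            other.grad += self.data * out.grad  # d(a*b)/db = a
        out._grad_fn = grad_fn
        return out

    def backward(self):
        # Topologically order the graph, then apply the chain rule in reverse.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            if v._grad_fn:
                v._grad_fn()

# Example: f(x, y) = x*y + x, so df/dx = y + 1 and df/dy = x
x, y = Value(2.0), Value(3.0)
f = x * y + x
f.backward()
print(x.grad, y.grad)  # 4.0 2.0
```

The course extends this idea from scalars to tensors, higher-order derivatives, and full networks.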
Grading
This seminar belongs to “Methods of computational linguistics and subject hours/Methoden der Computerlinguistik und Versuchspersonenstunden (LIN-BS-070)”.
- To complete this seminar, you are expected to complete a software project implementing RNNs and give a presentation explaining your code at the end of the course.
- (Optional) To pass the LIN-BS-070 module examination, you need to complete a term paper related to one of the two seminars. If you would like to write a term paper, please talk to me.
Resource
- Alice’s Adventures in a Differentiable Wonderland: an intuitive, well-motivated introduction to neural networks and their mathematical foundations.
- Mathematical Engineering of Deep Learning: a more rigorous mathematical treatment of deep learning.
Contact
Feel free to contact Meng at meng.li (at) uni-potsdam.de for any questions.