Over the last few years, deep neural networks (DNNs) have fundamentally transformed how people think about machine learning and approach practical problems. Successes with DNNs range from traditional AI fields such as computer vision, natural language processing, and interactive games to healthcare and the physical sciences, touching nearly every theoretical and applied domain. On the other hand, DNNs still largely operate as black boxes, and we have only a limited understanding of when and why they work. This course introduces the basic ingredients of DNNs, samples important applications, and highlights open problems. Emphasis is placed on thinking from first principles and basic building blocks, as the field is still evolving rapidly and nothing in it is set in stone.

This new course is a polished version of the topics course 5980/8980: Think Deep Learning, which I developed and taught twice during 2020. Check out the linked course webpage to see what we cover and to get a flavor of the course. In general, we expect you to be able to easily translate mathematical ideas and descriptions into Python code—that’s why CSCI 5521 is a prerequisite. We’ll spend most of the lecture time introducing and explaining the key ideas.

Full syllabus: TBA

Instructor: Professor Ju Sun (Email: jusun AT umn.edu; Office Hours: TBA)

When/Where: Tue 6:30–9:00pm @ KHKH 3-210

TA’s: TBA

Registration Q&A

  • What’s the next iteration of this course?

Fall 2023 or Spring 2024. It is unlikely to be offered in Spring 2023 as currently slated.

  • What is the style of the course—more programming, foundations, or mathematics?

Check out the previous iteration, 5980/8980: Think Deep Learning, including its syllabus, slides, and homework sets. This course is a polished version of that one.

Lecture Schedule