This repository contains materials for employees at SimsReality Corporation to learn AI/ML technologies. Deep learning now underpins state-of-the-art results across many AI applications. In this course, you will learn the foundations of deep learning, how to interpret others' neural networks, and how to build your own. You will also study popular network architectures, including convolutional neural networks (CNNs), recurrent neural networks (RNNs), and transformers. In addition, you will learn about deep-learning-based natural language processing techniques, including word embeddings, BERT, and GPT-like large language models (LLMs).
- This course meets for in-class lectures.
- For all inquiries related to this course, please contact kyongha AT kisti DOT re DOT kr.
- Dr. Kyong-Ha Lee AT KISTI(Korea Institute of Science and Technology Information)
- Book: Neural Networks and Deep Learning: A Textbook by Charu C. Aggarwal (recommended, not mandatory)
- All slides for this class will be available here.
- Some papers from the reading list below will be reviewed, time permitting.
- All course announcements take place through this page. Please check this webpage frequently.
- This course has the following components:
- In-class lecture (60~70%)
- Code review and exercises (30~40%)
| Event | Date | In-class lecture | Materials and Exercises |
|---|---|---|---|
| Lecture 1 | Aug. 28 | Basic Concepts for understanding AI, ML | |
| Lecture 2 | Aug. 29 | An Introduction to Neural Networks I | |
| Lecture 3 | Sep. 11 | An Introduction to Neural Networks II | Predicting housing prices with regression techniques; diabetes classification with a neural network |
| Lecture 4 | Sep. 12 | Convolutional Neural Networks I | |
| Lecture 5 | Sep. 30 | Convolutional Neural Networks II | Character recognition on the MNIST dataset; detecting pneumonia in chest X-ray images |
| Lecture 6 | Oct. 02 | Recurrent Neural Networks | |
| Lecture 7 | Oct. 07 | RNNs and Attention Mechanisms | RNN examples |
| Lecture 8 | Oct. 10 | Natural Language Processing with DL I | |
| Lecture 9 | Oct. 14 | Natural Language Processing with DL II | |
| Lecture 10 | Oct. 16 | Advanced Topics and Applications | |
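To give a flavor of the hands-on exercises above (for example, the Lecture 3 diabetes classification task), here is a minimal sketch of a one-hidden-layer neural network trained with gradient descent in plain numpy. The synthetic dataset and all hyperparameters here are illustrative assumptions, not the course's actual exercise data or solution:

```python
import numpy as np

# Synthetic stand-in for a tabular binary-classification dataset
# (e.g. diabetes vs. no diabetes): 200 samples, 4 features.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = (X @ np.array([1.0, -2.0, 0.5, 1.5]) > 0).astype(float).reshape(-1, 1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Initialize a 4 -> 8 -> 1 network with small random weights.
W1 = rng.normal(scale=0.5, size=(4, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)

lr = 0.5
for epoch in range(500):
    # Forward pass: tanh hidden layer, sigmoid output.
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # Binary cross-entropy loss (small epsilon avoids log(0)).
    loss = -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))
    # Backward pass: gradients via the chain rule, averaged over the batch.
    dlogits = (p - y) / len(X)
    dW2 = h.T @ dlogits; db2 = dlogits.sum(axis=0)
    dh = (dlogits @ W2.T) * (1 - h ** 2)   # tanh'(z) = 1 - tanh(z)^2
    dW1 = X.T @ dh; db1 = dh.sum(axis=0)
    # Gradient-descent parameter update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

accuracy = ((p > 0.5) == y).mean()
print(f"final loss={loss:.3f}, train accuracy={accuracy:.2f}")
```

The in-class exercises use a deep learning framework rather than hand-written gradients, but this sketch shows the forward pass, loss, and backpropagation steps that those frameworks automate.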
- [Ioffe15a] Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift, In Proceedings of ICML 2015.
- (optional) [Ba2016a] Layer Normalization, arXiv:1607.06450v1, 2016.
- (optional) [Glorot10a] Understanding the Difficulty of Training Deep Feedforward Neural Networks, In Proceedings of AISTATS 2010.
- (optional) [He2015a] Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification, In Proceedings of ICCV 2015.
- [Hinton15a] Distilling the Knowledge in a Neural Network, arXiv:1503.02531v1, 2015.
- [Alex12a] ImageNet Classification with Deep Convolutional Neural Networks, Advances in Neural Information Processing Systems 25:1098-1105, 2012.
- [He16a] Deep Residual Learning for Image Recognition, In Proceedings of CVPR 2016.
- [He16b] Identity Mappings in Deep Residual Networks, In Proceedings of ECCV 2016.
- [Huang17a] Densely Connected Convolutional Networks, In Proceedings of CVPR 2017.
- [Hu2018] Squeeze-and-Excitation Networks, In Proceedings of CVPR 2018.
- [Kipf2017] Semi-Supervised Classification with Graph Convolutional Networks, In Proceedings of ICLR 2017.
- [Mikolov13a] Distributed Representations of Words and Phrases and their Compositionality, Advances in Neural Information Processing Systems 26 (2013): 3111-3119.
- [Rong2016a] word2vec Parameter Learning Explained, arXiv:1411.2738v4
- [Le14a] Distributed Representations of Sentences and Documents, In Proceedings of ICML 2014.
- [Bojanowski17a] Enriching Word Vectors with Subword Information, Transactions of the Association for Computational Linguistics 5 (2017): 135-146.
- [Grover16a] node2vec: Scalable Feature Learning for Networks, In Proceedings of KDD 2016.
- [Bahdanau16a] Neural Machine Translation by Jointly Learning to Align and Translate, In Proceedings of ICLR 2015.
- [Vaswani17a] Attention Is All You Need, In Proceedings of NeurIPS 2017.
- [Devlin19a] BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, In Proceedings of NAACL-HLT 2019.