Spring 2019   CSCE 636-601   Neural Networks

Location and Hours:

Tuesdays and Thursdays, 11:10am-12:25pm @ Room 124 Bright Building

Instructor:

Prof. Anxiao (Andrew) Jiang, 309B Bright Building. Email: ajiang@cse.tamu.edu

Office hours: 12:30pm-1:30pm on Tuesdays and Thursdays.

TA and Grader:

TA: Xiaojing Yu. Email: vicky_yu@tamu.edu

Office hours: 3:00pm-4:30pm on Wednesdays and Fridays in 514E HRBB.

Grader: Kexin Cui. Email: ckx9411sx@email.tamu.edu

Course Materials:

Textbook 1 (Required): Deep Learning with Python, by Francois Chollet, Manning Publications, December 2017.

Textbook 2 (Recommended): Deep Learning, by Ian Goodfellow, Yoshua Bengio and Aaron Courville, MIT Press, November 2016.

Textbook 3 (Recommended): Deep Learning Quick Reference, by Michael Bernico, Packt Publishing, March 2018.

Textbook 4 (Recommended): Neural Networks and Deep Learning, by Charu C. Aggarwal, Springer, September 2018.

Textbook 5 (Recommended): Learning from Data, by Yaser S. Abu-Mostafa, Malik Magdon-Ismail and Hsuan-Tien Lin, AMLBook, August 2017.

Grading and Requirements:

Homework: 30%

Project: 70%

Grading: 90 to 100 for A; 80 to 89 for B; 70 to 79 for C; 60 to 69 for D; 0 to 59 for F.

Submission Policy: An electronic copy of each homework/project submission should be turned in through eCampus. For homework assignments, 25% is deducted for each late day (including weekend days), for up to three days; after that, submissions are not accepted. Late project reports will not be accepted.

Homework:

1. Homework assignment one. Due 11:10am on Thursday 1/31/2019 in eCampus.

    (1) Train a neural network for the MNIST Handwritten Digit Recognition task. (You can use the same code we introduced in class, or you can use your own code. You can use Keras or any other platform/library, such as TensorFlow, PyTorch, etc.)

    Turn in: (1) annotated code; (2) training and test performance of your neural network (including loss and accuracy).
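    For reference, a minimal Keras sketch for part (1), along the lines of the Chapter 2 example in Textbook 1 (the layer sizes and epoch count are illustrative choices, not requirements):

        from keras.datasets import mnist
        from keras.models import Sequential
        from keras.layers import Dense
        from keras.utils import to_categorical

        # Load MNIST and flatten each 28x28 image into a 784-dimensional vector.
        (train_x, train_y), (test_x, test_y) = mnist.load_data()
        train_x = train_x.reshape((60000, 28 * 28)).astype('float32') / 255
        test_x = test_x.reshape((10000, 28 * 28)).astype('float32') / 255
        train_y = to_categorical(train_y)
        test_y = to_categorical(test_y)

        # A small fully-connected network: one hidden layer, softmax output.
        model = Sequential()
        model.add(Dense(512, activation='relu', input_shape=(28 * 28,)))
        model.add(Dense(10, activation='softmax'))
        model.compile(optimizer='rmsprop', loss='categorical_crossentropy',
                      metrics=['accuracy'])

        # Train, then report the test loss and accuracy you will turn in.
        model.fit(train_x, train_y, epochs=5, batch_size=128)
        test_loss, test_acc = model.evaluate(test_x, test_y)
        print('test loss:', test_loss, 'test accuracy:', test_acc)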

    (2) Check out papers at top conferences in deep learning to find applications/topics in deep learning. (Here are some top conferences: CVPR (IEEE Conference on Computer Vision and Pattern Recognition), NIPS (Neural Information Processing Systems), ECCV (European Conference on Computer Vision), ICML (International Conference on Machine Learning), EMNLP (Conference on Empirical Methods in Natural Language Processing), ICLR (International Conference on Learning Representations), AAAI (AAAI Conference on Artificial Intelligence), ICCV (IEEE International Conference on Computer Vision), SIGKDD (ACM SIGKDD International Conference on Knowledge Discovery and Data Mining), RepL4NLP (Workshop on Representation Learning for NLP), ACL (Annual Meeting of the Association for Computational Linguistics), NAACL (Annual Conference of the North American Chapter of the Association for Computational Linguistics), etc. There are also more conferences in specific domains, such as healthcare.)

    Turn in: at least 10 different applications or topics in deep learning. For each application/topic, briefly explain it with a couple of sentences, and cite at least one paper that studies this application/topic.

2. Homework assignment two. Due 11:10am on Thursday 2/7/2019 in eCampus.

    (1) Train neural networks for the three applications we recently covered in class: Movie Review classification using IMDB Dataset, Topic classification using Reuters Dataset, Predicting House Prices using Boston Housing Price Dataset (which is a regression problem).

        You can use the same code we introduced in class (i.e., the code in the textbook), or you can use your own code. You can use Keras or any other platform/library, such as TensorFlow, PyTorch, etc.

        Turn in: (1) annotated code; (2) training and test performance of your neural networks.
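        As one illustration for the regression problem, a minimal Keras sketch for Boston Housing in the spirit of Section 3.6 of Textbook 1 (the layer sizes and epoch count are illustrative; the textbook also uses K-fold validation, omitted here for brevity):

            from keras.datasets import boston_housing
            from keras.models import Sequential
            from keras.layers import Dense

            # Load the data and normalize features with training-set statistics.
            (train_x, train_y), (test_x, test_y) = boston_housing.load_data()
            mean, std = train_x.mean(axis=0), train_x.std(axis=0)
            train_x = (train_x - mean) / std
            test_x = (test_x - mean) / std

            # Regression network: no activation on the final single-unit layer.
            model = Sequential()
            model.add(Dense(64, activation='relu', input_shape=(train_x.shape[1],)))
            model.add(Dense(64, activation='relu'))
            model.add(Dense(1))
            model.compile(optimizer='rmsprop', loss='mse', metrics=['mae'])

            model.fit(train_x, train_y, epochs=80, batch_size=16, verbose=0)
            test_mse, test_mae = model.evaluate(test_x, test_y)
            print('test MSE:', test_mse, 'test MAE:', test_mae)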

3. Homework assignment three. Due 11:10am on Tuesday 2/19/2019 in eCampus.

    Train neural networks for: (1) classifying images as dogs or cats (Section 5.2 of Textbook 1), and (2) the VGG16 model (Section 5.3 of Textbook 1).

    You can use the same code as in the textbook.

    Turn in: (1) annotated code; (2) training and test performance of your neural networks.
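    For part (2), a minimal sketch of loading the pretrained VGG16 convolutional base from keras.applications and putting a new dogs-vs-cats classifier on top, as in Section 5.3 of Textbook 1 (the 150x150 input size and 256-unit dense layer follow the textbook's example):

        from keras.applications import VGG16
        from keras.models import Sequential
        from keras.layers import Dense, Flatten

        # Pretrained VGG16 convolutional base (ImageNet weights, no top classifier).
        conv_base = VGG16(weights='imagenet', include_top=False,
                          input_shape=(150, 150, 3))
        conv_base.trainable = False  # freeze the base; only the new layers train

        # New binary classifier (dogs vs. cats) on top of the frozen base.
        model = Sequential()
        model.add(conv_base)
        model.add(Flatten())
        model.add(Dense(256, activation='relu'))
        model.add(Dense(1, activation='sigmoid'))
        model.compile(optimizer='rmsprop', loss='binary_crossentropy',
                      metrics=['accuracy'])
        model.summary()
        # Train with the dogs-vs-cats data generators from Section 5.2,
        # e.g. via model.fit_generator(...).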

    Please note that Project Assignment One has been posted, and is due at 11:10am on Monday 2/18/2019.

4. Project assignment two. (See Project page for details.) Due 11:10am on Monday 2/25/2019.

5. Homework assignment four. Due: 11:10am on Tuesday 3/5/2019 in eCampus.

    (1) Train the neural network for the "IMDB movie-review classification" problem using an Embedding layer (for word embedding), as given in Listings 6.6 and 6.7 (Section 6.1.2) of the Textbook "Deep Learning with Python".

          Turn in: (1) annotated code; (2) training and test performance of your neural network.
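    A minimal sketch in the spirit of Listings 6.6 and 6.7 (the vocabulary size of 10000, sequence length of 20, and 8-dimensional embeddings follow the textbook):

        from keras.datasets import imdb
        from keras.preprocessing import sequence
        from keras.models import Sequential
        from keras.layers import Embedding, Flatten, Dense

        max_features = 10000  # vocabulary size
        maxlen = 20           # cut each review after this many words

        (x_train, y_train), (x_test, y_test) = imdb.load_data(num_words=max_features)
        x_train = sequence.pad_sequences(x_train, maxlen=maxlen)
        x_test = sequence.pad_sequences(x_test, maxlen=maxlen)

        # The Embedding layer maps each word index to an 8-dimensional vector.
        model = Sequential()
        model.add(Embedding(max_features, 8, input_length=maxlen))
        model.add(Flatten())
        model.add(Dense(1, activation='sigmoid'))
        model.compile(optimizer='rmsprop', loss='binary_crossentropy', metrics=['acc'])
        model.fit(x_train, y_train, epochs=10, batch_size=32, validation_split=0.2)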

    (2) There are two well-known pre-trained word-embedding tools. One is Word2Vec (developed by Google), and the other is GloVe (developed by Stanford University). Given a word, such a tool can return its embedding vector; and given an embedding vector, such a tool can return the nearest word. Both embedding tools were trained on very large text corpora, and their embeddings reveal many useful structures, which makes them useful for various NLP (natural language processing) applications. Please learn about the two tools (from online resources, etc.).

          Turn in (for both Word2Vec and GloVe): (1) annotated code that, given a word, can return its embedding vector; (2) annotated code that, given an embedding vector, can return the nearest word.
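    One possible approach (an assumption, not a requirement: this uses the gensim library, whose downloader module hosts both Google's Word2Vec vectors and Stanford's GloVe vectors converted to the same format):

        import gensim.downloader as api

        # 'glove-wiki-gigaword-100' loads GloVe vectors; replace it with
        # 'word2vec-google-news-300' (a ~1.6 GB download) for Word2Vec.
        vectors = api.load('glove-wiki-gigaword-100')

        # Word -> embedding vector.
        vec = vectors['king']
        print(vec.shape)  # e.g. (100,) for 100-dimensional GloVe vectors

        # Embedding vector -> nearest word (by cosine similarity).
        print(vectors.similar_by_vector(vec, topn=1))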

    (3) In Listings 6.22 and 6.23 (Section 6.2.1) of the Textbook "Deep Learning with Python", a neural network of 3 layers -- an Embedding layer, a SimpleRNN layer, and a fully-connected layer -- is constructed for the IMDB movie-review-classification problem. Each layer outputs a tensor of a certain shape. What are their shapes during training?

          Turn in: the shape of the tensor that is output by each layer during the training process for the neural network, and briefly explain why.
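    A quick way to check your reasoning (this does not replace the explanation): build the model and call model.summary(), which prints each layer's output shape with the batch dimension shown as None. A sketch, with layer sizes matching the textbook's listing:

        from keras.models import Sequential
        from keras.layers import Embedding, SimpleRNN, Dense

        model = Sequential()
        model.add(Embedding(10000, 32))
        model.add(SimpleRNN(32))
        model.add(Dense(1, activation='sigmoid'))

        # Prints the output shape of each layer.
        model.summary()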

    (4) In Listing 6.27 (Section 6.2.3) of the Textbook "Deep Learning with Python", a neural network of 3 layers -- an Embedding layer, an LSTM layer, and a fully-connected layer -- is constructed for the IMDB movie-review-classification problem. Train the neural network yourself.

          Turn in: (1) annotated code; (2) training and test performance of your neural network.
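    A minimal sketch along the lines of Listing 6.27 (the 32-unit LSTM matches the textbook; the sequence length, epochs, and batch size are illustrative):

        from keras.datasets import imdb
        from keras.preprocessing import sequence
        from keras.models import Sequential
        from keras.layers import Embedding, LSTM, Dense

        max_features = 10000
        maxlen = 500

        (x_train, y_train), (x_test, y_test) = imdb.load_data(num_words=max_features)
        x_train = sequence.pad_sequences(x_train, maxlen=maxlen)
        x_test = sequence.pad_sequences(x_test, maxlen=maxlen)

        # Embedding -> LSTM -> sigmoid classifier.
        model = Sequential()
        model.add(Embedding(max_features, 32))
        model.add(LSTM(32))
        model.add(Dense(1, activation='sigmoid'))
        model.compile(optimizer='rmsprop', loss='binary_crossentropy', metrics=['acc'])
        model.fit(x_train, y_train, epochs=10, batch_size=128, validation_split=0.2)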

    (5) An RNN can overfit, and we can use dropout to fight overfitting. What is the proper way to use dropout with a recurrent network? (Hint: see Section 6.3.6 of Textbook "Deep Learning with Python".)

          Turn in: a brief description of the proper way.

    (6) Data sequences can be processed not only by RNNs, but also by 1-dimensional CNNs. In Listing 6.46 (Section 6.4.3) of the Textbook "Deep Learning with Python", a 1D CNN is constructed for the IMDB movie-review-classification problem. Train the neural network yourself.

          Turn in: (1) annotated code; (2) training and test performance of your neural network.
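    A minimal sketch along the lines of Listing 6.46 (two Conv1D layers with window size 7, separated by max pooling; hyperparameters follow the textbook's example, with a sigmoid output added for clarity):

        from keras.datasets import imdb
        from keras.preprocessing import sequence
        from keras.models import Sequential
        from keras.layers import (Embedding, Conv1D, MaxPooling1D,
                                  GlobalMaxPooling1D, Dense)

        max_features = 10000
        maxlen = 500

        (x_train, y_train), (x_test, y_test) = imdb.load_data(num_words=max_features)
        x_train = sequence.pad_sequences(x_train, maxlen=maxlen)
        x_test = sequence.pad_sequences(x_test, maxlen=maxlen)

        # 1D convolutions slide over the time (word) axis; pooling shrinks it.
        model = Sequential()
        model.add(Embedding(max_features, 128, input_length=maxlen))
        model.add(Conv1D(32, 7, activation='relu'))
        model.add(MaxPooling1D(5))
        model.add(Conv1D(32, 7, activation='relu'))
        model.add(GlobalMaxPooling1D())
        model.add(Dense(1, activation='sigmoid'))
        model.compile(optimizer='rmsprop', loss='binary_crossentropy', metrics=['acc'])
        model.fit(x_train, y_train, epochs=10, batch_size=128, validation_split=0.2)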

6. Project assignment three. (See Project page for details.) Due 11:10am on Thursday 3/7/2019.

7. Project assignment four. (See Project page for details.) Due 11:59pm on Monday 4/1/2019.

8. Project assignment five. (See Project page for details.) Due 11:59pm on Thursday 4/18/2019.


Project:

Project details.


Resources for computing using GPU:

1. HPRC: You can apply for an account at TAMU HPRC (High Performance Research Computing), https://hprc.tamu.edu.

2. Google Colab: a free Jupyter notebook environment from Google that runs in the cloud and allows you to use GPU resources. It requires no setup, and it is a good resource for anyone who wants to run quick deep-learning experiments. Here is its link: https://colab.research.google.com/notebooks/welcome.ipynb#recent=true

Training a deep neural network often requires a GPU; otherwise, training can be very slow. Try to get access to a computer with a GPU, or use the online resources above. There are also cloud-computing services (such as Amazon's) that provide GPU computing for a fee, which is fine for small experiments but can become costly for large ones.
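Whichever option you choose, it is worth verifying that TensorFlow (the backend Keras uses by default) can actually see the GPU. A minimal check, assuming TensorFlow is installed (on Google Colab, first enable a GPU under Runtime > Change runtime type):

    from tensorflow.python.client import device_lib

    # List every device TensorFlow can use; a working GPU setup
    # shows at least one device whose device_type is 'GPU'.
    for d in device_lib.list_local_devices():
        print(d.device_type, d.name)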


Syllabus:

Date | Lecture | Reading
1/15/2019 Tuesday | Introduction to Deep Learning [Slides 1] | Chapter 1 of Textbook 1
1/17/2019 Thursday | Mathematical Building Blocks of Neural Networks [Slides 2] | Chapter 2 of Textbook 1
1/22/2019 Tuesday | Mathematical Building Blocks of Neural Networks [Slides 2] | Chapters 2, 3 of Textbook 1
1/24/2019 Thursday | Gradient Descent and Backpropagation Algorithm [Slides 3] | Section 6.5 of Textbook 2
1/29/2019 Tuesday | Getting Started with Neural Networks [Slides 4] | Chapter 3 of Textbook 1
1/31/2019 Thursday | Getting Started with Neural Networks [Slides 5] | Chapter 3 of Textbook 1
2/5/2019 Tuesday | Getting Started with Neural Networks [Slides 5] | Chapter 3 of Textbook 1
2/7/2019 Thursday | Fundamentals of Machine Learning [Slides 6]; Deep Learning for Computer Vision [Stanford Lecture] | Chapters 4 and 5 of Textbook 1
2/12/2019 Tuesday | Self-study of Chapter 5, "Deep Learning for Computer Vision" | Chapter 5 of Textbook 1
2/14/2019 Thursday | Self-study of Chapter 5, "Deep Learning for Computer Vision" | Chapter 5 of Textbook 1
2/19/2019 Tuesday | Deep Learning for Computer Vision [Slides 7] | Chapter 5 of Textbook 1
2/21/2019 Thursday | Deep Learning for Text and Sequences [Slides 8] | Chapter 6 of Textbook 1
2/26/2019 Tuesday | Deep Learning for Text and Sequences [Slides 8] | Chapter 6 of Textbook 1; additional reading on RNNs and vanishing/exploding gradients
2/28/2019 Thursday | Deep Learning for Text and Sequences [Slides 9] | Chapter 6 of Textbook 1; additional reading on RNNs and vanishing/exploding gradients
3/5/2019 Tuesday | Advanced Deep-Learning Best Practices [Slides 10] | Chapter 7 of Textbook 1
3/7/2019 Thursday | Deep Reinforcement Learning [Slides 11] |
3/12/2019 Tuesday | No class (spring break) |
3/14/2019 Thursday | No class (spring break) |
3/19/2019 Tuesday | Deep Reinforcement Learning [Slides 12] |
3/21/2019 Thursday | Deep Reinforcement Learning [Slides 13] |
3/26/2019 Tuesday | Deep Reinforcement Learning [Slides 14] |
3/28/2019 Thursday | Deep Reinforcement Learning [Slides 14] |
4/2/2019 Tuesday | Deep Reinforcement Learning [Slides 15] |
4/4/2019 Thursday | Auto-Encoder [Slides 16] | Chapter 8 of Textbook 1
4/9/2019 Tuesday | VAE and GAN [Slides 17] | Chapter 8 of Textbook 1
4/11/2019 Thursday | Transfer Learning [Slides 18] |
4/16/2019 Tuesday | Transfer Learning [Slides 18] and Ensemble Learning [Slides 19] |
4/18/2019 Thursday | Project due |
4/23/2019 Tuesday | Ensemble Learning [Slides 19] |
4/25/2019 Thursday | Summary [Slides 20] |
4/30/2019 Tuesday | No class (redefined day: Friday classes meet) |



Statement: The Americans with Disabilities Act (ADA) is a federal anti-discrimination statute that provides comprehensive civil rights protection for persons with disabilities. Among other things, this legislation requires that all students with disabilities be guaranteed a learning environment that provides for reasonable accommodation of their disabilities. If you believe you have a disability requiring an accommodation, please contact Disability Services, currently located in the Disability Services building at the Student Services at White Creek complex on west campus or call 979-845-1637. For additional information, visit http://disability.tamu.edu.

“An Aggie does not lie, cheat or steal, or tolerate those who do.” See http://aggiehonor.tamu.edu