Natural Language Processing (NLP) is one of the most important fields in Artificial Intelligence (AI). It has become crucial in the information age because much of the world's information exists as unstructured text. Because people communicate mostly through language, NLP technologies are applied everywhere: machine translation, web search, customer support, email, forums, advertising, and radiology reports, to name a few.
A number of core NLP tasks and machine learning models lie behind NLP applications. Deep learning, a subfield of machine learning, has recently shifted the paradigm from traditional task-specific feature engineering to end-to-end systems, achieving high performance across many NLP tasks and downstream applications. Tech companies such as Google, Baidu, Alibaba, Apple, Amazon, Facebook, Tencent, and Microsoft are actively applying deep learning methods to improve their products. For example, Google recently replaced its traditional statistical machine translation and speech recognition systems with systems based on deep learning.
In this course, students will learn state-of-the-art deep learning methods for NLP. Through lectures and practical assignments, students will learn the tricks necessary to make their models work on practical problems. They will learn to implement, and possibly to invent, their own deep learning models using available deep learning libraries such as PyTorch.
Our Approach
Thorough and Detailed: How to write, debug, and train deep neural models from scratch.
State of the art: Most lecture materials are drawn from research of the past 1–5 years.
Practical: Focus on practical techniques for training models and for working with GPUs.
Fun: Cover exciting new advances in NLP (e.g., Transformer, BERT).
Weekly Workload
No office hours.
Class participation: 5% of marks.
Assignments (individually graded): 3 × 15% = 45% of the total assessment; 10% off per day late.
Final Project (group work, but individually graded): 1–3 people per group; proposal: 5%, update: 5%, presentation: 10%, report: 30%.
No Lecture
Project Proposal Instructions in NTU Learn (inside Content)
Project Proposal due
Assignment 2 due
Lecture Content
Other applications of recursive NN
Modern parsers
Practical exercise with PyTorch
Suggested Readings
Lecture Content
Assignment 2 out
Practical exercise with PyTorch
Suggested Readings
Lecture Content
Assignment 1 due
Practical exercise with PyTorch
CNN for NER
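A convolutional tagger of the kind named above can be sketched in a few lines of PyTorch. This is a minimal illustration, not the course's model: the vocabulary size, embedding dimension, tag count, and random tokens below are all made-up placeholders, and a real NER system would use pretrained embeddings and a BIO-style tag set.

```python
import torch
import torch.nn as nn

# Hypothetical sizes for illustration only.
vocab_size, emb_dim, n_tags, seq_len = 100, 16, 5, 8

embed = nn.Embedding(vocab_size, emb_dim)
# 1D convolution over the token sequence; padding=1 with kernel_size=3
# keeps the sequence length, so we get one tag score vector per token.
conv = nn.Conv1d(emb_dim, 32, kernel_size=3, padding=1)
tagger = nn.Linear(32, n_tags)

tokens = torch.randint(0, vocab_size, (1, seq_len))   # a batch of one "sentence"
h = embed(tokens).transpose(1, 2)        # (batch, emb_dim, seq_len) for Conv1d
h = torch.relu(conv(h)).transpose(1, 2)  # back to (batch, seq_len, channels)
scores = tagger(h)                       # (batch, seq_len, n_tags)
print(scores.shape)  # torch.Size([1, 8, 5])
```

The key point is that the convolution lets each token's tag depend on a small window of neighboring tokens, which is what makes CNNs usable for sequence labeling tasks like NER.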
Suggested Readings
Lecture Content
Evaluating word vectors
Cross-lingual word vectors
Practical exercise with PyTorch
Visualization
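Intrinsic evaluation of word vectors, one of the topics above, typically relies on cosine similarity: related words should score higher than unrelated ones. A minimal sketch follows; the 4-dimensional vectors are toy values invented for illustration (real embeddings are typically 100–300 dimensional and learned from corpora).

```python
import numpy as np

def cosine_similarity(u, v):
    """Cosine similarity between two word vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Toy "embeddings"; the values are made up for illustration.
vecs = {
    "king":  np.array([0.9, 0.8, 0.1, 0.0]),
    "queen": np.array([0.8, 0.9, 0.1, 0.1]),
    "apple": np.array([0.0, 0.1, 0.9, 0.8]),
}

# A sanity check of the kind intrinsic evaluations perform:
# semantically related words should be closer than unrelated ones.
assert cosine_similarity(vecs["king"], vecs["queen"]) > \
       cosine_similarity(vecs["king"], vecs["apple"])
```

The same similarity function underlies word-analogy tests and the neighbor lists used when visualizing embedding spaces.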
Suggested Readings
Improving Distributional Similarity with Lessons Learned from Word Embeddings
Linear Algebraic Structure of Word Senses, with Applications to Polysemy
Lecture Content
Why Deep Learning for NLP?
From Logistic Regression to Feed-forward NN
SGD with Backpropagation
Adaptive SGD (Adagrad, Adam, RMSProp)
Regularization (Weight Decay, Dropout, Batch normalization, Gradient clipping)
Introduction to Word Vectors
Assignment 1 out
Practical exercise with PyTorch
NumPy notebook, PyTorch notebook
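The optimization and regularization topics in this lecture can be sketched together in one short PyTorch training loop: a feed-forward network trained by backpropagation with an adaptive SGD variant (Adam), weight decay, dropout, and gradient clipping. The layer sizes, learning rate, and random toy data below are illustrative assumptions, not course specifics.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)  # reproducibility for this toy example

# A minimal feed-forward classifier (sizes are illustrative).
model = nn.Sequential(
    nn.Linear(10, 32),   # input features -> hidden layer
    nn.ReLU(),
    nn.Dropout(p=0.5),   # regularization: dropout
    nn.Linear(32, 2),    # hidden layer -> two classes
)

# Adaptive SGD variant (Adam) with weight decay (L2 regularization).
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)
loss_fn = nn.CrossEntropyLoss()

# Toy random data standing in for a real dataset.
x = torch.randn(64, 10)
y = torch.randint(0, 2, (64,))

losses = []
model.train()
for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()                                    # backpropagation
    nn.utils.clip_grad_norm_(model.parameters(), 1.0)  # gradient clipping
    optimizer.step()
    losses.append(loss.item())
```

Swapping `torch.optim.Adam` for `torch.optim.SGD`, `Adagrad`, or `RMSprop` changes only the optimizer line, which makes this loop a convenient testbed for comparing the adaptive methods listed above.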
Suggested Readings
Lecture Content
Practical exercise with PyTorch
[Supplementary]
Lecture Content
Python & PyTorch Basics
Programming in Python
[Supplementary]