This is an especially exciting time to study Natural Language Processing (NLP), which aims to enable computers to understand and automatically process human language. This course will focus on NLP fundamentals, including language models, syntactic processing, semantic processing, and discourse and pragmatics. In addition, this course will introduce various applications of NLP, including information extraction, sentiment analysis, question answering, text summarization, and machine translation. Students will digest and practice their NLP knowledge and skills by working on programming assignments, in-class quizzes, and a final project.
Course Goal
Through this course, students will gain solid theoretical knowledge and enough practical experience to design and develop their own text processing applications in the future.
You should expect a lot of programming: four programming assignments, an annotation assignment, and a final project. In addition, you will be rewarded for active class participation and penalized for little participation. Lastly, there is a final exam for this class.
The grading policy is as follows:
|Four Programming Assignments:||40%|
|Final Project:||25% (abstract: 5%; presentation, report, code, and data: 20%)|
|Final Exam (Dec. 13th, 3:30-5:30 pm):||20%|
Attendance and Make-up Policies
Every student is expected to attend class unless they have an accepted excuse. Please see Student Rule 7 (http://student-rules.tamu.edu/rule07) for details.
It is important that you work on a real NLP project so that you gain first-hand experience with basic text processing and learn to deal with the high complexity of human language in concrete applications. You are responsible for developing your own project ideas; the instructor is available to discuss and help shape the project if you like. The project should be scoped to span the semester. By the end of the semester, you should submit your code and data, write a project report of at most 8 pages, and prepare a class presentation.
Students should have taken Data Structures and Algorithms (CSCE 221).
Textbook and Material
Required textbook: Speech and Language Processing, 2nd edition, Daniel Jurafsky and James H. Martin, Prentice Hall, 2008. Relevant tutorials and papers will also be uploaded to this course page during the semester.
Academic Integrity
"An Aggie does not lie, cheat, or steal or tolerate those who do." For additional information, please visit: http://aggiehonor.tamu.edu.
Upon accepting admission to Texas A&M University, a student immediately assumes a commitment to uphold the Honor Code, to accept responsibility for learning, and to follow the philosophy and rules of the Honor System. Students will be required to state their commitment on examinations, research papers, and other academic work. Ignorance of the rules does not exclude any member of the TAMU community from the requirements or the processes of the Honor System.
Americans with Disabilities Act (ADA) Statement
The Americans with Disabilities Act (ADA) is a federal anti-discrimination statute that provides comprehensive civil rights protection for persons with disabilities. Among other things, this legislation requires that all students with disabilities be guaranteed a learning environment that provides for reasonable accommodation of their disabilities. If you believe you have a disability requiring an accommodation, please contact Disability Services, currently located in the Disability Services building at the Student Services at White Creek complex on west campus or call 979-845-1637. For additional information, visit http://disability.tamu.edu.
|Date||Topic||Slides||Notes and Material|
|08/31||Text Preprocessing and Regular Expressions||slides||p1 out|
|09/05||Text Classification and Naive Bayes||slides||naive bayes text classification|
|09/07||Discriminative Models: MaxEnt, Perceptron||slides|
|09/12||Discriminative Models: MaxEnt, Perceptron cont.||slides||p1 due, p2 out|
|09/14||Language Modeling||slides||sentence-level LM; discourse-driven LM|
|09/21||Intro to Parts-of-Speech Tagging||slides|
|09/26||Sequence Models||slides||HMM, CRF|
|09/28||Sequence Models cont.||slides|
|10/10||Intro to Parsing||slides||p2 due, annotation assignment out|
|10/12||Statistical Parsing||slides||lexicalized PCFGs|
|10/17||Statistical Parsing cont.||slides||lexicalized PCFGs|
|10/19||Intro to Dependency Parsing||slides||project abstract due|
|10/24||Intro to Semantics||slides||review annotation assignment due, p3 out|
|10/26||Distributional Semantics||slides||word vectors|
|11/02||Semantic Role Labeling||slides||SRL|
|11/07||Intro to Information Extraction (IE)||slides||p3 due|
|11/09||Semantic Lexicon Induction||slides||p4 out, unsupervised NER|
|11/14||Relation Extraction||slides||distant supervision|
|11/16||Discourse, Pragmatics, Coreference Resolution||slides|
|11/23||Thanksgiving Holiday, no class||p4 due|
|11/28||Deep Learning||slides||final term review; project due; deep learning for NLP|
|11/30||Final Project Presentations||slides||Peter|
|12/05||Final Project Presentations||slides||Juliang|
|12/07||Reading day, no class|
|12/13||Final Exam, 3:30-5:30 pm|