Course Outline

    This course explains how to apply deep-learning algorithms to a variety of practical application problems; students learn both how to use these algorithms and why they are used. Students are expected to learn the theory in class and pick up the implementation by doing the assignments and the final project. Note: if you have already taken a similar course, e.g. Prof. Hung-Yi Lee's course, there is no need to take this one.

The course covers recent techniques in deep learning and representation learning, with an emphasis on supervised/self-supervised learning, embedding methods, metric learning, convolutional and recurrent networks, and their applications to computer vision, natural language understanding, and speech recognition.


Lecture 0 (2019/02/19): Course Logistics [slides]


Registration: [Google Form]

Lecture 1 (2019/02/26): Introduction [slides] (video)

Guest Lecture (R103): [PyTorch Tutorial]
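
For students new to PyTorch, here is a minimal sketch of the kind of pattern such a tutorial typically covers (module definition, loss, one optimizer step). The layer sizes and random data are placeholders, not material from the actual tutorial.

```python
import torch
import torch.nn as nn

# A toy two-layer network; sizes are arbitrary placeholders.
model = nn.Sequential(
    nn.Linear(10, 32),
    nn.ReLU(),
    nn.Linear(32, 2),
)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

# One training step on random data, just to show the pattern.
x = torch.randn(8, 10)          # batch of 8 examples, 10 features each
y = torch.randint(0, 2, (8,))   # fake binary labels

logits = model(x)
loss = loss_fn(logits, y)
optimizer.zero_grad()
loss.backward()                 # autograd computes all gradients
optimizer.step()                # SGD parameter update
print(loss.item())
```

Note that nn.CrossEntropyLoss takes raw logits, so no softmax layer is added before it.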

Lecture 2 (2019/03/05): Neural Network Basics [slides] (video)

Suggested Readings:

[Linear Algebra]

[Linear Algebra Slides]

[Linear Algebra Quick Review]

A1 (2019/03/05): Dialogue Response Selection [A1 pages]

Lecture 3 (2019/03/12): Backpropagation [slides] (video); a short backprop sketch follows the readings below

Word Representation [slides] (video)

Suggested Readings:

[Learning Representations]

[Vector Space Models of Semantics]

[RNNLM: Recurrent Neural Network Language Model]

[Extensions of RNNLM]

[Optimization]
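
As a hedged illustration of the Lecture 3 backpropagation topic (not code from the slides): the gradient of a tiny scalar function computed by hand via the chain rule and checked against PyTorch autograd.

```python
import torch

# f(w) = (w * x - y)^2 for scalars; by the chain rule df/dw = 2 * (w*x - y) * x.
x, y = 3.0, 2.0
w = torch.tensor(0.5, requires_grad=True)

loss = (w * x - y) ** 2
loss.backward()                      # backpropagation via autograd

manual_grad = 2 * (w.item() * x - y) * x
print(w.grad.item(), manual_grad)    # the two gradients should match (-3.0)
```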

Lecture 4 (2019/03/19): Recurrent Neural Network [slides] (video); an RNN-with-attention sketch follows the readings below

Basic Attention [slides] (video)

Suggested Readings:

[RNN for Language Understanding]

[RNN for Joint Language Understanding]

[Sequence-to-Sequence Learning]

[Neural Conversational Model]

[Neural Machine Translation with Attention]

[Summarization with Attention]

[Normalization]
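
A rough sketch tying together Lecture 4's two topics: a recurrent encoder and basic dot-product attention over its hidden states. The GRU, dimensions, and random inputs are illustrative assumptions, not assignment code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

batch, seq_len, emb_dim, hid_dim = 2, 5, 16, 32

# Toy embedded input sequence and a GRU encoder.
inputs = torch.randn(batch, seq_len, emb_dim)
encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
states, last = encoder(inputs)            # states: (batch, seq_len, hid_dim)

# Basic dot-product attention: score each encoder state against a query
# (here simply the final hidden state), then take a weighted sum.
query = last.squeeze(0)                                        # (batch, hid_dim)
scores = torch.bmm(states, query.unsqueeze(2)).squeeze(2)      # (batch, seq_len)
weights = F.softmax(scores, dim=1)                             # attention weights
context = torch.bmm(weights.unsqueeze(1), states).squeeze(1)   # (batch, hid_dim)
print(weights.shape, context.shape)
```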

A2 (2019/03/19): Contextual Embeddings [A2 pages]

Lecture 5 (2019/03/26): Word Embeddings [slides] (video); a small embedding-lookup sketch follows the readings below

Contextual Embeddings - ELMo [slides] (video)

Suggested Readings:

[Estimation of Word Representations in Vector Space]

[GloVe: Global Vectors for Word Representation]

[Sequence Tagging with BiLM]

[Learned in Translation: Contextualized Word Vectors]

[ELMo: Embeddings from Language Models]

[More Embeddings]
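
A small sketch of the word-embedding idea from Lecture 5: a lookup table of vectors queried by cosine similarity. The five-word vocabulary and randomly initialized nn.Embedding are stand-ins; real vectors would be trained (e.g. word2vec or GloVe).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy vocabulary with randomly initialized vectors; real embeddings are learned.
vocab = ["king", "queen", "man", "woman", "apple"]
emb = nn.Embedding(len(vocab), 8)

def nearest(word: str) -> str:
    """Return the other vocabulary word whose vector is most cosine-similar."""
    vectors = emb.weight.detach()          # (vocab_size, dim)
    idx = vocab.index(word)
    sims = F.cosine_similarity(vectors[idx].unsqueeze(0), vectors, dim=1)
    sims[idx] = -1.0                       # exclude the word itself
    return vocab[int(sims.argmax())]

print(nearest("king"))
```

With random vectors the nearest neighbor is meaningless; the point is only the lookup-and-compare pattern that trained embeddings make useful.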

2019/04/02: Spring Break (A1 Due)

Lecture 6 (2019/04/09): Transformer [slides] (video); a self-attention sketch follows the readings below


Contextual Embeddings - BERT [slides] (video)


Gating Mechanism [slides] (video)

Suggested readings:

[Contextual Word Representations Introduction]

[Attention is all you need]

[BERT: Pre-training of Bidirectional Transformers]

[GPT: Improving Understanding by Unsupervised Learning]

[Long Short-Term Memory]

[Gated Recurrent Unit]

[More Transformer]
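
A bare-bones sketch of the scaled dot-product self-attention at the core of the Transformer and BERT readings above: single head, no masking, no multi-head projection or positional encoding; all sizes are toy values.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelfAttention(nn.Module):
    """Single-head scaled dot-product self-attention (no masking)."""
    def __init__(self, dim: int):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        self.scale = math.sqrt(dim)

    def forward(self, x):                             # x: (batch, seq_len, dim)
        q, k, v = self.q(x), self.k(x), self.v(x)
        scores = q @ k.transpose(1, 2) / self.scale   # (batch, seq, seq)
        weights = F.softmax(scores, dim=-1)
        return weights @ v                            # (batch, seq_len, dim)

x = torch.randn(2, 5, 16)        # toy batch: 2 sequences, length 5, dim 16
print(SelfAttention(16)(x).shape)
```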

Lecture 7 (2019/04/16): Reinforcement Learning Intro [slides] (video); a tabular Q-learning sketch follows the readings below

Basic Q-Learning [slides] (video)

Suggested Readings:

[Reinforcement Learning Intro]

[Stephane Ross' thesis]

[Playing Atari with Deep Reinforcement Learning]

[Deep Reinforcement Learning with Double Q-learning]

[Dueling Network Architectures for Deep Reinforcement Learning]
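
A toy tabular Q-learning loop in the spirit of the Lecture 7 material; the four-state chain environment and the hyperparameters below are invented for illustration.

```python
import random

# A tiny deterministic chain: states 0..3, actions 0 (left) and 1 (right);
# reaching state 3 gives reward 1 and ends the episode.
n_states, n_actions = 4, 2
alpha, gamma, epsilon = 0.5, 0.9, 0.1
Q = [[0.0] * n_actions for _ in range(n_states)]

def step(s, a):
    s_next = min(s + 1, n_states - 1) if a == 1 else max(s - 1, 0)
    done = s_next == n_states - 1
    return s_next, (1.0 if done else 0.0), done

for _ in range(200):                                   # episodes
    s, done = 0, False
    while not done:
        # epsilon-greedy action selection
        if random.random() < epsilon:
            a = random.randrange(n_actions)
        else:
            a = max(range(n_actions), key=lambda act: Q[s][act])
        s_next, r, done = step(s, a)
        # Q-learning update: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
        target = r + (0.0 if done else gamma * max(Q[s_next]))
        Q[s][a] += alpha * (target - Q[s][a])
        s = s_next

print(Q)   # the "right" action should end up with the higher value in states 0-2
```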

A3 (2019/04/16): RL for Game Playing [A3 pages]

Lecture 8 (2019/04/23): Policy Gradient [slides] (video); a REINFORCE sketch follows this entry

Actor-Critic (video)

More about RL [slides] (video)

Suggested Readings:

[Asynchronous Methods for Deep Reinforcement Learning]

[Deterministic Policy Gradient Algorithms]

[Continuous Control with Deep Reinforcement Learning]

A2 Due
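
A hedged sketch of the REINFORCE policy-gradient estimator from Lecture 8: the surrogate loss is the negative log-probability of each sampled action weighted by its return. The policy network, fake trajectory, and random returns are placeholders; in practice the returns come from environment rollouts.

```python
import torch
import torch.nn as nn

# Toy policy over 3 actions from 4-dimensional states; all data below is fake.
policy = nn.Sequential(nn.Linear(4, 32), nn.ReLU(), nn.Linear(32, 3))
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)

states = torch.randn(10, 4)                # a fake trajectory of 10 steps
dist = torch.distributions.Categorical(logits=policy(states))
actions = dist.sample()                    # actions the policy actually takes
returns = torch.randn(10)                  # returns G_t (would come from the env)

# REINFORCE: maximize E[log pi(a|s) * G], i.e. minimize its negative.
loss = -(dist.log_prob(actions) * returns).mean()
optimizer.zero_grad()
loss.backward()
optimizer.step()
print(loss.item())
```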

Lecture 9 (2019/04/30): Generative Adversarial Networks [slides] (video)

(Lectured by Prof. Hung-Yi Lee)

Lecture 10 (2019/05/07): Convolutional Neural Networks [slides]
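
A minimal sketch of the convolutional building blocks covered in Lecture 10; the architecture and the assumption of 32x32 RGB inputs are illustrative only.

```python
import torch
import torch.nn as nn

# A small CNN for 32x32 RGB images; the layer sizes are illustrative only.
cnn = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),   # (3,32,32) -> (16,32,32)
    nn.ReLU(),
    nn.MaxPool2d(2),                              # -> (16,16,16)
    nn.Conv2d(16, 32, kernel_size=3, padding=1),  # -> (32,16,16)
    nn.ReLU(),
    nn.MaxPool2d(2),                              # -> (32,8,8)
    nn.Flatten(),
    nn.Linear(32 * 8 * 8, 10),                    # 10-way classifier head
)

x = torch.randn(4, 3, 32, 32)    # a fake batch of 4 images
print(cnn(x).shape)              # torch.Size([4, 10])
```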

A4 (2019/05/07): Drawing [A4 pages]

2019/05/14: Break (A3 Due)

Lecture 11 (2019/05/21): Unsupervised Learning [slides]

NLP Examples [slides]

Project Plan [slides]

Special (2019/05/28): Company Workshop, Registration: [Google Form]

2019/06/04: Break (A4 Due)

Lecture 12 (2019/06/11): Project Progress Presentation

Course and Career Discussion

Special (2019/06/18): Company Workshop, Registration: [Google Form]

Lecture 13 (2019/06/25): Final Presentation


Email: huangbenjincv@163.com
