Course slides, English-language videos, subtitles, etc., shared by 愛可可-愛生活

Link: https://pan.baidu.com/s/1pKsTivp#list

Official site

Link: CS231n: Convolutional Neural Networks for Visual Recognition

Deep Learning and Computer Vision: a classic artificial intelligence course

These notes accompany the Stanford CS class CS231n: Convolutional Neural Networks for Visual Recognition. 

For questions/concerns/bug reports contact Justin Johnson regarding the assignments, or contact Andrej Karpathy regarding the course notes. You can also submit a pull request directly to our git repo. 

We encourage the use of the hypothes.is extension to annotate comments and discuss these notes inline.

Spring 2019 Assignments

Assignment #1: Image Classification, kNN, SVM, Softmax, Neural Network

Assignment #2: Fully-Connected Nets, Batch Normalization, Dropout, Convolutional Nets

Assignment #3: Image Captioning with Vanilla RNNs, Image Captioning with LSTMs, Network Visualization, Style Transfer, Generative Adversarial Networks

Module 0: Preparation

Setup Instructions

Python / Numpy Tutorial

IPython Notebook Tutorial

Google Cloud Tutorial

AWS Tutorial

Module 1: Neural Networks

Image Classification: Data-driven Approach, k-Nearest Neighbor, train/val/test splits

L1/L2 distances, hyperparameter search, cross-validation
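
As a quick illustration of the distance measures listed above, here is a minimal NumPy sketch; the array shapes and names such as Xtr are made up for this example and are not the assignment's code:

```python
import numpy as np

# Hypothetical data: 5000 training images and one test image,
# each flattened to a 3072-dimensional vector (32x32x3).
Xtr = np.random.randn(5000, 3072)
x_test = np.random.randn(3072)

# L1 (Manhattan) distances from the test image to every training image.
l1 = np.sum(np.abs(Xtr - x_test), axis=1)

# L2 (Euclidean) distances, computed the same vectorized way.
l2 = np.sqrt(np.sum((Xtr - x_test) ** 2, axis=1))

# A 1-nearest-neighbor prediction simply takes the label of the closest row.
nearest = np.argmin(l1)
```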

Linear classification: Support Vector Machine, Softmax

parametric approach, bias trick, hinge loss, cross-entropy loss, L2 regularization, web demo
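
The hinge and cross-entropy losses named above fit in a few lines of NumPy; the sketch below works through a single example with hypothetical scores and is not the course's reference implementation:

```python
import numpy as np

scores = np.array([3.2, 5.1, -1.7])  # hypothetical class scores f = W x
y = 0                                # index of the correct class

# Multiclass SVM (hinge) loss with margin 1: sum margins over wrong classes.
margins = np.maximum(0, scores - scores[y] + 1)
margins[y] = 0
svm_loss = np.sum(margins)

# Softmax cross-entropy loss: normalize scores into probabilities first.
shifted = scores - np.max(scores)                 # shift for numerical stability
probs = np.exp(shifted) / np.sum(np.exp(shifted))
softmax_loss = -np.log(probs[y])
```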

Optimization: Stochastic Gradient Descent

optimization landscapes, local search, learning rate, analytic/numerical gradient
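
A minimal sketch of a stochastic gradient descent loop driven by a centered-difference numerical gradient, assuming a toy quadratic loss and made-up data shapes (not the assignment code):

```python
import numpy as np

def loss(w, X, y):
    # Toy quadratic loss, used only to exercise the update rule.
    return np.mean((X.dot(w) - y) ** 2)

def numerical_grad(f, w, h=1e-5):
    # Centered finite differences: df/dw_i ~ (f(w + h e_i) - f(w - h e_i)) / 2h.
    grad = np.zeros_like(w)
    for i in range(w.size):
        w_plus, w_minus = w.copy(), w.copy()
        w_plus[i] += h
        w_minus[i] -= h
        grad[i] = (f(w_plus) - f(w_minus)) / (2 * h)
    return grad

X, y = np.random.randn(100, 3), np.random.randn(100)
w = np.random.randn(3)
learning_rate = 1e-2
for _ in range(200):
    batch = np.random.choice(100, 32, replace=False)   # sample a minibatch
    grad = numerical_grad(lambda w_: loss(w_, X[batch], y[batch]), w)
    w -= learning_rate * grad                          # SGD parameter update
```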

Backpropagation, Intuitions

chain rule interpretation, real-valued circuits, patterns in gradient flow
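
The real-valued-circuit view can be made concrete with the toy expression f(x, y, z) = (x + y) * z; the sketch below walks the chain rule backward through it, with input values chosen arbitrarily for illustration:

```python
# Forward pass for the toy circuit f(x, y, z) = (x + y) * z.
x, y, z = -2.0, 5.0, -4.0
q = x + y          # q = 3
f = q * z          # f = -12

# Backward pass: apply the chain rule node by node.
dfdq = z           # d(q*z)/dq
dfdz = q           # d(q*z)/dz
dfdx = dfdq * 1.0  # dq/dx = 1, so df/dx = df/dq * dq/dx
dfdy = dfdq * 1.0  # dq/dy = 1, so df/dy = df/dq * dq/dy
```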

Neural Networks Part 1: Setting up the Architecture

model of a biological neuron, activation functions, neural net architecture, representational power
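
A minimal forward pass for a 2-layer network with ReLU activations, with hypothetical layer sizes chosen only for illustration:

```python
import numpy as np

# Hypothetical shapes: 3072-dim input, 100 hidden units, 10 classes.
x = np.random.randn(3072)
W1, b1 = 0.01 * np.random.randn(100, 3072), np.zeros(100)
W2, b2 = 0.01 * np.random.randn(10, 100), np.zeros(10)

h = np.maximum(0, W1.dot(x) + b1)   # hidden layer with ReLU activation
scores = W2.dot(h) + b2             # class scores (no activation on the output)
```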

Neural Networks Part 2: Setting up the Data and the Loss

preprocessing, weight initialization, batch normalization, regularization (L2/dropout), loss functions
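
A rough sketch of the data preprocessing, ReLU-scaled weight initialization, and inverted dropout mentioned above, with made-up shapes and a hypothetical keep probability p:

```python
import numpy as np

X = np.random.randn(256, 3072)        # hypothetical batch of flattened images

# Preprocessing: zero-center each feature, then scale to unit variance.
X -= np.mean(X, axis=0)
X /= (np.std(X, axis=0) + 1e-8)

# Weight initialization scaled for ReLU units ("He" initialization).
fan_in, n_hidden = 3072, 100
W = np.random.randn(n_hidden, fan_in) * np.sqrt(2.0 / fan_in)

# Inverted dropout at train time: scale by 1/p so test time needs no change.
p = 0.5
h = np.maximum(0, X.dot(W.T))
mask = (np.random.rand(*h.shape) < p) / p
h_dropped = h * mask
```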

Neural Networks Part 3: Learning and Evaluation

gradient checks, sanity checks, babysitting the learning process, momentum (+nesterov), second-order methods, Adagrad/RMSprop, hyperparameter optimization, model ensembles
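
The momentum and RMSprop update rules mentioned above, sketched in NumPy with a stand-in gradient and arbitrarily chosen hyperparameters (not the notes' exact code):

```python
import numpy as np

w = np.random.randn(10)
dw = np.random.randn(10)            # stand-in for a computed gradient
lr = 1e-3

# Momentum update: accumulate a velocity vector and step along it.
v = np.zeros_like(w)
mu = 0.9
v = mu * v - lr * dw
w_momentum = w + v

# RMSprop update: divide by a running average of squared gradient magnitudes.
cache = np.zeros_like(w)
decay, eps = 0.99, 1e-8
cache = decay * cache + (1 - decay) * dw ** 2
w_rmsprop = w - lr * dw / (np.sqrt(cache) + eps)
```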

Putting it together: Minimal Neural Network Case Study

minimal 2D toy data example
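
A hedged sketch of the kind of 2D toy data such a case study might use, here K interleaved spiral arms with N points each; the constants are illustrative, not taken verbatim from the notes:

```python
import numpy as np

N, K, D = 100, 3, 2                  # points per class, classes, dimensions
X = np.zeros((N * K, D))
y = np.zeros(N * K, dtype=int)
for k in range(K):
    ix = range(N * k, N * (k + 1))
    r = np.linspace(0.0, 1.0, N)                                      # radius
    t = np.linspace(k * 4, (k + 1) * 4, N) + np.random.randn(N) * 0.2 # angle
    X[ix] = np.c_[r * np.sin(t), r * np.cos(t)]
    y[ix] = k
```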

Module 2: Convolutional Neural Networks

Convolutional Neural Networks: Architectures, Convolution / Pooling Layers

layers, spatial arrangement, layer patterns, layer sizing patterns, AlexNet/ZFNet/VGGNet case studies, computational considerations
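
The spatial-arrangement arithmetic comes down to one formula; the sketch below encodes it and checks it against the commonly cited 227x227 input / 11x11 filter / stride-4 first-layer example:

```python
# Output spatial size of a conv layer: (W - F + 2P) / S + 1,
# where W = input width, F = filter size, P = zero padding, S = stride.
def conv_output_size(W, F, P, S):
    assert (W - F + 2 * P) % S == 0, "hyperparameters do not tile the input"
    return (W - F + 2 * P) // S + 1

# Example: a 227x227 input with 11x11 filters, stride 4, no padding
# produces a 55x55 output map.
print(conv_output_size(227, 11, 0, 4))   # -> 55
```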

Understanding and Visualizing Convolutional Neural Networks

tSNE embeddings, deconvnets, data gradients, fooling ConvNets, human comparisons

Transfer Learning and Fine-tuning Convolutional Neural Networks

Chinese subtitles

Email: huangbenjincv@163.com
