Courseware, English videos, subtitles, etc., shared by 愛可可-愛生活

Link: https://pan.baidu.com/s/1pKsTivp#list

Official site

Link: CS231n: Convolutional Neural Networks for Visual Recognition

深度學(xué)習(xí)與計(jì)算機(jī)視覺-非常經(jīng)典人工智能課程

深度學(xué)習(xí)與計(jì)算機(jī)視覺-非常經(jīng)典人工智能課程-

These notes accompany the Stanford CS class CS231n: Convolutional Neural Networks for Visual Recognition. 

For questions/concerns/bug reports contact Justin Johnson regarding the assignments, or contact Andrej Karpathy regarding the course notes. You can also submit a pull request directly to our git repo. 

We encourage the use of the hypothes.is extension to annotate and discuss these notes inline.

Spring 2019 Assignments

Assignment #1: Image Classification, kNN, SVM, Softmax, Neural Network

Assignment #2: Fully-Connected Nets, Batch Normalization, Dropout, Convolutional Nets

Assignment #3: Image Captioning with Vanilla RNNs, Image Captioning with LSTMs, Network Visualization, Style Transfer, Generative Adversarial Networks

Module 0: Preparation

Setup Instructions

Python / Numpy Tutorial

IPython Notebook Tutorial

Google Cloud Tutorial

AWS Tutorial

Module 1: Neural Networks

Image Classification: Data-driven Approach, k-Nearest Neighbor, train/val/test splits

L1/L2 distances, hyperparameter search, cross-validation
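As a rough illustration of the kNN topics above, here is a minimal NumPy sketch of a k-nearest-neighbor classifier with selectable L1/L2 distance; the array names and shapes are assumptions for this example, not code from the assignments.

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=5, distance="L2"):
    """Predict labels for X_test by majority vote among the k nearest training points."""
    if distance == "L1":
        # L1 (Manhattan) distance: sum of absolute per-feature differences
        dists = np.abs(X_test[:, None, :] - X_train[None, :, :]).sum(axis=2)
    else:
        # L2 (Euclidean) distance: square root of summed squared differences
        dists = np.sqrt(((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2))
    nearest = np.argsort(dists, axis=1)[:, :k]          # indices of the k closest neighbors
    return np.array([np.bincount(y_train[row]).argmax() for row in nearest])
```

The value of k (and the choice of distance) would be picked by cross-validation on the train/val splits, as the notes describe.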

Linear classification: Support Vector Machine, Softmax

parametric approach, bias trick, hinge loss, cross-entropy loss, L2 regularization, web demo
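A sketch of the two loss functions named above, in minimal NumPy form; it assumes a `scores` matrix of shape (N, C) and integer labels `y`, which are illustrative names rather than the assignment's.

```python
import numpy as np

def svm_hinge_loss(scores, y, delta=1.0):
    """Multiclass SVM (hinge) loss over a batch of raw class scores."""
    N = scores.shape[0]
    correct = scores[np.arange(N), y][:, None]           # score assigned to the true class
    margins = np.maximum(0, scores - correct + delta)    # hinge margin against every class
    margins[np.arange(N), y] = 0                         # the true class contributes nothing
    return margins.sum() / N

def softmax_cross_entropy_loss(scores, y):
    """Softmax / cross-entropy loss, with the usual max-shift for numerical stability."""
    N = scores.shape[0]
    shifted = scores - scores.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(N), y].mean()
```

An L2 regularization term such as `0.5 * reg * (W ** 2).sum()` would be added to either loss.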

Optimization: Stochastic Gradient Descent

optimization landscapes, local search, learning rate, analytic/numerical gradient
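One way to make the analytic/numerical gradient distinction concrete: the centered-difference formula below gives a slow but reliable numerical gradient that can be compared against an analytic one. Function and variable names here are illustrative.

```python
import numpy as np

def numerical_gradient(f, w, h=1e-5):
    """Centered-difference estimate of the gradient of f at w (used for gradient checks)."""
    grad = np.zeros_like(w)
    it = np.nditer(w, flags=["multi_index"])
    while not it.finished:
        idx = it.multi_index
        old = w[idx]
        w[idx] = old + h
        f_plus = f(w)
        w[idx] = old - h
        f_minus = f(w)
        w[idx] = old                       # restore the original value
        grad[idx] = (f_plus - f_minus) / (2 * h)
        it.iternext()
    return grad

# A vanilla SGD step then follows the (analytic) gradient downhill:
#     w -= learning_rate * grad
```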

Backpropagation, Intuitions

chain rule interpretation, real-valued circuits, patterns in gradient flow
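The chain-rule interpretation is easiest to see on a tiny real-valued circuit such as f(x, y, z) = (x + y) * z, the example used in the backpropagation notes:

```python
# Forward pass
x, y, z = -2.0, 5.0, -4.0
q = x + y            # q = 3
f = q * z            # f = -12

# Backward pass: chain rule applied node by node, from the output back to the inputs
df_dq = z            # d(q*z)/dq
df_dz = q            # d(q*z)/dz
df_dx = df_dq * 1.0  # d(x+y)/dx = 1, so the gradient passes straight through
df_dy = df_dq * 1.0  # d(x+y)/dy = 1
print(df_dx, df_dy, df_dz)   # -4.0 -4.0 3.0
```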

Neural Networks Part 1: Setting up the Architecture

model of a biological neuron, activation functions, neural net architecture, representational power
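The "neural net architecture" topic comes down to stacking affine layers with elementwise nonlinearities; a minimal two-layer forward pass (shapes and names are assumptions) looks like:

```python
import numpy as np

def two_layer_net(x, W1, b1, W2, b2):
    """Forward pass of a fully connected net: affine -> ReLU -> affine -> class scores."""
    h = np.maximum(0, x @ W1 + b1)   # hidden layer with ReLU activation
    scores = h @ W2 + b2             # raw, unnormalized class scores
    return scores
```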

Neural Networks Part 2: Setting up the Data and the Loss

preprocessing, weight initialization, batch normalization, regularization (L2/dropout), loss functions
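A compressed sketch of three of the items above (zero-centering/normalization, fan-in-scaled weight initialization, and inverted dropout); the data here is random stand-in data, not anything from the course:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))            # stand-in for an (N, D) data matrix

# Preprocessing: zero-center each feature, then scale to unit variance.
X = X - X.mean(axis=0)
X = X / (X.std(axis=0) + 1e-8)

# Weight initialization scaled by fan-in (one of several schemes discussed in the notes).
D, H = 20, 50
W = rng.normal(size=(D, H)) / np.sqrt(D)

# Inverted dropout at training time: drop units with probability 1 - p and rescale by p,
# so that no extra scaling is needed at test time.
p = 0.5
h = np.maximum(0, X @ W)                  # hidden activations (ReLU)
mask = (rng.random(h.shape) < p) / p
h_train = h * mask
```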

Neural Networks Part 3: Learning and Evaluation

gradient checks, sanity checks, babysitting the learning process, momentum (+nesterov), second-order methods, Adagrad/RMSprop, hyperparameter optimization, model ensembles
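Two of the update rules listed above, written as standalone functions; the hyperparameter values are typical defaults, not prescriptions from the notes.

```python
import numpy as np

def sgd_momentum(w, dw, v, learning_rate=1e-2, mu=0.9):
    """Momentum update: accumulate a velocity vector and step along it."""
    v = mu * v - learning_rate * dw
    return w + v, v

def rmsprop(w, dw, cache, learning_rate=1e-3, decay_rate=0.99, eps=1e-8):
    """RMSprop: per-parameter step sizes from a moving average of squared gradients."""
    cache = decay_rate * cache + (1 - decay_rate) * dw ** 2
    return w - learning_rate * dw / (np.sqrt(cache) + eps), cache
```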

Putting it together: Minimal Neural Network Case Study

minimal 2D toy data example

Module 2: Convolutional Neural Networks

Convolutional Neural Networks: Architectures, Convolution / Pooling Layers

layers, spatial arrangement, layer patterns, layer sizing patterns, AlexNet/ZFNet/VGGNet case studies, computational considerations
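The "spatial arrangement" discussion reduces to one formula for the output size of a conv (or pooling) layer, (W - F + 2P) / S + 1; a small helper makes the AlexNet first-layer numbers easy to check.

```python
def conv_output_size(W, F, P, S):
    """Spatial output size of a conv layer: input size W, filter F, padding P, stride S."""
    assert (W - F + 2 * P) % S == 0, "hyperparameters do not tile the input evenly"
    return (W - F + 2 * P) // S + 1

# AlexNet's first conv layer: 227x227 input, 11x11 filters, stride 4, no padding -> 55x55
print(conv_output_size(227, 11, 0, 4))   # 55
```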

Understanding and Visualizing Convolutional Neural Networks

tSNE embeddings, deconvnets, data gradients, fooling ConvNets, human comparisons

Transfer Learning and Fine-tuning Convolutional Neural Networks

Chinese subtitles

Email
huangbenjincv@163.com
