Coursera Deep Learning Specialization, Andrew Ng – Table of Contents

This post shares the table of contents of the COURSERA Deep Learning Specialization. The specialization is taught by Professor Andrew Ng and consists of five courses. For a short review of each course, written after finishing it, please refer to the posts that follow.

( ※ The table of contents is publicly available to anyone, but if it should raise any copyright concerns, please let me know and I will take this post down. )

 

Course1. Neural Networks and Deep Learning

Week1. Introduction to deep learning

Welcome to the Deep Learning Specialization
– Welcome

Introduction to Deep Learning
– What is a neural network?
– Supervised Learning with Neural Networks
– Why is Deep Learning taking off?
– About this Course
– Course Resources

Heroes of Deep Learning
– Geoffrey Hinton interview

Week2. Neural Networks Basics

Logistic Regression as a Neural Network
– Binary Classification
– Logistic Regression
– Logistic Regression Cost Function
– Gradient Descent
– Derivatives
– More Derivative Examples
– Computation Graph
– Derivatives with a Computation Graph
– Logistic Regression Gradient Descent
– Gradient Descent on m Examples

Python and Vectorization
– Vectorization
– More Vectorization Examples
– Vectorizing Logistic Regression
– Vectorizing Logistic Regression’s Gradient Output
– Broadcasting in Python
– A note on python/numpy vectors
– Quick tour of Jupyter/iPython Notebooks
– Explanation of logistic regression cost function (optional)
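
As a quick taste of what the "Vectorizing Logistic Regression" lectures above build toward, here is a minimal numpy sketch of one vectorized gradient-descent step over m examples. The toy data and variable names are my own choices, not the course's:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: n_x features, m examples stacked as columns, binary labels.
n_x, m = 4, 100
rng = np.random.default_rng(0)
X = rng.standard_normal((n_x, m))
Y = (rng.random((1, m)) > 0.5).astype(float)

w = np.zeros((n_x, 1))
b = 0.0
learning_rate = 0.01

# One vectorized gradient-descent step: no explicit loop over the m examples.
A = sigmoid(w.T @ X + b)      # forward pass, shape (1, m)
dZ = A - Y                    # gradient of the cross-entropy cost w.r.t. Z
dw = (X @ dZ.T) / m           # shape (n_x, 1)
db = np.sum(dZ) / m
w -= learning_rate * dw
b -= learning_rate * db
```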

Programming Assignments
– Python Basics with numpy (optional)
– Logistic Regression with a Neural Network mindset

Heroes of Deep Learning
– Pieter Abbeel interview

Week3. Shallow Neural Networks

– Neural Networks Overview
– Neural Network Representation
– Computing a Neural Network’s Output
– Vectorizing across multiple examples
– Explanation for Vectorized Implementation
– Activation functions
– Why do you need non-linear activation functions?
– Derivatives of activation functions
– Gradient descent for Neural Networks
– Backpropagation intuition (optional)
– Random Initialization
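
One idea from the list above that is easy to show in a few lines is "Random Initialization": initializing all weights to zero makes every hidden unit compute the same function, so the weights are initialized to small random values instead. A minimal sketch, with arbitrary toy layer sizes of my own:

```python
import numpy as np

rng = np.random.default_rng(1)
n_x, n_h, n_y = 2, 4, 1   # input, hidden, output sizes (arbitrary toy choice)

# Small random weights break symmetry; biases can safely start at zero.
W1 = rng.standard_normal((n_h, n_x)) * 0.01
b1 = np.zeros((n_h, 1))
W2 = rng.standard_normal((n_y, n_h)) * 0.01
b2 = np.zeros((n_y, 1))

# Forward pass for one batch X of shape (n_x, m), tanh hidden activation.
X = rng.standard_normal((n_x, 5))
A1 = np.tanh(W1 @ X + b1)
A2 = 1.0 / (1.0 + np.exp(-(W2 @ A1 + b2)))   # sigmoid output
print(A2.shape)   # (1, 5)
```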

Programming Assignment
– Planar data classification with a hidden layer

Heroes of Deep Learning
– Ian Goodfellow interview

Week4. Deep Neural Network

– Deep L-layer neural network
– Forward Propagation in a Deep Network
– Getting your matrix dimensions right
– Why deep representations?
– Building blocks of deep neural networks
– Forward and Backward Propagation
– Parameters vs Hyperparameters
– What does this have to do with the brain?

Programming Assignments
– Building your Deep Neural Network: Step by Step
– Deep Neural Network – Application

 

Course2. Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization

Week1. Practical aspects of Deep Learning

Setting up your Machine Learning Application
– Train/Dev/Test sets
– Bias/Variance
– Basic Recipe for Machine Learning

Regularizing your neural network
– Regularization
– Why regularization reduces overfitting?
– Dropout Regularization
– Understanding Dropout
– Other regularization methods
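
The "Dropout Regularization" lectures above are also easy to sketch. Here is a minimal example of inverted dropout applied to one activation matrix; keep_prob follows the course's naming, the rest is my own toy setup:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 5))   # activations of a hidden layer, shape (units, m)
keep_prob = 0.8

# Inverted dropout: zero out units at random, then scale up so the
# expected value of the activations stays the same, which is what
# lets us skip dropout entirely at test time.
mask = rng.random(A.shape) < keep_prob
A_dropped = (A * mask) / keep_prob
```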

Setting up your optimization problem
– Normalizing inputs
– Vanishing/Exploding gradients
– Weight Initialization for Deep Networks
– Numerical approximation of gradients
– Gradient checking
– Gradient Checking Implementation Notes
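
For the "Numerical approximation of gradients" and "Gradient checking" lectures above, a minimal sketch of a two-sided difference check against an analytic gradient might look like this (the stand-in cost function is mine, not the course's):

```python
import numpy as np

def cost(theta):
    # Stand-in scalar cost; in practice this would be the network's cost.
    return float(np.sum(theta ** 2))

def grad(theta):
    return 2 * theta   # analytic gradient of the stand-in cost

theta = np.array([1.0, -2.0, 3.0])
eps = 1e-7

# Two-sided difference approximates each partial derivative to O(eps^2).
approx = np.array([
    (cost(theta + eps * e) - cost(theta - eps * e)) / (2 * eps)
    for e in np.eye(len(theta))
])

# Relative difference should be tiny (~1e-7) if the analytic gradient is right.
g = grad(theta)
diff = np.linalg.norm(approx - g) / (np.linalg.norm(approx) + np.linalg.norm(g))
print(diff)
```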

Programming Assignments
– Initialization
– Regularization
– Gradient Checking

Heroes of Deep Learning
– Yoshua Bengio interview

Week2. Optimization algorithms

– Mini-batch gradient descent
– Understanding mini-batch gradient descent
– Exponentially weighted averages
– Understanding exponentially weighted averages
– Bias correction in exponentially weighted averages
– Gradient descent with momentum
– RMSprop
– Adam optimization algorithm
– Learning rate decay
– The problem of local optima
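
Since Adam is essentially momentum and RMSprop combined with bias correction, the whole update from the three lectures above fits in a short sketch. The hyperparameter defaults follow the values quoted in the course; everything else is a toy of my own:

```python
import numpy as np

def adam_step(w, dw, v, s, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: momentum (v) plus RMSprop (s), with bias correction."""
    v = beta1 * v + (1 - beta1) * dw          # exponentially weighted gradient
    s = beta2 * s + (1 - beta2) * dw ** 2     # exponentially weighted squared gradient
    v_hat = v / (1 - beta1 ** t)              # bias correction for early steps
    s_hat = s / (1 - beta2 ** t)
    w = w - lr * v_hat / (np.sqrt(s_hat) + eps)
    return w, v, s

# Toy usage: descend on f(w) = w^2 starting from w = 5.
w, v, s = 5.0, 0.0, 0.0
for t in range(1, 1001):
    w, v, s = adam_step(w, 2 * w, v, s, t, lr=0.05)
print(w)   # close to 0
```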

Programming Assignment
– Optimization

Heroes of Deep Learning
– Yuanqing Lin interview

Week3. Hyperparameter tuning, Batch Normalization and Programming Frameworks

Hyperparameter tuning
– Tuning process
– Using an appropriate scale to pick hyperparameters
– Hyperparameters tuning in practice: Pandas vs. Caviar

Batch Normalization
– Normalizing activations in a network
– Fitting Batch Norm into a neural network
– Why does Batch Norm work?
– Batch Norm at test time
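
A minimal sketch of the training-time Batch Norm forward pass described above (at test time the batch statistics are replaced by running averages, which this toy omits):

```python
import numpy as np

def batchnorm_forward(Z, gamma, beta, eps=1e-8):
    """Normalize each unit (row) over the mini-batch, then scale and shift."""
    mu = Z.mean(axis=1, keepdims=True)
    var = Z.var(axis=1, keepdims=True)
    Z_norm = (Z - mu) / np.sqrt(var + eps)
    return gamma * Z_norm + beta   # gamma, beta are learned per-unit parameters

rng = np.random.default_rng(3)
Z = rng.standard_normal((4, 32)) * 10 + 5    # pre-activations, shape (units, m)
gamma = np.ones((4, 1))
beta = np.zeros((4, 1))
Z_tilde = batchnorm_forward(Z, gamma, beta)
print(Z_tilde.mean(axis=1))   # ~0 per unit
```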

Multi-class classification
– Softmax Regression
– Training a softmax classifier

Introduction to programming frameworks
– Deep learning frameworks
– TensorFlow

Programming Assignment
– Tensorflow

 

Course3. Structuring Machine Learning Projects

Week1. ML Strategy (1)

Introduction to ML Strategy
– Why ML Strategy
– Orthogonalization

Setting up your goal
– Single number evaluation metric
– Satisficing and Optimizing metric
– Train/dev/test distributions
– Size of the dev and test sets
– When to change dev/test sets and metrics

Comparing to human-level performance
– Why human-level performance?
– Avoidable bias
– Understanding human-level performance
– Surpassing human-level performance
– Improving your model performance

Machine Learning flight simulator
– Bird recognition in the city of Peachtopia (case study)

Heroes of Deep Learning
– Andrej Karpathy interview

Week2. ML Strategy (2)

Error Analysis
– Carrying out error analysis
– Cleaning up incorrectly labeled data
– Build your first system quickly, then iterate

Mismatched training and dev/test set
– Training and testing on different distributions
– Bias and Variance with mismatched data distributions
– Addressing data mismatch

Learning from multiple tasks
– Transfer learning
– Multi-task learning

End-to-end deep learning
– What is end-to-end deep learning?
– Whether to use end-to-end deep learning

Machine Learning flight simulator
– Autonomous driving (case study)

Heroes of Deep Learning
– Ruslan Salakhutdinov interview

 

Course4. Convolutional Neural Networks

Week1. Foundations of Convolutional Neural Networks

Convolutional Neural Networks
– Computer Vision
– Edge Detection Example
– More Edge Detection
– Padding
– Strided Convolutions
– Convolutions Over Volume
– One Layer of a Convolutional Network
– Simple Convolutional Network Example
– Pooling Layers
– CNN Example
– Why Convolutions?
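
To make "One Layer of a Convolutional Network" above concrete, here is a deliberately naive single-filter convolution with padding and stride. The shapes and names are my own toy choices:

```python
import numpy as np

def conv2d_single(a_prev, W, b, stride=1, pad=1):
    """Naive convolution of one input volume with one filter."""
    a_pad = np.pad(a_prev, ((pad, pad), (pad, pad), (0, 0)))
    f = W.shape[0]                             # filter is f x f x n_c_prev
    n_h = (a_prev.shape[0] + 2 * pad - f) // stride + 1
    n_w = (a_prev.shape[1] + 2 * pad - f) // stride + 1
    out = np.zeros((n_h, n_w))
    for i in range(n_h):
        for j in range(n_w):
            patch = a_pad[i*stride:i*stride+f, j*stride:j*stride+f, :]
            out[i, j] = np.sum(patch * W) + b  # elementwise product, then sum
    return out

rng = np.random.default_rng(4)
a = rng.standard_normal((5, 5, 3))   # 5x5 RGB-like input
W = rng.standard_normal((3, 3, 3))   # one 3x3 filter over all 3 channels
print(conv2d_single(a, W, b=0.0).shape)   # (5, 5) with pad=1, stride=1
```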

Programming Assignments
– Convolutional Model: Step by Step
– Convolutional Model: Application

Heroes of Deep Learning
– Yann LeCun Interview

Week2. Deep convolutional models: case studies

Case studies
– Why look at case studies?
– Classic Networks
– ResNets
– Why ResNets Work
– Networks in Networks and 1×1 Convolutions
– Inception Network Motivation
– Inception Network

Practical advice for using ConvNets
– Using Open-Source Implementation
– Transfer Learning
– Data Augmentation
– State of Computer Vision

Programming Assignments
– Keras Tutorial – The Happy House (not graded)
– Residual Networks

Week3. Object detection

Detection algorithms
– Object Localization
– Landmark Detection
– Object Detection
– Convolutional Implementation of Sliding Windows
– Bounding Box Predictions
– Intersection Over Union
– Non-max Suppression
– Anchor Boxes
– YOLO Algorithm
– (Optional) Region Proposals
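
"Intersection Over Union" from the list above is simple enough to show exactly. A minimal sketch for axis-aligned boxes given as (x1, y1, x2, y2) corners, a convention I am assuming here:

```python
def iou(box1, box2):
    """Intersection over Union for boxes given as (x1, y1, x2, y2)."""
    xi1, yi1 = max(box1[0], box2[0]), max(box1[1], box2[1])
    xi2, yi2 = min(box1[2], box2[2]), min(box1[3], box2[3])
    inter = max(0.0, xi2 - xi1) * max(0.0, yi2 - yi1)
    area1 = (box1[2] - box1[0]) * (box1[3] - box1[1])
    area2 = (box2[2] - box2[0]) * (box2[3] - box2[1])
    return inter / (area1 + area2 - inter)

print(iou((0, 0, 2, 2), (1, 1, 3, 3)))   # 1/7 ≈ 0.143
```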

Programming Assignments
– Car detection with YOLOv2

Week4. Special applications: Face recognition & Neural style transfer

Face Recognition
– What is face recognition?
– One Shot Learning
– Siamese Network
– Triplet Loss
– Face Verification and Binary Classification

Neural Style Transfer
– What is neural style transfer?
– What are deep ConvNets learning?
– Cost Function
– Content Cost Function
– Style Cost Function
– 1D and 3D Generalizations
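
The "Style Cost Function" lecture above boils down to comparing Gram matrices of channel activations. A minimal single-layer sketch with random stand-in activations of my own:

```python
import numpy as np

def style_cost_layer(a_style, a_gen):
    """Style cost for one layer from the Gram matrices of its activations."""
    n_h, n_w, n_c = a_style.shape
    S = a_style.reshape(n_h * n_w, n_c)
    G = a_gen.reshape(n_h * n_w, n_c)
    gram_s = S.T @ S                    # (n_c, n_c) channel correlations
    gram_g = G.T @ G
    scale = (2 * n_h * n_w * n_c) ** 2  # the course's 1/(4 n_H^2 n_W^2 n_C^2)
    return np.sum((gram_s - gram_g) ** 2) / scale

rng = np.random.default_rng(5)
print(style_cost_layer(rng.standard_normal((4, 4, 3)),
                       rng.standard_normal((4, 4, 3))))
```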

Programming Assignments
– Art generation with Neural Style Transfer
– Face Recognition for the Happy House

 

Course5. Sequence Models

Week1. Recurrent Neural Networks

Recurrent Neural Networks
– Why sequence models
– Notation
– Recurrent Neural Network Model
– Backpropagation through time
– Different types of RNNs
– Language model and sequence generation
– Sampling novel sequences
– Vanishing gradients with RNNs
– Gated Recurrent Unit (GRU)
– Long Short Term Memory (LSTM)
– Bidirectional RNN
– Deep RNNs
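
A single time step of the vanilla "Recurrent Neural Network Model" above fits in a few lines. The dimensions and the sigmoid output are my own toy choices:

```python
import numpy as np

def rnn_cell_step(x_t, a_prev, Wax, Waa, Wya, ba, by):
    """One time step of a vanilla RNN: new hidden state and output."""
    a_t = np.tanh(Waa @ a_prev + Wax @ x_t + ba)
    y_t = 1.0 / (1.0 + np.exp(-(Wya @ a_t + by)))   # sigmoid output (toy choice)
    return a_t, y_t

rng = np.random.default_rng(6)
n_x, n_a, n_y, m = 3, 5, 2, 10    # input, hidden, output sizes and batch size
x_t = rng.standard_normal((n_x, m))
a_prev = np.zeros((n_a, m))
a_t, y_t = rnn_cell_step(
    x_t, a_prev,
    Wax=rng.standard_normal((n_a, n_x)),
    Waa=rng.standard_normal((n_a, n_a)),
    Wya=rng.standard_normal((n_y, n_a)),
    ba=np.zeros((n_a, 1)), by=np.zeros((n_y, 1)),
)
print(a_t.shape, y_t.shape)   # (5, 10) (2, 10)
```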

Programming Assignments
– Building a recurrent neural network: step by step
– Dinosaur Island – Character-Level Language Modeling
– Jazz improvisation with LSTM

Week2. Natural Language Processing & Word Embeddings

Introduction to Word Embeddings
– Word Representation
– Using word embeddings
– Properties of word embeddings
– Embedding matrix
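
The "Properties of word embeddings" lecture above (analogies via cosine similarity) is easy to sketch with made-up vectors. Real GloVe or Word2Vec embeddings have 50 to 300 dimensions; these 4-dimensional toys of mine just make the analogy come out:

```python
import numpy as np

def cosine_similarity(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Made-up 4-dimensional "embeddings", chosen so the analogy works.
emb = {
    "king":  np.array([0.9, 0.8, 0.1, 0.0]),
    "queen": np.array([0.9, 0.1, 0.8, 0.0]),
    "man":   np.array([0.1, 0.9, 0.0, 0.1]),
    "woman": np.array([0.1, 0.1, 0.9, 0.1]),
    "apple": np.array([0.0, 0.0, 0.1, 0.9]),
}

# The classic analogy test: king - man + woman should land near queen.
target = emb["king"] - emb["man"] + emb["woman"]
best = max((w for w in emb if w not in ("king", "man", "woman")),
           key=lambda w: cosine_similarity(target, emb[w]))
print(best)   # queen
```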

Learning Word Embeddings: Word2vec & GloVe
– Learning word embeddings
– Word2Vec
– Negative Sampling
– GloVe word vectors

Applications using Word Embeddings
– Sentiment Classification
– Debiasing word embeddings

Programming Assignments
– Operations on word vectors – Debiasing
– Emojify

Week3. Sequence models & Attention mechanism

Various sequence to sequence architectures
– Basic Models
– Picking the most likely sentence
– Beam Search
– Refinements to Beam Search
– Error analysis in beam search
– Bleu Score (optional)
– Attention Model Intuition
– Attention Model
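
"Beam Search" above is easy to show on a toy table of per-step probabilities. A real decoder would condition each step on the beam's prefix; this sketch of mine ignores that to keep the mechanics visible:

```python
import numpy as np

def beam_search(step_log_probs, beam_width=2):
    """Toy beam search over a fixed table of per-step log-probabilities.

    step_log_probs[t][token] gives log P(token) at step t; keeping only
    the beam_width best partial sequences at each step.
    """
    beams = [((), 0.0)]                 # (prefix, cumulative log-prob)
    for log_probs in step_log_probs:
        candidates = [
            (prefix + (tok,), score + lp)
            for prefix, score in beams
            for tok, lp in enumerate(log_probs)
        ]
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
    return beams

# Three steps over a 3-token vocabulary.
table = np.log(np.array([
    [0.5, 0.4, 0.1],
    [0.1, 0.6, 0.3],
    [0.3, 0.3, 0.4],
]))
print(beam_search(list(table), beam_width=2))
```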

Speech recognition – Audio data
– Speech recognition
– Trigger Word Detection

Conclusion
– Conclusion and thank you

Programming Assignments
– Neural Machine Translation with Attention
– Trigger word detection
