W1 - Practical Aspects of Deep Learning
Explore practical aspects of deep learning: initialization methods, regularization techniques like dropout, optimization, and gradient checking for neural networks.
https://www.coursera.org/learn/deep-neural-network
In the second course of the Deep Learning Specialization, you will open the deep learning black box to understand the processes that drive performance and generate good results systematically.
By the end, you will know the best practices for setting up train/dev/test sets and analyzing bias/variance when building deep learning applications; be able to use standard neural network techniques such as initialization, L2 and dropout regularization, hyperparameter tuning, batch normalization, and gradient checking; implement and apply a variety of optimization algorithms, such as mini-batch gradient descent, Momentum, RMSprop, and Adam, and check their convergence; and implement a neural network in TensorFlow.
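To make the first of those practices concrete, here is a minimal numpy sketch of setting up train/dev/test sets and reading bias and variance off the resulting errors; the 80/10/10 split and the error rates are illustrative assumptions, not values from the course.

```python
import numpy as np

def train_dev_test_split(X, y, dev_frac=0.1, test_frac=0.1, seed=0):
    """Shuffle once, then carve off dev and test sets.
    The 80/10/10 ratio is a common default, not a rule."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    n_dev, n_test = int(len(X) * dev_frac), int(len(X) * test_frac)
    dev, test, train = idx[:n_dev], idx[n_dev:n_dev + n_test], idx[n_dev + n_test:]
    return (X[train], y[train]), (X[dev], y[dev]), (X[test], y[test])

def diagnose(train_err, dev_err, target_err=0.0):
    """Gap from the target error to the train error ~ bias;
    gap from the train error to the dev error ~ variance."""
    return {"bias": train_err - target_err, "variance": dev_err - train_err}

# Synthetic data; the error rates below are made up for illustration.
X, y = np.random.randn(1000, 20), np.random.randint(0, 2, size=1000)
train_set, dev_set, test_set = train_dev_test_split(X, y)
print(diagnose(train_err=0.01, dev_err=0.11))  # high variance: regularize or add data
```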
Discover and experiment with a variety of initialization methods, apply L2 regularization and dropout to avoid model overfitting, then apply gradient checking to identify errors in a fraud detection model.
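As a sketch of the first two of those techniques (the layer sizes, keep-probability, and lambda below are my own toy choices, not the assignment's), here is an inverted-dropout forward step with He initialization and an L2-regularized cost in plain numpy:

```python
import numpy as np

def forward_with_dropout(A_prev, W, b, keep_prob=0.8):
    """One forward step with ReLU and inverted dropout: dividing by
    keep_prob preserves the expected activation, so test time needs
    no correction (dropout is simply turned off there)."""
    Z = W @ A_prev + b
    A = np.maximum(0, Z)                         # ReLU activation
    D = np.random.rand(*A.shape) < keep_prob     # random dropout mask
    return A * D / keep_prob, D

def l2_regularized_cost(data_cost, weights, lambd, m):
    """Add the L2 penalty (lambd / 2m) * sum ||W||^2 to the data cost."""
    return data_cost + (lambd / (2 * m)) * sum(np.sum(W**2) for W in weights)

# Toy usage: 3 input features, 4 hidden units, a batch of 5 examples.
A0 = np.random.randn(3, 5)
W1 = np.random.randn(4, 3) * np.sqrt(2 / 3)      # He initialization for ReLU
b1 = np.zeros((4, 1))
A1, _ = forward_with_dropout(A0, W1, b1)
print(l2_regularized_cost(0.7, [W1], lambd=0.1, m=5))
```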
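Gradient checking then compares the analytic gradient from backprop against a centered finite difference; this sketch uses a simple quadratic cost as a stand-in for a real network's loss:

```python
import numpy as np

def grad_check(f, grad_f, theta, eps=1e-7):
    """Relative difference between the analytic gradient and a
    two-sided numerical approximation. Values around 1e-7 or below
    suggest backprop is correct; around 1e-3 or above suggest a bug."""
    approx = np.zeros_like(theta)
    for i in range(theta.size):
        plus, minus = theta.copy(), theta.copy()
        plus[i] += eps
        minus[i] -= eps
        approx[i] = (f(plus) - f(minus)) / (2 * eps)
    analytic = grad_f(theta)
    return (np.linalg.norm(analytic - approx)
            / (np.linalg.norm(analytic) + np.linalg.norm(approx)))

# Toy check: f(theta) = sum(theta^2) has exact gradient 2 * theta.
theta = np.random.randn(10)
print(grad_check(lambda t: np.sum(t**2), lambda t: 2 * t, theta))  # ~1e-10
```

Remember to run the check with dropout turned off, since the numerical gradient assumes a deterministic cost.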
W2 - Optimization Algorithms
Develop your deep learning toolbox by adding more advanced optimizations, random mini-batching, and learning rate decay scheduling to speed up your models. Advance your skills with optimization algorithms such as gradient descent with Momentum, RMSprop, and Adam, and understand the real issue with local optima in high-dimensional spaces: saddle points and plateaus, rather than poor local minima, are what actually slow training down.
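A minimal numpy sketch of the Adam update makes the relationship explicit: the v term alone is gradient descent with Momentum, the s term alone is RMSprop, and Adam combines both with bias correction. The hyperparameter values are the common defaults, assumed rather than taken from the course page.

```python
import numpy as np

def adam_step(theta, grad, v, s, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update. v is the Momentum-style first moment, s is the
    RMSprop-style second moment; t counts steps from 1 for bias correction."""
    v = beta1 * v + (1 - beta1) * grad        # Momentum half
    s = beta2 * s + (1 - beta2) * grad**2     # RMSprop half
    v_hat = v / (1 - beta1**t)                # bias-corrected first moment
    s_hat = s / (1 - beta2**t)                # bias-corrected second moment
    return theta - lr * v_hat / (np.sqrt(s_hat) + eps), v, s

# Toy run: minimize f(theta) = sum(theta^2), whose gradient is 2 * theta.
theta = np.random.randn(5)
v, s = np.zeros_like(theta), np.zeros_like(theta)
for t in range(1, 2001):
    theta, v, s = adam_step(theta, 2 * theta, v, s, t, lr=0.01)
print(np.round(theta, 3))  # entries end up near zero
```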
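Random mini-batching and learning rate decay are simple enough to sketch together; the 1/(1 + decay_rate * epoch) schedule follows the course's presentation, while the batch size, data, and loop are placeholders:

```python
import numpy as np

def random_mini_batches(X, Y, batch_size=64, seed=0):
    """Reshuffle the examples (stored as columns) and slice fixed-size
    mini-batches; the final batch may be smaller than batch_size."""
    rng = np.random.default_rng(seed)
    perm = rng.permutation(X.shape[1])
    X, Y = X[:, perm], Y[:, perm]
    return [(X[:, k:k + batch_size], Y[:, k:k + batch_size])
            for k in range(0, X.shape[1], batch_size)]

def decayed_lr(lr0, epoch, decay_rate=1.0):
    """Learning rate decay: lr0 / (1 + decay_rate * epoch)."""
    return lr0 / (1 + decay_rate * epoch)

# Skeleton training loop on placeholder data (the update itself is elided).
X, Y = np.random.randn(3, 1000), np.random.randn(1, 1000)
for epoch in range(3):
    lr = decayed_lr(lr0=0.1, epoch=epoch)
    for X_batch, Y_batch in random_mini_batches(X, Y, seed=epoch):
        pass  # compute gradients on this mini-batch, then step with lr
    print(f"epoch {epoch}: lr = {lr:.4f}")
```

Reshuffling with a different seed each epoch is what makes the mini-batches random rather than a fixed partition of the data.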
W3 - Hyperparameter Tuning, Batch Normalization and Programming Frameworks
Explore TensorFlow, a deep learning framework that lets you build neural networks quickly and easily, then train a neural network on a TensorFlow dataset. Master the art of hyperparameter tuning, normalize activations with batch normalization, understand Softmax classification, and let TensorFlow handle backpropagation for you.
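As a sketch of that workflow using the Keras API (the layer sizes, random placeholder data, and hyperparameters are assumptions, not the assignment's setup): define a small network with batch normalization and a softmax output, and TensorFlow computes all the gradients for you.

```python
import numpy as np
import tensorflow as tf

# A small classifier; TensorFlow handles backpropagation automatically.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(20,)),
    tf.keras.layers.Dense(64),
    tf.keras.layers.BatchNormalization(),              # normalize activations
    tf.keras.layers.ReLU(),
    tf.keras.layers.Dense(10, activation="softmax"),   # Softmax classification
])
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Random data standing in for a real TensorFlow dataset.
X = np.random.randn(256, 20).astype("float32")
y = np.random.randint(0, 10, size=256)
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
print(model.evaluate(X, y, verbose=0))  # [loss, accuracy] on the toy data
```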