W1 - Neural Networks
Dive into neural networks with TensorFlow and Python, learning concepts like layers, forward propagation, and vectorization for efficient computations.
The second course of the Machine Learning Specialization provides a broad introduction to modern machine learning, including supervised learning (multiple linear regression, logistic regression, neural networks, and decision trees), unsupervised learning (clustering, dimensionality reduction, recommender systems), and some of the best practices used in Silicon Valley for artificial intelligence and machine learning innovation (evaluating and tuning models, taking a data-centric approach to improving performance, and more).
This week, you’ll learn about neural networks and how to use them for classification tasks. You’ll use the TensorFlow framework to build a neural network with just a few lines of code. Then, dive deeper by learning how to code up your own neural network in Python, “from scratch”. Optionally, you can learn more about how neural network computations are implemented efficiently using parallel processing (vectorization).
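To make the "from scratch" idea concrete, here is a minimal NumPy sketch of the kind of forward propagation this week covers, shown both unit-by-unit and vectorized as a single matrix multiply. The layer sizes, weights, and function names are illustrative, not the course's exact code.

```python
import numpy as np

def sigmoid(z):
    # Logistic activation function.
    return 1.0 / (1.0 + np.exp(-z))

def dense_loop(a_in, W, b):
    # Forward propagation through one dense layer, one unit at a time.
    # W has shape (n_in, n_out); column j holds unit j's weights.
    n_out = W.shape[1]
    a_out = np.zeros(n_out)
    for j in range(n_out):
        a_out[j] = sigmoid(np.dot(a_in, W[:, j]) + b[j])
    return a_out

def dense_vectorized(A_in, W, b):
    # Same computation for a whole batch with one matrix multiply.
    # A_in has shape (m, n_in); the result has shape (m, n_out).
    return sigmoid(A_in @ W + b)

# A tiny 2-layer network: 2 inputs -> 3 hidden units -> 1 output.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 3)), np.zeros(3)
W2, b2 = rng.normal(size=(3, 1)), np.zeros(1)

x = np.array([0.5, -1.0])
a1 = dense_loop(x, W1, b1)
a2 = dense_loop(a1, W2, b2)      # scalar prediction in (0, 1)

# The vectorized pass over a batch agrees with the per-example loop.
X = x.reshape(1, 2)
A2 = dense_vectorized(dense_vectorized(X, W1, b1), W2, b2)
```

The vectorized version is what makes neural network computations efficient in practice: the loop over units becomes a single matrix operation that parallel hardware can execute at once.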
W2 - Neural Network Training
This week, you’ll learn how to train your model in TensorFlow, and about other important activation functions (besides the sigmoid function) and where to use each type in a neural network. You’ll also learn how to go beyond binary classification to multiclass classification (3 or more categories), which introduces a new activation function and a new loss function. Optionally, you can learn about the difference between multiclass classification and multi-label classification. You’ll learn about the Adam optimizer and why it improves on plain gradient descent for neural network training. Finally, you’ll get a brief introduction to other layer types besides the one you’ve seen thus far.
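The new activation and loss for multiclass classification can be sketched in a few lines of NumPy. This is an illustrative sketch, not the course's exact code: `softmax` turns a layer's raw outputs (logits) into class probabilities, and the cross-entropy loss penalizes low probability on the true class.

```python
import numpy as np

def softmax(z):
    # Subtracting the max makes exp() numerically stable
    # without changing the resulting probabilities.
    ez = np.exp(z - np.max(z))
    return ez / ez.sum()

def cross_entropy(probs, y):
    # Multiclass loss: negative log-probability of the true class y.
    return float(-np.log(probs[y]))

logits = np.array([2.0, 1.0, 0.1])   # raw outputs of a linear layer
probs = softmax(logits)              # probabilities summing to 1
loss = cross_entropy(probs, 0)       # small when class 0 is likely
```

In TensorFlow you would normally let the framework combine these two steps for you (passing raw logits into the loss) for better numerical stability; that detail is covered in the week's material.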
W3 - Advice for Applying Machine Learning
This week you’ll learn best practices for training and evaluating your learning algorithms to improve performance. This will cover a wide range of useful advice about the machine learning lifecycle, tuning your model, and also improving your training data.
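One core evaluation technique from this week can be sketched as follows: fit a model on a training split, measure error on both the training split and a held-out cross-validation split, and compare the two against a baseline to diagnose bias and variance. The thresholds and function names below are illustrative assumptions, not the course's prescribed values.

```python
import numpy as np

def mse(y, yhat):
    # Mean squared error, a standard regression evaluation metric.
    return float(np.mean((y - yhat) ** 2))

def diagnose(j_train, j_cv, baseline):
    # Rough heuristic: compare errors to a baseline level of
    # performance to label the model's dominant problem.
    if j_train > baseline * 1.5:   # illustrative threshold
        return "high bias (underfitting)"
    if j_cv > j_train * 1.5:       # illustrative threshold
        return "high variance (overfitting)"
    return "looks reasonable"

# Fit a line on a training split, evaluate on a held-out CV split.
rng = np.random.default_rng(1)
x = rng.uniform(0, 10, size=100)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=100)
x_tr, y_tr, x_cv, y_cv = x[:60], y[:60], x[60:], y[60:]

w, b = np.polyfit(x_tr, y_tr, deg=1)   # returns [slope, intercept]
j_train = mse(y_tr, w * x_tr + b)
j_cv = mse(y_cv, w * x_cv + b)
verdict = diagnose(j_train, j_cv, baseline=0.25)
```

The key habit is always computing the cross-validation error on data the model never trained on; comparing it to the training error tells you whether to gather more data, add features, or regularize.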
W4 - Decision Trees
This week, you’ll learn about a practical and very commonly used learning algorithm: the decision tree. You’ll also learn about variations of the decision tree, including random forests and boosted trees (XGBoost).
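The core quantity a decision tree uses to choose splits is information gain, the reduction in entropy (impurity) from splitting a node. Here is a minimal NumPy sketch for binary labels; the function names are illustrative, not the course's exact code.

```python
import numpy as np

def entropy(p):
    # Entropy of a node where a fraction p of examples are positive.
    if p == 0 or p == 1:
        return 0.0
    return float(-p * np.log2(p) - (1 - p) * np.log2(1 - p))

def information_gain(y, mask):
    # Reduction in entropy from splitting node labels y (0/1 array)
    # into a left branch (mask True) and right branch (mask False),
    # weighted by the fraction of examples in each branch.
    y, mask = np.asarray(y), np.asarray(mask)
    left, right = y[mask], y[~mask]
    w_left = len(left) / len(y)
    h = lambda s: entropy(s.mean()) if len(s) else 0.0
    return h(y) - (w_left * h(left) + (1 - w_left) * h(right))

# A perfect split of a 50/50 mixed node removes all impurity:
y = [1, 1, 0, 0]
gain = information_gain(y, [True, True, False, False])  # = 1.0
```

A tree builder evaluates this gain for every candidate feature at a node and splits on the one with the highest value, recursing until a stopping criterion is met; ensembles like random forests and XGBoost repeat this process over many trees.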
Weekly Summary
W1: Dive into neural networks with TensorFlow and Python, learning concepts like layers, forward propagation, and vectorization for efficient computations.
W2: Discover neural network training, activation functions, multiclass classification, and advanced optimization techniques in TensorFlow.
W3: Gain insights into machine learning best practices, including evaluating models, diagnosing bias and variance, and leveraging transfer learning to enhance model performance.
W4: Explore decision trees, from the basics of their structure and learning process to advanced concepts like tree ensembles, random forests, and XGBoost.