Machine Learning & Deep Learning

Machine Learning Specialization - Supervised Machine Learning : Regression and Classification

What you will learn

- Build ML models with NumPy & scikit-learn, build & train supervised models for prediction & binary classification tasks (linear, logistic regression)

- Build & train a neural network with TensorFlow to perform multi-class classification, & build & use decision trees & tree ensemble methods

- Apply best practices for ML development & use unsupervised learning techniques, including clustering & anomaly detection

- Build recommender systems with a collaborative filtering approach & a content-based deep learning method, & build a deep reinforcement learning model.

 

Supervised Machine Learning : Regression & Classification

- Build machine learning models in Python using the popular machine learning libraries NumPy & scikit-learn

- Build & train supervised machine learning models for prediction & binary classification tasks, including linear regression & logistic regression

- Skills : Linear regression, Regularization to avoid overfitting, Logistic regression for classification, Gradient descent, Supervised learning
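
Several of those skills (logistic regression, regularization, and a cost to drive gradient descent) meet in the regularized logistic-regression cost function. A minimal NumPy sketch, assuming X has shape (m, n), y holds 0/1 labels, and lambda_ is the regularization strength; the names and defaults are my own illustration, not the course's lab code:

```python
import numpy as np

def sigmoid(z):
    """Logistic function g(z) = 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

def regularized_logistic_cost(X, y, w, b, lambda_=1.0):
    """Binary cross-entropy cost with an L2 penalty on the weights (not on b)."""
    m = X.shape[0]
    f = sigmoid(X @ w + b)                          # predicted probabilities
    loss = -np.mean(y * np.log(f) + (1 - y) * np.log(1 - f))
    penalty = (lambda_ / (2 * m)) * np.sum(w ** 2)  # regularization term
    return loss + penalty
```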

 

Advanced Learning Algorithms

- Build and train a neural network with TensorFlow to perform multi-class classification

- Apply best practices for machine learning development so that your models generalize to data and tasks in the real world

- Build and use decision trees and tree ensemble methods, including random forests and boosted trees

- Skills : TensorFlow, Advice for model development, Artificial neural networks, XGBoost, Tree ensembles

 

Unsupervised Learning, Recommenders, Reinforcement Learning

- Use unsupervised learning techniques, including clustering and anomaly detection

- Build recommender systems with a collaborative filtering approach and a content-based deep learning method

- Build a deep reinforcement learning model

- Skills : Anomaly detection, Unsupervised learning, Reinforcement learning, Collaborative filtering, Recommender systems

 

 

Introduction to Machine Learning

Objective

- Define machine learning

- Define supervised learning

- Define unsupervised learning

- Write and run Python code in Jupyter notebooks

- Define a regression model

- Implement and visualize a cost function

- Implement gradient descent

- Optimize a regression model using gradient descent

 

Overview of Machine Learning

- Welcome to machine learning!

- Applications of machine learning

- Intake Survey

- [Important] Have questions, issues or ideas? Join our forum

 

Supervised vs. Unsupervised Machine Learning

- What is machine learning?

- Supervised learning part1 

- Supervised learning part2

- Unsupervised learning part1

- Unsupervised learning part2

- Jupyter Notebooks

- Python and Jupyter Notebooks

 

Practice Quiz: Supervised vs. unsupervised learning

 

Regression Model

- Linear regression model part1

- Linear regression model part2

- Optional Lab : Model representation

- Cost function formula

- Cost function intuition 

- Visualizing the cost function

- Visualization examples 

- Optional lab : Cost function 
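
The cost-function items above all revolve around one formula: J(w, b) = (1 / 2m) * sum_i (f_wb(x_i) - y_i)^2, where f_wb(x) = w*x + b. A minimal single-feature NumPy sketch of that computation (the function name and vectorized shapes are my own choices, not the lab's exact code):

```python
import numpy as np

def compute_cost(x, y, w, b):
    """Squared-error cost J(w, b) for single-feature linear regression.

    x, y: (m,) NumPy arrays; w, b: scalar parameters of f(x) = w*x + b.
    """
    m = x.shape[0]
    errors = w * x + b - y          # f_wb(x_i) - y_i for every example at once
    return np.sum(errors ** 2) / (2 * m)
```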

 

Practice Quiz : Regression Model

 

Train the model with gradient descent 

- Gradient descent

- Implementing gradient descent

- Gradient descent intuition

- Learning rate

- Gradient descent for linear regression 

- Running gradient descent 

- Optional lab: Gradient descent
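
All of the items above reduce to the same update rule, repeated until convergence: w := w - alpha * dJ/dw and b := b - alpha * dJ/db, with both parameters updated simultaneously. A minimal single-feature sketch (the default alpha and iteration count are illustrative choices, not the course's values):

```python
import numpy as np

def gradient_descent(x, y, alpha=0.01, num_iters=1000):
    """Fit f(x) = w*x + b to (x, y) by batch gradient descent.

    x, y: (m,) NumPy arrays; alpha: learning rate. Returns the learned (w, b).
    """
    m = x.shape[0]
    w, b = 0.0, 0.0
    for _ in range(num_iters):
        err = w * x + b - y        # prediction error for each example
        dj_dw = (err @ x) / m      # dJ/dw = (1/m) * sum(err_i * x_i)
        dj_db = np.sum(err) / m    # dJ/db = (1/m) * sum(err_i)
        w -= alpha * dj_dw         # simultaneous update of both parameters
        b -= alpha * dj_db
    return w, b
```

Plotting the cost every few iterations is the usual way to confirm that alpha is small enough for the cost to decrease steadily.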

 

Practice quiz : Train the model with gradient descent

 

 

 

 

Regression with multiple input variables

Objective

- Use vectorization to implement multiple linear regression

- Use feature scaling, feature engineering, and polynomial regression to improve model training 

- Implement linear regression in code

 

Multiple Linear Regression

- Multiple features

- Vectorization part1

- Vectorization part2

- Optional lab: Python, NumPy, and vectorization

- Gradient descent for multiple linear regression

- Optional Lab : Multiple linear regression
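
Vectorization here means replacing the per-feature loop with a single matrix-vector product, so the multi-feature model and its gradient look almost identical to the single-feature versions. A sketch under the usual shape assumptions (X: (m, n), w: (n,); the function names are mine, not the lab's):

```python
import numpy as np

def predict(X, w, b):
    """Vectorized prediction f(x) = X·w + b for all m examples at once."""
    return X @ w + b

def gradient(X, y, w, b):
    """Gradients of the squared-error cost for multiple linear regression."""
    m = X.shape[0]
    err = predict(X, w, b) - y     # (m,) residuals
    dj_dw = X.T @ err / m          # (n,) gradient w.r.t. each weight
    dj_db = np.sum(err) / m        # scalar gradient w.r.t. the bias
    return dj_dw, dj_db
```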

 

Practice quiz : Multiple linear regression

 

Gradient descent in practice

- Feature scaling part1

- Feature scaling part2

- Checking gradient descent for convergence 

- Choosing the learning rate

- Optional Lab : Feature scaling and learning rate

- Feature engineering 

- Polynomial regression

- Optional lab: Feature engineering and Polynomial regression

- Optional lab: Linear regression with scikit-learn
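
The scaling, feature-engineering, and scikit-learn labs chain together naturally. A hedged sketch using a scikit-learn pipeline (the toy data, the degree-2 choice, and the use of LinearRegression rather than an iterative solver are my own illustration, not the labs' code):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler

# Toy 1-D data with a quadratic trend (purely illustrative).
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=(100, 1))
y = 1.5 * x[:, 0] ** 2 - 3 * x[:, 0] + rng.normal(scale=2.0, size=100)

# Polynomial feature engineering + z-score feature scaling + linear regression.
model = make_pipeline(PolynomialFeatures(degree=2, include_bias=False),
                      StandardScaler(),
                      LinearRegression())
model.fit(x, y)
print(model.predict([[4.0]]))      # predicted y near 1.5*16 - 12 = 12
```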

 

 

 

 

 

 
