Machine Learning


Machine learning is the science of getting computers to act without being explicitly programmed. In the past decade, machine learning has given us self-driving cars, practical speech recognition, effective web search, and a vastly improved understanding of the human genome. Machine learning is so pervasive today that you probably use it dozens of times a day without knowing it. Many researchers also think it is the best way to make progress towards human-level AI.

Target Audience

  • This course is ideal for anyone with an interest in Artificial Intelligence, Machine Learning, or Deep Learning, including Python developers, robotics engineers, and fresh graduates.


  • There are no strict technical prerequisites for this course; however, knowledge of Python and general mathematical aptitude will be beneficial.

Course Objectives

  • A form of artificial intelligence, machine learning is revolutionizing the world of computing and people's everyday digital interactions. By making it possible to process and analyse huge volumes of complex data quickly, cheaply and automatically, machine learning is critical to countless current and future applications. It powers innovative automated technologies such as recommendation engines, facial recognition, fraud protection and even self-driving cars.

    This Machine Learning course provides engineers, data scientists and other professionals with the knowledge and hands-on skills required for certification and on-the-job competency in machine learning.



Course Curriculum

Section 1 : Introduction

  • What is Machine Learning?
  • What is Supervised Learning?
  • What is Unsupervised Learning?

Section 2 : Linear Regression with One Variable

  • Model Representation
  • Cost Function
  • Gradient Descent
  • Gradient Descent for Linear Regression
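The topics above culminate in gradient descent for one-variable linear regression: the hypothesis h(x) = θ₀ + θ₁x, the cost J(θ) = (1/2m) Σ(h(xᵢ) − yᵢ)², and the simultaneous update θⱼ := θⱼ − α · ∂J/∂θⱼ. The course's assignments use Octave/MATLAB; as an illustrative sketch only, here is a Python version on made-up data:

```python
# Hypothetical sketch of batch gradient descent for one-variable
# linear regression: h(x) = theta0 + theta1 * x,
# J(theta) = (1/2m) * sum((h(x_i) - y_i)^2).

def gradient_descent(xs, ys, alpha=0.1, iterations=1000):
    m = len(xs)
    theta0, theta1 = 0.0, 0.0
    for _ in range(iterations):
        errors = [theta0 + theta1 * x - y for x, y in zip(xs, ys)]
        grad0 = sum(errors) / m
        grad1 = sum(e * x for e, x in zip(errors, xs)) / m
        # Simultaneous update of both parameters.
        theta0 -= alpha * grad0
        theta1 -= alpha * grad1
    return theta0, theta1

# Toy data generated by y = 1 + 2x, so the fit should recover those values.
t0, t1 = gradient_descent([0, 1, 2, 3], [1, 3, 5, 7])
print(round(t0, 3), round(t1, 3))
```

The learning rate α and iteration count here are arbitrary choices that happen to converge on this toy data.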

Section 3 : Linear Algebra Review

  • Matrices and Vectors
  • Addition and Scalar Multiplication
  • Matrix Vector Multiplication
  • Matrix Multiplication Properties
  • Inverse and Transpose
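As a quick refresher on the matrix-vector product covered above, (Ax)ᵢ = Σⱼ Aᵢⱼ xⱼ. A pure-Python sketch on a made-up example (in the course this is done with Octave/MATLAB matrices):

```python
# Illustrative matrix-vector product: (A x)_i = sum_j A[i][j] * x[j].
# Pure-Python sketch; in practice a numerical library would be used.

def mat_vec(A, x):
    return [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) for row in A]

A = [[1, 2],
     [3, 4],
     [5, 6]]
x = [1, 1]
print(mat_vec(A, x))  # [3, 7, 11]
```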

Section 4 : Linear Regression with Multiple Variables

  • Setting Up Your Programming Assignment Environment
  • Installing Octave on Windows
  • Installing Octave on Mac OS X
  • Installing Octave on GNU/Linux
  • More Octave/MATLAB resources
  • Multiple Features
  • Gradient Descent for Multiple Variables
  • Features and Polynomial Regression
  • Normal Equation
  • Normal Equation Noninvertibility
  • Working on and Submitting Programming Assignments
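The normal equation covered in this section solves for the parameters in closed form, θ = (XᵀX)⁻¹Xᵀy, with no learning rate or iterations. A hedged sketch in Python with NumPy standing in for Octave/MATLAB, on hypothetical data:

```python
# Sketch of the normal equation theta = (X^T X)^(-1) X^T y.
# numpy.linalg.solve is used instead of an explicit inverse, which is
# the numerically preferred way to solve the same system.
import numpy as np

X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])        # first column of ones for the intercept
y = np.array([1.0, 3.0, 5.0, 7.0])  # generated by y = 1 + 2x

theta = np.linalg.solve(X.T @ X, X.T @ y)
print(theta)  # close to [1. 2.]
```

When XᵀX is non-invertible (the "noninvertibility" topic above), a pseudoinverse-based solve is used instead.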

Section 5 : Octave/Matlab Tutorial

  • Basic Operations
  • Moving Data Around
  • Computing on Data
  • Plotting Data
  • Control Statements: for, while, if
  • Vectorization
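Vectorization, the last topic above, means replacing explicit loops with matrix operations. An illustrative Python/NumPy sketch (the course demonstrates the same idea in Octave/MATLAB):

```python
# Vectorization sketch: the loop and the vectorized expression compute
# the same hypothesis h = X @ theta; the vectorized form is shorter and
# much faster on large arrays. The data here is made up.
import numpy as np

X = np.array([[1.0, 2.0], [1.0, 3.0], [1.0, 4.0]])
theta = np.array([0.5, 2.0])

# Unvectorized: explicit loop over examples and features.
h_loop = np.array([sum(X[i, j] * theta[j] for j in range(X.shape[1]))
                   for i in range(X.shape[0])])

# Vectorized: a single matrix-vector product.
h_vec = X @ theta

print(h_loop, h_vec)  # identical results
```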

Section 6 : Logistic Regression

  • Classification
  • Hypothesis Representation
  • Decision Boundary
  • Cost Function
  • Simplified Cost Function and Gradient Descent
  • Advanced Optimization
  • Multiclass Classification: One-vs-all
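The hypothesis and cost function listed above are h(x) = 1/(1 + e^(−θᵀx)) and J(θ) = −(1/m) Σ [y log h + (1 − y) log(1 − h)]. A minimal Python sketch on hypothetical data (the course implements this in Octave/MATLAB):

```python
# Sketch of the logistic-regression hypothesis and cross-entropy cost.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def hypothesis(theta, x):
    return sigmoid(sum(t * xi for t, xi in zip(theta, x)))

def cost(theta, X, y):
    m = len(X)
    total = 0.0
    for xi, yi in zip(X, y):
        h = hypothesis(theta, xi)
        total += -yi * math.log(h) - (1 - yi) * math.log(1 - h)
    return total / m

X = [[1.0, 2.0], [1.0, -2.0]]   # first feature is the intercept term
y = [1, 0]
print(hypothesis([0.0, 1.0], X[0]))  # sigmoid(2), about 0.88
```

One-vs-all multiclass classification, the final topic, trains one such classifier per class and predicts the class whose hypothesis is largest.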

Section 7 : Neural Networks: Representation

  • Non-linear Hypotheses
  • Neurons and the Brain
  • Model Representation
  • Examples and Intuitions
  • Multiclass Classification
  • Multi-class Classification and Neural Networks

Section 8 : Neural Networks: Learning

  • Cost Function
  • Backpropagation Algorithm
  • Backpropagation Intuition
  • Implementation Note: Unrolling Parameters
  • Gradient Checking
  • Random Initialization
  • Autonomous Driving

Section 9 : Advice for Applying Machine Learning

  • Evaluating a Hypothesis
  • Model Selection and Train/Validation/Test Sets
  • Diagnosing Bias vs. Variance
  • Regularization and Bias/Variance
  • Learning Curves
  • Deciding What to Do Next Revisited
  • Regularized Linear Regression and Bias/Variance
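The train/validation/test methodology above fits models on the training set, selects among them on the validation set, and reports the final error on the held-out test set. A hedged sketch of a typical 60/20/20 split, with hypothetical data and an arbitrary seed:

```python
# Sketch of a 60/20/20 train/validation/test split. The proportions and
# shuffling seed are illustrative choices, not prescribed by the course.
import random

def split(data, seed=0):
    rng = random.Random(seed)
    data = data[:]            # copy so the caller's list is untouched
    rng.shuffle(data)
    n = len(data)
    n_train, n_val = int(0.6 * n), int(0.2 * n)
    return (data[:n_train],
            data[n_train:n_train + n_val],
            data[n_train + n_val:])

train, val, test = split(list(range(10)))
print(len(train), len(val), len(test))  # 6 2 2
```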

Section 10 : Machine Learning System Design

  • Prioritizing What to Work On
  • Error Analysis
  • Error Metrics for Skewed Classes
  • Trading Off Precision and Recall
  • Data for Machine Learning
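For the skewed-class error metrics above: precision = TP / (TP + FP), recall = TP / (TP + FN), and the F₁ score 2PR / (P + R) trades the two off in a single number. A small Python sketch with made-up labels:

```python
# Precision, recall and F1 for a binary classifier on skewed classes.

def precision_recall_f1(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

y_true = [1, 1, 1, 0, 0, 0, 0, 0]
y_pred = [1, 1, 0, 1, 0, 0, 0, 0]
print(precision_recall_f1(y_true, y_pred))  # 2/3, 2/3, 2/3 here
```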

Section 11 : Support Vector Machines

  • Optimization Objective
  • Large Margin Intuition
  • Mathematics Behind Large Margin Classification
  • Kernels
  • Using An SVM
  • Support Vector Machines
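The SVM optimization objective above can be written as hinge loss plus regularization: (1/m) Σ max(0, 1 − yᵢ(w·xᵢ + b)) + (λ/2)‖w‖². The course uses off-the-shelf solvers; purely as a toy illustration, here is subgradient descent on that objective with ±1 labels and made-up, separable data:

```python
# Toy linear SVM via subgradient descent on the hinge-loss objective.
# A simplified stand-in for the optimized solvers used in practice.

def train_svm(X, y, lam=0.01, alpha=0.1, epochs=200):
    n = len(X[0])
    m = len(X)
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        gw = [lam * wj for wj in w]        # regularization subgradient
        gb = 0.0
        for xi, yi in zip(X, y):
            margin = yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b)
            if margin < 1:                 # only margin violations contribute
                for j in range(n):
                    gw[j] -= yi * xi[j] / m
                gb -= yi / m
        w = [wj - alpha * gwj for wj, gwj in zip(w, gw)]
        b -= alpha * gb
    return w, b

X = [[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -3.0]]
y = [1, 1, -1, -1]
w, b = train_svm(X, y)
pred = [1 if sum(wj * xj for wj, xj in zip(w, xi)) + b > 0 else -1 for xi in X]
print(pred)  # matches y on this separable toy set
```

Kernels, covered above, extend this linear decision boundary to non-linear ones by replacing the inner product with a similarity function.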

Section 12 : Unsupervised Learning

  • Unsupervised Learning
  • K-Means Algorithm
  • Optimization Objective
  • Random Initialization
  • Choosing the Number of Clusters
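The K-Means algorithm above alternates two steps until convergence: assign each point to its nearest centroid, then move each centroid to the mean of its assigned points. An illustrative Python/NumPy sketch with made-up 2-D data:

```python
# Minimal K-Means sketch: cluster-assignment step, then centroid-update
# step, repeated. Random initialization picks k distinct data points.
import numpy as np

def kmeans(X, k, iterations=10, seed=0):
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iterations):
        # Assignment step: index of the nearest centroid per point.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: each centroid becomes the mean of its cluster.
        centroids = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return centroids, labels

X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
centroids, labels = kmeans(X, 2)
print(sorted(centroids.tolist()))  # the two cluster centres
```

Because K-Means can converge to local optima, the course recommends multiple random initializations, keeping the run with the lowest cost.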

Section 13 : Dimensionality Reduction

  • Motivation I: Data Compression
  • Motivation II: Visualization
  • Principal Component Analysis Problem Formulation
  • Principal Component Analysis Algorithm
  • Reconstruction from Compressed Representation
  • Choosing the Number of Principal Components
  • Advice for Applying PCA
  • K-Means Clustering and PCA
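The PCA pipeline above is: center the data, find the principal directions, project onto the top k of them (compression), and map back (reconstruction). A hedged sketch in Python/NumPy using the SVD, on hypothetical near-one-dimensional data:

```python
# PCA sketch: center, take the SVD of the centered data, project onto
# the top-k right singular vectors, then reconstruct.
import numpy as np

X = np.array([[2.0, 2.1], [0.0, 0.1], [4.0, 3.9], [1.0, 1.2], [3.0, 2.8]])
mu = X.mean(axis=0)
Xc = X - mu

U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 1
Z = Xc @ Vt[:k].T          # compressed (projected) representation
X_rec = Z @ Vt[:k] + mu    # reconstruction from k components

print(np.linalg.norm(X - X_rec))  # small: the data is nearly 1-D
```

Choosing k, the "number of principal components" topic above, is typically done by requiring that most of the variance (e.g. 99%) is retained.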

Section 14 : Anomaly Detection

  • Problem Motivation
  • Gaussian Distribution
  • Algorithm
  • Developing and Evaluating an Anomaly Detection System
  • Anomaly Detection vs. Supervised Learning
  • Choosing What Features to Use
  • Multivariate Gaussian Distribution
  • Anomaly Detection using the Multivariate Gaussian Distribution
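The basic algorithm above fits a Gaussian per feature from the training data and flags an example as anomalous when the product of per-feature densities p(x) falls below a threshold ε. An illustrative pure-Python sketch with made-up data and an arbitrary ε:

```python
# Anomaly-detection sketch: per-feature Gaussians, product density,
# threshold epsilon. Data and epsilon are hypothetical.
import math

def gaussian_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def fit(train):
    n = len(train[0])
    mus = [sum(x[j] for x in train) / len(train) for j in range(n)]
    vars_ = [sum((x[j] - mus[j]) ** 2 for x in train) / len(train)
             for j in range(n)]
    return mus, vars_

def p(x, mus, vars_):
    density = 1.0
    for xj, mu, var in zip(x, mus, vars_):
        density *= gaussian_pdf(xj, mu, var)
    return density

train = [[10.0, 5.0], [10.2, 5.1], [9.8, 4.9], [10.1, 5.0]]
mus, vars_ = fit(train)
epsilon = 1e-4
print(p([10.0, 5.0], mus, vars_) > epsilon)   # typical example
print(p([20.0, 0.0], mus, vars_) > epsilon)   # anomaly: density below epsilon
```

The multivariate Gaussian variant covered above replaces the product of independent densities with a full covariance matrix, capturing correlations between features.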

Section 15 : Recommender Systems

  • Problem Formulation
  • Content Based Recommendations
  • Collaborative Filtering
  • Collaborative Filtering Algorithm
  • Vectorization: Low Rank Matrix Factorization
  • Implementational Detail: Mean Normalization
  • Anomaly Detection and Recommender Systems
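Mean normalization, the implementational detail above, subtracts each item's mean rating over the users who rated it, so that a user with no ratings is predicted the item mean rather than zero. A small sketch with a made-up ratings matrix (None marks a missing rating):

```python
# Mean-normalization sketch for a ratings matrix (rows = items).

def mean_normalize(R):
    means = []
    normalized = []
    for row in R:
        rated = [r for r in row if r is not None]
        mu = sum(rated) / len(rated)
        means.append(mu)
        normalized.append([None if r is None else r - mu for r in row])
    return normalized, means

R = [[5, 4, None],
     [1, None, 2]]
normed, means = mean_normalize(R)
print(means)   # [4.5, 1.5]
print(normed)  # [[0.5, -0.5, None], [-0.5, None, 0.5]]
```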

Section 16 : Large Scale Machine Learning

  • Learning With Large Datasets
  • Stochastic Gradient Descent
  • Mini-Batch Gradient Descent
  • Stochastic Gradient Descent Convergence
  • Map Reduce and Data Parallelism
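Stochastic gradient descent, the core algorithm above, updates the parameters from one example at a time in random order rather than summing over the whole batch, which is what lets it scale to very large datasets. An illustrative Python sketch on noiseless toy data:

```python
# Stochastic gradient descent sketch for linear regression: one
# per-example update per step. Data, learning rate and epoch count
# are illustrative choices.
import random

def sgd(data, alpha=0.1, epochs=200, seed=0):
    rng = random.Random(seed)
    theta0, theta1 = 0.0, 0.0
    for _ in range(epochs):
        rng.shuffle(data)            # visit examples in random order
        for x, y in data:
            error = theta0 + theta1 * x - y
            theta0 -= alpha * error
            theta1 -= alpha * error * x
    return theta0, theta1

data = [(x, 1 + 2 * x) for x in range(4)]   # y = 1 + 2x exactly
t0, t1 = sgd(data)
print(round(t0, 2), round(t1, 2))  # near 1 and 2
```

Mini-batch gradient descent, also listed above, sits between the two extremes, averaging the gradient over a small batch per update.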

Section 17 : Application Example

  • Problem Description and Pipeline
  • Sliding Windows
  • Getting Lots of Data and Artificial Data