Machine Learning


Machine learning is the science of getting computers to act without being explicitly programmed. In the past decade, machine learning has given us self-driving cars, practical speech recognition, effective web search, and a vastly improved understanding of the human genome. Machine learning is so pervasive today that you probably use it dozens of times a day without knowing it. Many researchers also think it is the best way to make progress towards human-level AI.

Target Audience

This course is ideal for anyone with an interest in Artificial Intelligence, Machine Learning, or Deep Learning. In particular, it is well suited to Python developers, robotics engineers, and recent graduates.


There are no formal prerequisites for this course; however, knowledge of Python and some mathematical aptitude will be beneficial.

Course Objectives

A form of artificial intelligence, machine learning is revolutionizing the world of computing as well as people's digital interactions. By making it possible to quickly, cheaply and automatically process and analyse huge volumes of complex data, machine learning is critical to countless new and future applications. It powers innovative automated technologies such as recommendation engines, facial recognition, fraud protection and even self-driving cars. This Machine Learning course prepares engineers, data scientists and other professionals with the knowledge and hands-on skills required for certification and job competency in machine learning.

Course Curriculum

Section 1 : Introduction
- What is Machine Learning?
- What is Supervised Learning?
- What is Unsupervised Learning?
Section 2 : Linear Regression with One Variable
- Model Representation
- Cost Function
- Gradient Descent
- Gradient Descent for Linear Regression
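The gradient-descent topics in this section can be sketched in a few lines of code. This is an illustrative example, not part of the course materials: the toy data, learning rate and iteration count below are arbitrary choices.

```python
# Hypothetical sketch: batch gradient descent for one-variable linear regression.
import numpy as np

def gradient_descent(x, y, alpha=0.1, iters=1000):
    """Fit h(x) = theta0 + theta1 * x by minimizing the squared-error cost."""
    theta0, theta1 = 0.0, 0.0
    m = len(x)
    for _ in range(iters):
        pred = theta0 + theta1 * x
        # Simultaneous update using the partial derivatives of the cost J(theta)
        grad0 = (pred - y).sum() / m
        grad1 = ((pred - y) * x).sum() / m
        theta0 -= alpha * grad0
        theta1 -= alpha * grad1
    return theta0, theta1

x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x + 1.0                    # toy data generated from a known line
t0, t1 = gradient_descent(x, y)      # should recover roughly (1.0, 2.0)
```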
Section 3 : Linear Algebra Review
- Matrices and Vectors
- Addition and Scalar Multiplication
- Matrix Vector Multiplication
- Matrix Multiplication Properties
- Inverse and Transpose
Section 4 : Linear Regression with Multiple Variables
- Setting Up Your Programming Assignment Environment
- Installing Octave on Windows
- Installing Octave on Mac OS X
- Installing Octave on GNU/Linux
- More Octave/MATLAB Resources
- Multiple Features
- Gradient Descent for Multiple Variables
- Features and Polynomial Regression
- Normal Equation
- Normal Equation Noninvertibility
- Working on and Submitting Programming Assignments
Section 5 : Octave/Matlab Tutorial
- Basic Operations
- Moving Data Around
- Computing on Data
- Plotting Data
- Control Statements: for, while, if
- Vectorization
Section 6 : Logistic Regression
- Classification
- Hypothesis Representation
- Decision Boundary
- Cost Function
- Simplified Cost Function and Gradient Descent
- Advanced Optimization
- Multiclass Classification: One-vs-all
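The hypothesis representation and cost function named in this section can be sketched as follows; the toy data and hand-picked parameter vector are illustrative assumptions, not course content.

```python
# Sketch of the logistic-regression hypothesis (sigmoid) and cross-entropy cost.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta, X, y):
    """Cross-entropy cost J(theta) averaged over m training examples."""
    h = sigmoid(X @ theta)
    m = len(y)
    return -(y * np.log(h) + (1 - y) * np.log(1 - h)).sum() / m

X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])  # bias + feature
y = np.array([0.0, 0.0, 1.0, 1.0])
theta = np.array([-3.0, 2.0])   # hand-picked: decision boundary near x = 1.5
print(cost(theta, X, y))        # lower than the cost at theta = 0
```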
Section 7 : Neural Networks: Representation
- Non-linear Hypotheses
- Neurons and the Brain
- Model Representation
- Examples and Intuitions
- Multiclass Classification
- Multi-class Classification and Neural Networks
Section 8 : Neural Networks: Learning
- Cost Function
- Backpropagation Algorithm
- Backpropagation Intuition
- Implementation Note: Unrolling Parameters
- Gradient Checking
- Random Initialization
- Autonomous Driving
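Gradient checking, one of the topics above, compares an analytic gradient against a two-sided numerical approximation. A minimal sketch, using a made-up toy cost function with a known gradient:

```python
# Gradient checking: verify an analytic gradient numerically.
import numpy as np

def J(theta):
    return (theta ** 2).sum()          # toy cost; true gradient is 2 * theta

def analytic_grad(theta):
    return 2 * theta

def numerical_grad(f, theta, eps=1e-4):
    """Two-sided finite-difference approximation of the gradient of f."""
    grad = np.zeros_like(theta)
    for i in range(len(theta)):
        e = np.zeros_like(theta)
        e[i] = eps
        grad[i] = (f(theta + e) - f(theta - e)) / (2 * eps)
    return grad

theta = np.array([1.5, -2.0, 0.3])
diff = np.linalg.norm(analytic_grad(theta) - numerical_grad(J, theta))
# diff should be tiny; a large value would signal a buggy gradient implementation
```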
Section 9 : Advice for Applying Machine Learning
- Evaluating a Hypothesis
- Model Selection and Train/Validation/Test Sets
- Diagnosing Bias vs. Variance
- Regularization and Bias/Variance
- Learning Curves
- Deciding What to Do Next Revisited
- Regularized Linear Regression and Bias/Variance
Section 10 : Machine Learning System Design
- Prioritizing What to Work On
- Error Analysis
- Error Metrics for Skewed Classes
- Trading Off Precision and Recall
- Data for Machine Learning
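The error metrics for skewed classes listed above can be sketched directly; the toy label vectors below are made up for illustration.

```python
# Precision, recall and F1 score computed from true vs. predicted labels.
def precision_recall_f1(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

y_true = [1, 0, 1, 1, 0, 0, 1, 0]   # illustrative ground-truth labels
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]   # illustrative classifier output
p, r, f1 = precision_recall_f1(y_true, y_pred)
```

Unlike raw accuracy, these metrics stay informative when one class is rare, which is the point of the "skewed classes" discussion.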
Section 11 : Support Vector Machines
- Optimization Objective
- Large Margin Intuition
- Mathematics Behind Large Margin Classification
- Kernels
- Using an SVM
- Support Vector Machines
Section 12 : Unsupervised Learning
- Unsupervised Learning
- K-Means Algorithm
- Optimization Objective
- Random Initialization
- Choosing the Number of Clusters
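The K-Means steps named above (random initialization, cluster assignment, centroid update) can be sketched briefly; the two-cluster toy data is an illustrative assumption.

```python
# Minimal K-Means sketch: assign points to nearest centroid, then recompute means.
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    rng = np.random.default_rng(seed)
    # Random initialization: pick k distinct training examples as centroids
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assignment step: index of the closest centroid for each point
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: move each centroid to the mean of its assigned points
        for j in range(k):
            if (labels == j).any():
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids, labels

X = np.array([[0.0, 0.0], [0.1, 0.2], [5.0, 5.0], [5.2, 4.9]])  # two obvious clusters
centroids, labels = kmeans(X, k=2)
```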
Section 13 : Dimensionality Reduction
- Data Compression
- Motivation II: Visualization
- Principal Component Analysis Problem Formulation
- Principal Component Analysis Algorithm
- Reconstruction from Compressed Representation
- Choosing the Number of Principal Components
- Advice for Applying PCA
- K-Means Clustering and PCA
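The PCA pipeline above (mean-normalize, find principal components, project, reconstruct) can be sketched via the singular value decomposition; the nearly one-dimensional toy data below is an illustrative assumption.

```python
# PCA sketch: compress 2-D data to 1-D and reconstruct it.
import numpy as np

X = np.array([[2.0, 4.1], [1.0, 2.0], [3.0, 6.2], [4.0, 7.9]])  # roughly y = 2x

mu = X.mean(axis=0)
Xc = X - mu                               # mean normalization
# Rows of Vt are the principal components of the centered data
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 1                                     # reduce from 2 dimensions to 1
Z = Xc @ Vt[:k].T                         # projection onto the top component
X_rec = Z @ Vt[:k] + mu                   # reconstruction from the compressed form
# Because the data is nearly rank one, X_rec stays close to X
```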
Section 14 : Anomaly Detection
- Problem Motivation
- Gaussian Distribution
- Algorithm
- Developing and Evaluating an Anomaly Detection System
- Anomaly Detection vs. Supervised Learning
- Choosing What Features to Use
- Multivariate Gaussian Distribution
- Anomaly Detection Using the Multivariate Gaussian Distribution
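The Gaussian-based recipe outlined above (fit a per-feature mean and variance, then flag examples whose density falls below a threshold) can be sketched as follows; the threshold epsilon and the toy data are illustrative assumptions.

```python
# Anomaly detection sketch with independent per-feature Gaussians.
import numpy as np

def fit_gaussian(X):
    """Estimate a mean and variance for each feature column."""
    return X.mean(axis=0), X.var(axis=0)

def p(x, mu, sigma2):
    """Density of x as a product of univariate Gaussian densities."""
    return np.prod(np.exp(-(x - mu) ** 2 / (2 * sigma2)) / np.sqrt(2 * np.pi * sigma2))

X = np.array([[9.8, 10.1], [10.2, 9.9], [10.0, 10.0], [9.9, 10.2]])  # normal behavior
mu, sigma2 = fit_gaussian(X)

epsilon = 1e-3                            # illustrative threshold
normal_point = np.array([10.0, 10.0])     # p(x) well above epsilon
anomaly = np.array([15.0, 2.0])           # p(x) far below epsilon -> flagged
```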
Section 15 : Recommender Systems
- Problem Formulation
- Content Based Recommendations
- Collaborative Filtering
- Collaborative Filtering Algorithm
- Vectorization: Low Rank Matrix Factorization
- Implementational Detail: Mean Normalization
- Anomaly Detection and Recommender Systems
Section 16 : Large Scale Machine Learning
- Learning With Large Datasets
- Stochastic Gradient Descent
- Mini-Batch Gradient Descent
- Stochastic Gradient Descent Convergence
- Map Reduce and Data Parallelism
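Stochastic gradient descent, the core idea of this section, updates the parameters one example at a time instead of scanning the whole dataset per step. A minimal sketch for the linear-regression cost; the data, learning rate and epoch count are arbitrary illustrative choices.

```python
# Stochastic gradient descent sketch for linear regression on one feature.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = 3.0 * X[:, 0] + 0.5                    # noise-free data from a known line

theta = np.zeros(2)                        # [intercept, slope]
alpha = 0.1
for epoch in range(20):
    order = rng.permutation(len(X))        # shuffle before each pass
    for i in order:
        xi = np.array([1.0, X[i, 0]])      # prepend the bias term
        error = xi @ theta - y[i]
        theta -= alpha * error * xi        # update from a single example
# theta should end up close to [0.5, 3.0]
```

Mini-batch gradient descent, also listed above, is the same idea applied to small groups of examples per update rather than one.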
Section 17 : Application Example
- Problem Description and Pipeline
- Sliding Windows
- Getting Lots of Data and Artificial Data