Section 1 : Introduction

- What is Machine Learning?
- What is Supervised Learning?
- What is Unsupervised Learning?

Section 2 : Linear Regression with One Variable

- Model Representation
- Cost Function
- Gradient Descent
- Gradient Descent for Linear Regression

Section 3 : Linear Algebra Review

- Matrices and Vectors
- Addition and Scalar Multiplication
- Matrix Vector Multiplication
- Matrix Multiplication Properties
- Inverse and Transpose

Section 4 : Linear Regression with Multiple Variables

- Setting Up Your Programming Assignment Environment
- Installing Octave on Windows
- Installing Octave on Mac OS X
- Installing Octave on GNU/Linux
- More Octave/MATLAB Resources
- Multiple Features
- Gradient Descent for Multiple Variables
- Features and Polynomial Regression
- Normal Equation
- Normal Equation Noninvertibility
- Working on and Submitting Programming Assignments

Section 5 : Octave/Matlab Tutorial

- Basic Operations
- Moving Data Around
- Computing on Data
- Plotting Data
- Control Statements: for, while, if statements
- Vectorization

Section 6 : Logistic Regression

- Classification
- Hypothesis Representation
- Decision Boundary
- Cost Function
- Simplified Cost Function and Gradient Descent
- Advanced Optimization
- Multiclass Classification: One-vs-all

Section 7 : Neural Networks: Representation

- Non-linear Hypotheses
- Neurons and the Brain
- Model Representation
- Examples and Intuitions
- Multiclass Classification
- Multi-class Classification and Neural Networks

Section 8 : Neural Networks: Learning

- Cost Function
- Backpropagation Algorithm
- Backpropagation Intuition
- Implementation Note: Unrolling Parameters
- Gradient Checking
- Random Initialization
- Autonomous Driving

Section 9 : Advice for Applying Machine Learning

- Evaluating a Hypothesis
- Model Selection and Train/Validation/Test Sets
- Diagnosing Bias vs. Variance
- Regularization and Bias/Variance
- Learning Curves
- Deciding What to Do Next Revisited
- Regularized Linear Regression and Bias/Variance

Section 10 : Machine Learning System Design

- Prioritizing What to Work On
- Error Analysis
- Error Metrics for Skewed Classes
- Trading Off Precision and Recall
- Data For Machine Learning

Section 11 : Support Vector Machines

- Optimization Objective
- Large Margin Intuition
- Mathematics Behind Large Margin Classification
- Kernels
- Using An SVM
- Support Vector Machines

Section 12 : Unsupervised Learning

- Unsupervised Learning
- K-Means Algorithm
- Optimization Objective
- Random Initialization
- Choosing the Number of Clusters

Section 13 : Dimensionality Reduction

- Motivation I: Data Compression
- Motivation II: Visualization
- Principal Component Analysis Problem Formulation
- Principal Component Analysis Algorithm
- Reconstruction from Compressed Representation
- Choosing the Number of Principal Components
- Advice for Applying PCA
- K-Means Clustering and PCA

Section 14 : Anomaly Detection

- Problem Motivation
- Gaussian Distribution
- Algorithm
- Developing and Evaluating an Anomaly Detection System
- Anomaly Detection vs. Supervised Learning
- Choosing What Features to Use
- Multivariate Gaussian Distribution
- Anomaly Detection using the Multivariate Gaussian Distribution

Section 15 : Recommender Systems

- Problem Formulation
- Content Based Recommendations
- Collaborative Filtering
- Collaborative Filtering Algorithm
- Vectorization: Low Rank Matrix Factorization
- Implementational Detail: Mean Normalization
- Anomaly Detection and Recommender Systems

Section 16 : Large Scale Machine Learning

- Learning With Large Datasets
- Stochastic Gradient Descent
- Mini-Batch Gradient Descent
- Stochastic Gradient Descent Convergence
- Map Reduce and Data Parallelism

Section 17 : Application Example

- Problem Description and Pipeline
- Sliding Windows
- Getting Lots of Data and Artificial Data