      b. Optimality
      c. Time Complexity
      d. Space Complexity
   2. Types of Search Algorithms
      a. Breadth First Search
      b. Depth First Search
      c. Uniform Cost Search
      d. Limited Depth First Search
      e. Iterative Deepening Depth First Search
III. Lecture 03 - Classical Search - Informed Search
   A. Definition
   B. Evaluation Function - f(n)
      1. Path Cost Function - g(n)
      2. Heuristic Function - h(n)
   C. Best First Search Algorithms
      1. Definition
      2. Types
         a. Greedy Best First Search
         b. A* Search
   D. Conditions for Optimality
      1. Admissibility
      2. Consistency
IV. Lecture 04 - Beyond Classical Search - Local Search
   A. Local Search
      1. Definition
      2. Measures of Local Search
         a. Completeness
         b. Optimality
      3. Environment - Partially Observable, Deterministic
      4. Types of Local Search
         a. Hill Climbing
            (1) Strict Hill Climbing
            (2) Stochastic Hill Climbing
            (3) First Choice Hill Climbing
            (4) Randomized Restart Hill Climbing
         b. Simulated Annealing
         c. Local Beam Search
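The uninformed strategies listed under Types of Search Algorithms can be illustrated with a short breadth-first search sketch. The graph, node names, and goal below are made up for illustration and are not from the lectures:

```python
from collections import deque

def bfs(graph, start, goal):
    """Breadth-first search: expands nodes shallowest-first,
    so the first path reaching the goal has the fewest edges."""
    frontier = deque([[start]])  # FIFO queue of paths
    explored = {start}
    while frontier:
        path = frontier.popleft()
        node = path[-1]
        if node == goal:
            return path
        for neighbor in graph.get(node, []):
            if neighbor not in explored:
                explored.add(neighbor)
                frontier.append(path + [neighbor])
    return None  # goal unreachable

# Hypothetical graph used only for this example.
graph = {
    "A": ["B", "C"],
    "B": ["D"],
    "C": ["D", "E"],
    "D": ["F"],
    "E": ["F"],
}
print(bfs(graph, "A", "F"))  # → ['A', 'B', 'D', 'F']
```

Swapping the FIFO queue (`popleft`) for a LIFO stack (`pop`) turns the same skeleton into depth-first search, which is why the two are usually taught together.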
         d. Genetic Algorithms
V. Lecture 05 - ML Overview and Univariate Linear Regression
   A. Definition of Machine Learning
   B. Types of Machine Learning
   C. Univariate Linear Regression
      1. Form of Model
      2. Cost Function
      3. Gradient Descent Algorithm
         a. Partial Derivative Calculations
         b. Chain Rule from Calculus
         c. Convexity and its Importance
VI. Lecture 06 - Multiple Regression
   A. Form of Model
      1. Notation
   B. Cost Function
   C. Gradient Descent Algorithm
      1. Partial Derivative Calculations
      2. Chain Rule from Calculus
   D. Vectorization and Broadcasting
      1. Reason for increased speed of vectorized code
   E. Feature Scaling
   F. Feature Engineering
VII. Lecture 07 - Classification - Logistic Regression
   A. Motivating Problem - Binary Classification
   B. Important Constraints
      1. Problem with using a linear model for classification
      2. Logistic Function - reasons this is better
      3. Cost Function
         a. Reason we cannot use the squared error function
         b. The alternative - the average loss
   C. Partial Derivatives of the Loss Function
   D. Partial Derivatives of the Cost Function
   E. Gradient Descent Implementation
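The univariate linear regression pipeline outlined in Lecture 05 (form of model, squared-error cost, gradient descent with chain-rule partial derivatives) can be sketched as follows. The toy data, learning rate, and iteration count are illustrative assumptions, not values from the course:

```python
def gradient_descent(x, y, alpha=0.05, iters=5000):
    """Fit h(x) = w*x + b by minimizing the mean squared error cost
    J(w, b) = (1/2m) * sum((w*x_i + b - y_i)^2) with gradient descent."""
    w = b = 0.0
    m = len(x)
    for _ in range(iters):
        # Chain rule: differentiating the squared residual brings down
        # a factor of x_i for dJ/dw and 1 for dJ/db.
        dw = sum((w * xi + b - yi) * xi for xi, yi in zip(x, y)) / m
        db = sum((w * xi + b - yi) for xi, yi in zip(x, y)) / m
        # Simultaneous update of both parameters.
        w -= alpha * dw
        b -= alpha * db
    return w, b

# Toy data lying exactly on y = 2x + 1 (illustrative only).
x = [1.0, 2.0, 3.0, 4.0]
y = [3.0, 5.0, 7.0, 9.0]
w, b = gradient_descent(x, y)
print(w, b)  # converges toward w ≈ 2.0, b ≈ 1.0
```

Because the squared-error cost is convex in (w, b), this descent has a single global minimum to converge to, which is the point of the "Convexity and its Importance" item above.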