## A brief introduction

### Day 2

I was busy and managed only 15 minutes of the "Supervised Machine Learning" videos, plus 15 minutes of "But what is a neural network? | Chapter 1, Deep learning" from 3Blue1Brown.

### Day 3

I managed 30+ minutes of "Supervised Machine Learning" and spent some time reading articles, like Parul Pandey's Understanding the Mathematics behind Gradient Descent, which is really good. I like math 😂 So these notes are mixed.

## Notes

### Implementing gradient descent

#### Notation

I was struggling to write the LaTeX for the formulas, then found this table useful (source is here):

Andrew said I don't need to worry about derivatives and calculus at all. I trust him, but I still dived into my bookcase, dug out the advanced mathematics books I used in college, and spent 15 minutes reviewing them. Yes, I really don't need to. I snapped two epic shots of my college "Advanced Mathematics" book to show off my killer derivative and calculus skills - pretty sure I've unlocked Math Wizard status!

### Reading online

Okay. Reading some online articles.

> If we are able to compute the derivative of a function, we know in which direction to proceed to minimize it (for the cost function).
>
> From Parul Pandey, Understanding the Mathematics behind Gradient Descent

Parul briefly introduces the power rule and the chain rule; fortunately, I still remember them from college. I am so proud.

After reviewing various explanations of gradient descent, I truly appreciate Andrew's straightforward and precise approach! He was a bit playful at times, drawing a stick man walking down a hill…
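
Since today was all about gradient descent, here is a minimal sketch to make the "stick man walking down a hill" concrete. It is not from the course materials; the function names and the toy data are my own. For univariate linear regression the course uses the squared-error cost

$$J(w, b) = \frac{1}{2m}\sum_{i=1}^{m}\bigl(w x^{(i)} + b - y^{(i)}\bigr)^2,$$

and applying the power rule and the chain rule that Parul mentions gives

$$\frac{\partial J}{\partial w} = \frac{1}{m}\sum_{i=1}^{m}\bigl(w x^{(i)} + b - y^{(i)}\bigr)\,x^{(i)}, \qquad
\frac{\partial J}{\partial b} = \frac{1}{m}\sum_{i=1}^{m}\bigl(w x^{(i)} + b - y^{(i)}\bigr).$$

Gradient descent then just repeats the simultaneous update $w := w - \alpha\,\partial J/\partial w$, $b := b - \alpha\,\partial J/\partial b$:

```python
import numpy as np

def compute_gradients(x, y, w, b):
    """Gradients of the squared-error cost J(w, b) for univariate linear
    regression, i.e. the power-rule + chain-rule result written in NumPy."""
    m = x.shape[0]
    error = w * x + b - y          # prediction minus target, shape (m,)
    dj_dw = (error @ x) / m        # dJ/dw = (1/m) * sum(error * x)
    dj_db = error.sum() / m        # dJ/db = (1/m) * sum(error)
    return dj_dw, dj_db

def gradient_descent(x, y, alpha=0.01, num_iters=1000):
    """Repeat the simultaneous update w := w - alpha*dJ/dw, b := b - alpha*dJ/db."""
    w, b = 0.0, 0.0
    for _ in range(num_iters):
        dj_dw, dj_db = compute_gradients(x, y, w, b)
        w -= alpha * dj_dw
        b -= alpha * dj_db
    return w, b

if __name__ == "__main__":
    # Toy data generated from y = 2x + 1; the fit should recover w ≈ 2, b ≈ 1.
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y = 2.0 * x + 1.0
    w, b = gradient_descent(x, y, alpha=0.05, num_iters=2000)
    print(f"w = {w:.3f}, b = {b:.3f}")
```

On the toy data the loop walks `w` and `b` down the bowl toward roughly 2 and 1, which is exactly the stick-man-down-the-hill picture.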