# Overfitting! Unlocking the Last Key Concept in Supervised Machine Learning – Day 11, 12
I finished the course!
Today was mainly about “Decision boundary”, “Cost function of logistic regression”, “Logistic loss”, and “Gradient descent implementation for logistic regression”.
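As a quick self-check on those topics, here is a minimal NumPy sketch of the logistic loss (binary cross-entropy) and one gradient-descent update for logistic regression. This is my own illustration on a made-up toy dataset, not the course's lab code; the variable names (`w`, `b`, `alpha`) just follow the usual convention.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_loss(X, y, w, b):
    # mean binary cross-entropy: -[y*log(f) + (1-y)*log(1-f)]
    f = sigmoid(X @ w + b)
    eps = 1e-12  # guard against log(0)
    return -np.mean(y * np.log(f + eps) + (1 - y) * np.log(1 - f + eps))

def gradient_step(X, y, w, b, alpha):
    # one step of batch gradient descent on the logistic loss
    m = len(y)
    err = sigmoid(X @ w + b) - y        # prediction error per example
    w = w - alpha * (X.T @ err) / m
    b = b - alpha * err.mean()
    return w, b

# toy dataset: one feature, label is 1 when x > 0
X = np.array([[-2.0], [-1.0], [1.0], [2.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])

w, b = np.zeros(1), 0.0
for _ in range(1000):
    w, b = gradient_step(X, y, w, b, alpha=0.5)
```

After training, the loss should be well below its starting value and the rounded predictions should match the labels on this separable toy set.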
Oh boy… I was sick for almost two weeks 🤒 After that break, I’m back to dive deep into machine learning, and today we’ll revisit one of the core concepts in training models: gradient descent.
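To shake the rust off, a minimal 1-D gradient descent sketch (my own toy example, not from the course): repeatedly step opposite the derivative until the parameter settles at the minimum.

```python
# minimize J(w) = (w - 3)^2, whose minimum is at w = 3
w = 0.0
alpha = 0.1                 # learning rate
for _ in range(200):
    grad = 2 * (w - 3)      # dJ/dw
    w -= alpha * grad       # step downhill
# w has converged very close to 3
```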
Today I started with choosing the learning rate, reviewed the Jupyter lab, and learned what feature engineering is.
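A small sketch of the feature-engineering idea, using my own made-up quadratic target: a linear model on the raw feature x can’t fit a curve, but adding an engineered x² feature lets the same linear fit capture it.

```python
import numpy as np

x = np.linspace(-3, 3, 41)
y = 2 * x**2 + 1                      # target is quadratic in x

X_raw = x[:, None]                    # original feature: just x
X_eng = np.column_stack([x, x**2])    # engineered: add x^2

def fit_and_predict(X, y):
    # least-squares linear fit with an intercept column
    X1 = np.column_stack([X, np.ones(len(X))])
    theta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return X1 @ theta

err_raw = np.mean((fit_and_predict(X_raw, y) - y) ** 2)
err_eng = np.mean((fit_and_predict(X_eng, y) - y) ** 2)
# the engineered features fit this quadratic target almost exactly
```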
I didn’t get much time to work on the course in the past 5 days!!!
Today I spent 10, 30, and 60 mins reviewing previous notes, and just realized that adds up to a lot.
Day 4 was a long day for me; I only got 15 mins before bed to quickly skim through the videos “Multiple features” and “Vectorization part 1”.
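The point of that vectorization video, as I understand it, is that a loop over features and a single NumPy call compute the same dot product, but the vectorized call runs in optimized native code. A tiny sketch of my own:

```python
import numpy as np

w = np.arange(5.0)   # [0, 1, 2, 3, 4]
x = np.ones(5)

# loop version: one multiply-add per feature
dot_loop = 0.0
for j in range(len(w)):
    dot_loop += w[j] * x[j]

# vectorized version: one call, same result
dot_vec = np.dot(w, x)
assert dot_loop == dot_vec  # → both 10.0
```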
Day 2 I was busy and managed only 15 mins for the “Supervised Machine Learning” video, plus 15 mins watching “But what is a neural network? | Chapter 1, Deep learning” from 3Blue1Brown.