It's so difficult to find time for this.
Day 4 was a long day for me; I only had 15 minutes before bed to quickly skim through the videos "Multiple Features" and "Vectorization Part 1".
Day 5, an even longer day than yesterday... went to urgent care in the morning... then back-to-back meetings after I got back... lunch... back-to-back meetings again... had to step out again...
Anyway, that's life.
Multiple features (variables) and Vectorization
In "multiple features", Andrew uses crispy language explained how to simplify the multiple features formula by using vector and dot product.
In "Vectorization Part 1", Andrew introduced how to use NumPy to do dot product and said GPU is good at this type of calculation. Numpy function can use parallel hardware (like GPU) to make dot product fast.
In "Vectorization Part 2", Andrew further introduced why computer can do dot product fast. He used gradient descent as an example.
The labs were informative; I walked through all of them even though I already knew most of the material.
Questions to help myself learn
I created the following questions to test my knowledge later.
- Why is there an arrow on top of the variable?
  A: It's optional, but nice to have to indicate that it's a vector, not a single number.
- What does the dot product do?
  A: The dot product of two vectors W and X (two lists of numbers) is computed by multiplying the corresponding pairs of numbers and adding up the products.
- What is multiple linear regression?
  A: It's the name for the type of linear regression model with multiple input features, in contrast to "univariate regression".
- Is multivariate regression the same as "multiple linear regression"?
  A: No. Multivariate regression is something else (it refers to regression with multiple output variables).
- Why do we need vectorization?
  A: Vectorized code can perform calculations in much less time than unvectorized code on specialized hardware. This matters more when you're running algorithms on large datasets or trying to train large models, which is often the case in machine learning.
- What is x_1^(4) in the graph above?
  A: In the course notation, x_1^(4) is the value of the first feature (subscript 1) in the fourth training example (superscript (4)).