I’m not an ML person, but still. That guy has a serious gift for explaining stuff.
His video on the uncertainty principle explained stuff to me that my entire undergrad education failed to!
I'd like to challenge this idea.
I don't believe he's more gifted than other people. I believe the point is that he spent a lot of time and effort getting better at explaining things.
He considered feedback and improved his explanations over the years.
His videos are excellent because he poured himself into making them excellent, not because he has a gift.
In my experience the professors who lack this ability do so because they don't put enough effort into it, not because they were born without it.
Most likely "gift" is a slightly misused idiom here, rather than a claim that the teaching ability was obtained without effort.
Everyone can improve with practice, but some people really are gifted.
That's what makes the whole concept tick.
Being gifted and spending time to get it right are not mutually exclusive.
There is a course reader for CS109 [1]. You can download a PDF version of it.
There is also a book [2] for an excellent Caltech course [3].
[1] https://chrispiech.github.io/probabilityForComputerScientist...
[2] https://www.amazon.com/Learning-Data-Yaser-S-Abu-Mostafa/dp/...
Hm... I saw that; I used it multiple times in my comment. I was just trying to convey the meaning.
What is the right use of the word? What would be the right word to use here?
Part of training LLMs involves extensive human feedback, and many LLM makers outsource that work to Africa to save money. The LLMs then pick up and use African English.
See the link in this comment [1] for an interesting article about this.
I will add a great find for starting one's AI journey: https://www.youtube.com/watch?v=_xIwjmCH6D4 . It kind of requires intermediate CS knowledge, since the first step is "learn Python".
I actually started building my own neural network framework last week in C++! It's a great way to dig into the details of how they work. It currently supports only dense MLPs, but does so quite well, and work is underway for convolutional and pooling layers on a separate branch.
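For the curious, the core of such a dense layer is small. This is a minimal sketch (not the commenter's actual framework) of a fully connected layer computing y = ReLU(Wx + b), assuming `Dense` and its member names are hypothetical:

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

// Hypothetical minimal dense (fully connected) layer: y = ReLU(W * x + b).
struct Dense {
    std::vector<std::vector<double>> W; // weights, shape [out][in]
    std::vector<double> b;              // biases, shape [out]

    std::vector<double> forward(const std::vector<double>& x) const {
        std::vector<double> y = b; // start from the bias vector
        for (size_t i = 0; i < W.size(); ++i)
            for (size_t j = 0; j < W[i].size(); ++j)
                y[i] += W[i][j] * x[j]; // accumulate the dot product
        for (double& v : y)
            v = std::max(0.0, v); // ReLU activation
        return y;
    }
};
```

A full framework adds the backward pass (gradients with respect to W, b, and x) and an optimizer, but the forward structure above is the shape every layer type follows.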
It delves into the theoretical underpinnings of probability theory and ML, IMO better than any other course I have seen. (Yeah, Andrew Ng is legendary, but his course demands some mathematical familiarity with linear algebra topics.)
And of course, for deep learning, 3b1b is great for getting some visual introduction (https://www.youtube.com/watch?v=aircAruvnKk&list=PLZHQObOWTQ...).