So I'm looking for resources that bridge the gap: not purely computational, cookbook-style resources, but not proof-heavy textbooks either. Ideally something that builds intuition for the structures and operations that show up all over ML.
https://math.mit.edu/~gs/learningfromdata/
Although if your goal is to learn ML, you should probably focus on that first and foremost. After a while you will see which concepts from linear algebra keep appearing (for example, singular value decomposition, positive definite matrices, etc.), and you can work your way back from there.
I hadn't known about Learning from Data. Thank you for the link!
Less popular techniques like normalizing flows do need invertibility, but instead of relying on SVD they directly design transformations that are easy to invert.
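To make the "designed to be easy to invert" point concrete, here's a minimal sketch of an affine coupling layer, the standard normalizing-flow building block (the scale/shift functions `s` and `t` are hypothetical stand-ins for small neural networks):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical scale/shift "networks": fixed random linear maps for the sketch.
W_s = rng.normal(size=(2, 2)) * 0.1
W_t = rng.normal(size=(2, 2)) * 0.1
s = lambda h: np.tanh(h @ W_s)   # scale, kept bounded for stability
t = lambda h: h @ W_t            # shift

def forward(x):
    # Affine coupling: the first half passes through unchanged, the second
    # half is scaled and shifted by functions of the first half.
    x1, x2 = x[:2], x[2:]
    y2 = x2 * np.exp(s(x1)) + t(x1)
    log_det = np.sum(s(x1))      # Jacobian is triangular, so its log-det is cheap
    return np.concatenate([x1, y2]), log_det

def inverse(y):
    # Inversion needs no SVD and no general matrix inverse: undo the shift
    # and scale using the half that was left untouched.
    y1, y2 = y[:2], y[2:]
    x2 = (y2 - t(y1)) * np.exp(-s(y1))
    return np.concatenate([y1, x2])

x = rng.normal(size=4)
y, log_det = forward(x)
print(np.allclose(inverse(y), x))  # True: exact inverse by construction
```

Because the Jacobian of this map is triangular, both the inverse and the log-determinant (needed for the change-of-variables density) come essentially for free, which is exactly the design trade-off the comment describes.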
QPs are solved by finding the roots (i.e., zeros) of the KKT conditions, essentially finding the points where the gradient of the Lagrangian vanishes. For an equality-constrained QP this reduces to solving a single linear system Ax = b. Warm-starting QP solvers factorize the matrices in the QP formulation (via LU decomposition or another method) and reuse that factorization across solves. This works well if you have a linear model, because the matrices stay fixed; if the model changes, the factorization becomes obsolete.
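The steps above can be sketched on a toy equality-constrained QP (the matrices here are made up for illustration): setting the gradient of the Lagrangian to zero yields one linear KKT system, which is factorized once and reused.

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

# Toy QP: minimize 1/2 x'Qx + c'x  subject to  Ax = b.
Q = np.array([[4.0, 1.0], [1.0, 3.0]])
A = np.array([[1.0, 1.0]])
c = np.array([-1.0, -2.0])
b = np.array([1.0])

# Stationarity of the Lagrangian (gradient = 0) plus feasibility gives
# one linear system in (x, lambda):
#   [Q  A'] [x  ]   [-c]
#   [A  0 ] [lam] = [ b]
n, m = Q.shape[0], A.shape[0]
K = np.block([[Q, A.T], [A, np.zeros((m, m))]])

# Factorize once with LU; re-solving for new right-hand sides is then cheap.
# This reuse is what warm starting exploits while the model stays fixed.
lu, piv = lu_factor(K)
sol = lu_solve((lu, piv), np.concatenate([-c, b]))
x, lam = sol[:n], sol[n:]

# If Q or A change (the model changes), K changes, and the LU factors
# above are obsolete: they must be recomputed from scratch.
print(x)  # optimizer of the toy QP
```

Solving by hand (x1 + x2 = 1, plus stationarity) gives x = (0.2, 0.8), which the factorized solve reproduces.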
Apply directly... to what? IMO it is weird to learn theory (like linear algebra) expressly for practical reasons: surely one could just pick up a book on those practical applications and learn the theory along the way? And if, in the process, you end up really needing the theory, then there is no substitute for learning it, no matter how dense it is.
For example, linear algebra is very important for learning quantum mechanics. But someone who wants to learn linear algebra for that reason should read quantum mechanics textbooks, not linear algebra textbooks.