Discussion about this post

Ryan S

Regarding homework: could be time to take a cue from sociology and have CS grad students... write papers *gasp*

Alexandre Passos

I think these days there is interesting phenomenological math, not ontological math, that you can justify well in ML. Things like "assuming you want your neural network's activations or gradients to be invariant to the number of layers, this is how you initialize / normalize / etc", or "these are useful power-law models of how neural networks learn", or "neural networks are obviously not quadratics, but if you squint and pretend that they are you can predict a lot of the curves seen during training". Which I think vibes very well with your argument that generalization is an axiom, and with the general vibe that ML made little progress while it tried to treat things as math and only got unblocked when it started treating things as physics.
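The first example above can be sketched concretely. This is a minimal illustration (not from the comment itself) of the phenomenological argument behind variance-preserving initialization: if each layer's weights are scaled by 1/sqrt(fan_in), the variance of activations stays roughly constant no matter how many layers you stack, which is exactly the "invariant to the number of layers" property the comment describes. The width and depth values are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
width, depth = 512, 50

# A random input vector with unit-variance entries.
x = rng.normal(size=width)

# Stack `depth` linear layers whose weights are scaled by 1/sqrt(fan_in),
# so each layer approximately preserves the variance of its input.
h = x.copy()
for _ in range(depth):
    W = rng.normal(size=(width, width)) / np.sqrt(width)
    h = W @ h

# The variance after 50 layers stays the same order of magnitude as the input's.
# Without the 1/sqrt(fan_in) factor, it would grow roughly like width**depth
# and overflow almost immediately.
print(np.var(x), np.var(h))
```

The same style of reasoning (track a simple statistic through the network and demand it be depth-invariant) is what motivates initializations like Xavier/Glorot and Kaiming/He, with correction factors for the nonlinearity.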

