Machine Learning
I find theoretical math and machine learning genuinely fun. I believe that the practical techniques that work well are derived from theoretical roots.
- **Twitch on Theory In Convex Optimization**: Deriving everything we want in optimization from Taylor's theorem, and showing how small modifications to the assumptions or the design choices yield completely different families of algorithms.
- **Constraints as Guidance, Not Hindrance**: Seen from the right perspective, every hard problem we want to solve can be framed as a constraint-solving process, both in math and in life.
- **Unfolding Stochasticity Sequentially**: Modeling how stochasticity interacts across time, using the random walk as the key representative example.
- **Lend It Some Confidence**: Exploring the deep connections between statistics and probability, even at the most basic statistical level.
- **From Random Walks to Generating Molecules**: From random walks to graph transformers to hierarchical graph generation, tracing how the field learned to represent and generate graph-structured data.
- **Optimism in the Face of Uncertainty**: From the multi-armed bandit problem to AlphaGo, deriving the Upper Confidence Bound and showing why the mathematically optimal strategy for minimizing regret is to explore the unknown.
- **Walking Towards Singularity in All Reality**: From the Bellman equation to contraction mappings to Q-learning, tracing how a simple recurrence converges to the singularity: the unique optimal value in the space of all possibilities.
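The convergence the last post describes can be seen in a few lines. Below is a minimal sketch (not taken from any of the posts above) of value iteration on a hypothetical two-state, two-action MDP: because the discounted Bellman optimality operator is a gamma-contraction in the sup norm, repeated application drives any two starting guesses to the same unique fixed point.

```python
import numpy as np

gamma = 0.9  # discount factor; gamma < 1 makes the Bellman operator a contraction

# Hypothetical MDP, chosen only for illustration:
# P[a][s, s'] = transition probability, R[a][s] = expected reward
P = [np.array([[0.8, 0.2], [0.1, 0.9]]),   # action 0
     np.array([[0.5, 0.5], [0.6, 0.4]])]   # action 1
R = [np.array([1.0, 0.0]), np.array([0.0, 2.0])]

def bellman(V):
    """One application of the Bellman optimality operator T:
    (TV)(s) = max_a [ R(s, a) + gamma * sum_s' P(s'|s, a) V(s') ]."""
    return np.max([R[a] + gamma * P[a] @ V for a in range(2)], axis=0)

# Start from two very different guesses; the contraction shrinks their
# distance by a factor of gamma per step, so both reach the same V*.
V1, V2 = np.zeros(2), np.full(2, 100.0)
for _ in range(500):
    V1, V2 = bellman(V1), bellman(V2)

print(np.allclose(V1, V2))        # both iterates agree on V*
print(np.allclose(bellman(V1), V1))  # and V* is a fixed point of T
```

The same contraction argument is what guarantees that tabular Q-learning, run with appropriate step sizes, converges to the unique optimal action-value function.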