Thoughts & Writings
Exploring the intersection of AI, Ethics, and Systems through in-depth articles, tutorials, and research notes.
December 2025
The Quest for the Next Transformer: A First-Principles Exploration of Sequence Modeling
An honest examination of whether we can surpass the transformer architecture for text prediction.
December 2025
Brain-Inspired Mechanisms for Sequence Modeling: A Deep Dive Into Biological Computation
The human brain processes language on roughly 20 watts of power; training GPT-4 drew on the order of 50 million watts.