Motivation
I wanted to absorb vast amounts of podcasts, videos, and articles about markets, AI, and product strategy. The problem was obvious: too much material, not enough time to process everything manually.
My solution felt elegant. I built a pipeline that automatically generates one-page summaries of podcasts and videos. The system pulls transcripts, sends them to Gemini (chosen for its long-context handling), and converts everything into digestible summaries. I prompt-engineered it to break down speakers' reasoning patterns so I could internalize their cognitive frameworks.
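For the curious, here is a rough sketch of what such a pipeline can look like. The package APIs (youtube-transcript-api, google-generativeai), the model name, and the prompt wording are illustrative assumptions, not my exact production setup:

```python
# Rough sketch of the summarization pipeline. Package APIs, model name,
# and prompt are illustrative assumptions, not an exact production setup.
import os

import google.generativeai as genai
from youtube_transcript_api import YouTubeTranscriptApi

genai.configure(api_key=os.environ["GEMINI_API_KEY"])
model = genai.GenerativeModel("gemini-1.5-pro")  # long-context model

def fetch_transcript(video_id: str) -> str:
    """Pull a YouTube transcript and flatten it into a single string."""
    segments = YouTubeTranscriptApi.get_transcript(video_id)
    return " ".join(seg["text"] for seg in segments)

def summarize(transcript: str) -> str:
    """Ask for a one-page summary that surfaces the speakers' reasoning."""
    prompt = (
        "Summarize this transcript in one page. Beyond the conclusions, "
        "break down how each speaker reasons: their assumptions, their "
        "frameworks, and the steps connecting evidence to claims.\n\n"
        + transcript
    )
    return model.generate_content(prompt).text

if __name__ == "__main__":
    print(summarize(fetch_transcript("VIDEO_ID")))  # hypothetical video ID
```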
Reading ten-plus summaries daily, I expected rapid knowledge accumulation. Instead, my recall was spectacularly low. This puzzled me: in school I had always learned quickly, especially with conceptual, word-heavy material rather than technical subjects.
The answer came when I discovered the concept of "time in material."
Time in Material
Time in material means the irreducible minimum duration your brain needs with raw content to form genuine understanding rather than surface familiarity.
Examples where shortcuts fail:
Coding intuition: You can read about programming patterns, but developing instinct for elegant solutions requires hours debugging your own messy code
Investment pattern recognition: You can study market theories, but recognizing subtle signals requires watching price action and market headlines for months
Understanding complex philosophy: Reading a summary of Kant gives you the conclusions, but grappling with the original dense prose for hours is how you develop the reasoning patterns
Your brain requires sustained exposure to raw content to form the necessary neural pathways. Robert Bjork calls the conditions that force this kind of effortful processing "desirable difficulties" - when your brain works harder to process information, it creates deeper encoding. The cognitive effort of working through complete material versus consuming pre-processed summaries determines whether you develop surface familiarity or genuine understanding.
When you spend genuine time grinding through complex ideas and building connections, your brain produces higher-quality thought patterns, deeper mental models, more nuanced reasoning capabilities. For example, when reading about market dynamics, grinding means working through multiple contradictory viewpoints, sitting with the discomfort of not immediately understanding, tracing how different economic theories connect to actual market behavior, and slowly building your own framework for why certain patterns emerge. This process cannot be compressed.
There remains no substitute for doing the mental heavy lifting yourself.
Cognitive Exhaust
But time in material produces something more valuable than just retention. Hard cognitive work generates what I call "cognitive exhaust" - the sophisticated byproduct of sustained mental effort.
AI models respond to the quality of their input. When you have developed complex mental frameworks through deep work, your prompts carry a semantic density that generates far more interesting outputs. The AI amplifies the sophistication of your cognitive exhaust rather than compensating for its absence.
Consider these examples:
Film/Game Development AI: A director who has spent years developing intuition about character psychology and narrative structure can prompt AI with nuanced creative constraints that reflect deep understanding of storytelling mechanics. Their cognitive exhaust - the sophisticated frameworks they have internalized - becomes the foundation for AI-generated content that captures subtle emotional beats and narrative complexity.
Architecture AI: An architect whose cognitive exhaust includes internalized understanding of spatial relationships, material properties, and human behavior patterns can direct AI with parametric rules that reflect years of design thinking. The AI optimizes within boundaries that only emerge from extensive time in material.
The pattern is clear: AI amplifies the quality of the human design behind it; it cannot supply that quality itself.
The Missing Piece
Most people use AI to offload heavy thinking entirely. They get technically correct answers to good questions, but they miss the grindy part where actual learning happens. Their brains never get the chance to internalize the reasoning patterns because they bypass the cognitive work required for deep encoding.
They receive information without developing the mental frameworks to generate similar insights independently.
Personal Development/Life 🌱
In my own life, I have been spending more time reading books, diving into long-form blogs, and listening to full podcasts rather than summaries:
Ask AI which specific chapters address my exact interests, then read those chapters completely
Use AI to filter podcasts, checking whether they contain rehashed content I already know (a sketch of this step follows the list)
When podcasts offer genuine new perspectives, listen to full episodes at 1.5x speed rather than reading summaries
Use AI to discover content related to abstract concepts I am exploring, leveraging its ability to ground vague intuitions in concrete recommendations
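As a concrete example of the filtering step, here is a sketch of how I think about it in code. The helper name and prompt are hypothetical, and it reuses the Gemini client from the earlier pipeline sketch:

```python
# Sketch of the podcast-filtering step. The helper name and prompt are
# hypothetical; `model` is the Gemini client from the pipeline sketch above.
def is_worth_listening(transcript: str, known_topics: list[str]) -> bool:
    """Ask the model whether an episode goes beyond familiar ground."""
    prompt = (
        "I already understand these topics well: "
        + ", ".join(known_topics)
        + ".\nDoes the following transcript offer genuinely new perspectives, "
        "or is it mostly a rehash? Answer YES or NO on the first line, then "
        "give one sentence of justification.\n\n"
        + transcript
    )
    answer = model.generate_content(prompt).text
    return answer.strip().upper().startswith("YES")
```

If it returns True, I listen to the full episode; the point is to spend AI budget on triage while keeping the actual time in material intact.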
Use AI to grease the wheels of your thinking, not replace the engine. Reduce friction, not eliminate it entirely - you still need to do the fundamental work. This positions you to direct future models like GPT-5 or Claude Opus 5 to transcend your current cognitive abilities rather than substitute for them.
The irony is complete: in our rush to make learning efficient, we risk making ourselves incapable of directing the very systems designed to augment human intelligence. Time in material remains the irreducible foundation for producing the cognitive exhaust that AI amplifies.
Note: This post was written with significant AI assistance. I provided the core insights and observations, then worked with Claude to develop, structure, and refine the ideas through back-and-forth conversation. The frameworks and personal experiences are mine, but the execution relied heavily on AI for research, organization, and polish.