Is This AI’s February 2020 Moment?
Notes from the article that broke the internet last week
Earlier this month, an essay comparing artificial intelligence to February 2020 spread rapidly across social media, gathering tens of millions of views and reigniting anxiety about the future of white-collar work.
The analogy was stark.
In early 2020, most people saw scattered warnings about a virus overseas and dismissed them. Within weeks, daily life reorganized. Offices closed. Schools went remote. Normal evaporated.
The claim now is that AI may be sitting in a similar phase. A moment where the signals are visible, but the scale of the coming shift is widely underestimated.
The essay’s author, HyperWrite CEO Matt Shumer, argued that recent frontier model releases mark a genuine inflection point. Systems that once assisted now act autonomously. Coding was the first domino, he wrote, because AI can now help improve itself. If AI can perform most of the cognitive work that happens on a screen, such as drafting, analyzing, modeling, and coding, then knowledge work is broadly exposed.
Not eventually.
Soon.
The piece resonated because parts of it feel undeniably true.
But the real debate is not whether AI is advancing.
It is whether acceleration translates into shock, or something more complicated.
The Acceleration Case
The accelerationists argue we have crossed a threshold.
Their core claims are straightforward:
Model capability is improving nonlinearly.
AI can now contribute meaningfully to its own development.
Complex tasks that once took days now take hours.
Recursive feedback loops compress timelines.
Anthropic CEO Dario Amodei has publicly suggested AI could eliminate a significant share of entry-level white-collar jobs within one to five years.
In this view, the curve is steep. Institutions will adjust, but only after disruption is visible.
The constraint is time.
The Friction Case
Economists such as Erik Brynjolfsson urge caution.
General-purpose technologies often improve quickly while institutions reorganize slowly. Electricity transformed industry, but factories took decades to redesign around it. Computers spread rapidly, yet productivity gains lagged.
From this perspective:
AI will reshape work.
Many tasks will be automated.
Firms will not restructure overnight.
Jobs evolve before they disappear.
Automating part of a role does not automatically eliminate the role itself. Enterprises operate within legal frameworks, procurement cycles, regulatory systems, and cultural norms.
The constraint is institutional inertia. I discuss this further in a separate article.
The Skeptical Case
Some researchers, including Gary Marcus, question whether recent gains can be extrapolated cleanly into the future.
Concerns include:
Reliability limits.
Persistent hallucinations.
Overinterpretation of benchmark improvements.
The assumption that scaling continues smoothly.
This camp does not deny progress.
It questions the smoothness of the curve.
The constraint is technical uncertainty.
The Concentration Case
A quieter but increasingly important thread centers on power.
Frontier AI models require enormous capital, compute, and energy. A small number of labs build the most capable systems. If intelligence becomes infrastructure, ownership matters.
In this frame:
Enterprises rent intelligence rather than build it.
Productivity gains may accrue disproportionately to capital.
Competitive advantage depends on access tiers.
Market power may consolidate before labor markets collapse.
The defining issue may not be how many jobs disappear.
It may be who controls the intelligence layer.
The constraint is governance and capital concentration.
What Is Actually Being Debated?
All sides agree on one point.
AI capability is improving.
The disagreement lies elsewhere:
How quickly does capability translate into restructuring?
How much task automation does it take to eliminate a job?
Who captures the productivity gains?
Does scaling continue?
Does power concentrate before displacement becomes visible?
The viral essay captured one interpretation. Its clarity and urgency helped millions engage with the issue.
But acceleration is only one variable.
Friction matters.
Uncertainty matters.
Power matters.
February 2020 felt sudden only in retrospect. The signals were there long before the shock became obvious.
Whether AI unfolds the same way depends less on model capability and more on how institutions, markets, and governments respond before the inflection becomes undeniable.
That is the question worth watching.
JS
I use this space to share ideas and voices that help founders and creators think more clearly, act more intentionally, and build systems that last.
My work focuses on helping entrepreneurs bring order to growth and clarity to complexity, moving from chaos to control, and from control to sustainable growth.
Some resources you might like to try:
See your system. Find your leverage. Evaluate the key drivers of long-term performance
Discover which operating challenges are getting in the way of business growth
Upcoming Workshop for Professionals thinking about going solo
Some readers’ favorites
How Anthropic Set Off a Trillion-Dollar Software Repricing | Agentic AI and the Future of SaaS
AI Agents Are Now Hiring Humans: The Rise of RentAHuman and the Agent Economy
AI Agents Just Built Their Own Social Network. Humans Are Not Allowed to Post.
Build to Thrive | The AI Blueprint | Week of February 2nd, 2026
How I Scaled My Business Without Hiring: Building My First AI Agents for $0