Marc Andreessen
Co-creator of Mosaic (first graphical web browser) and Netscape; co-founder of a16z; early shaper of the commercial internet and now a prominent voice on AI’s structural implications.
Last updated: 2026-04-13
Overview
Marc Andreessen co-created Mosaic at NCSA/UIUC in 1993 — the browser that made the internet accessible to normal people. He then co-founded Netscape (which triggered the browser wars) and later a16z, one of the most influential venture firms in Silicon Valley. He has been investing in AI from the beginning, and his current thinking is shaped by a firsthand sense of how transformative computing platforms emerge and scale.
His lens on AI draws heavily on analogies to the web’s architecture: text protocols over binary, view source as the key to democratization, leveraging latent Unix power rather than building new layers from scratch.
The 80-Year Overnight Success
Andreessen’s framing for the current AI moment: it’s an overnight success drawing on an 80-year wellspring. The original neural network paper (McCulloch and Pitts) was published in 1943. The Dartmouth workshop that named the field convened in 1956 (its organizers thought a single summer of work might crack machine intelligence). He lived through the 1980s expert systems boom and crash. AI researchers spent whole careers on this; many died before seeing it work.
“Now we know that neural networks are the correct architecture. There was a 60–70 year run where that was controversial.”
The critical structural point: this time is different not because of hype, but because it’s working. Four fundamental breakthroughs, all actually functioning:
- LLMs — language model scaling
- Reasoning — o1, R1; answered the skeptics who said “it’s just pattern completion”
- Agents — OpenClaw; the December 2024 coding breakthrough
- RSI — recursive self-improvement / auto research
The coding breakthrough was the key signal: when Linus Torvalds says AI coding is better than he is, you know it’s real. And if it works in coding (the hardest domain), it works everywhere.
The Unix Agent Architecture (Pi + OpenClaw)
Andreessen’s framing of the architectural breakthrough behind modern agents:
An agent is: LLM + bash shell + file system + markdown + cron job
Every component except the LLM was already known, understood, and battle-tested for decades. The breakthrough was composing them:
- Shell: full Unix power is already latent in the shell. Every Mac/iPhone/Linux server already runs on a shell. Enormous numbers of command-line interfaces to everything already exist. “The full power of your computer is available at the command line level.”
- File system: the agent’s state lives in files. This is the key implication — the agent is its files.
- Markdown: human-readable state representation
- Cron/loop: the heartbeat that wakes the agent up and keeps it running
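The composition above can be sketched in a few lines. This is a minimal illustration, not the actual OpenClaw implementation: `call_llm` is a hypothetical stub where any real model API would slot in, and the state file name `AGENT.md` is invented for the example.

```python
import subprocess
from pathlib import Path

STATE = Path("AGENT.md")  # human-readable markdown state: the agent *is* this file

def call_llm(prompt: str) -> str:
    """Hypothetical stub for the model call; a real agent would send the
    current state to an LLM and get back a shell command to run."""
    return "echo hello from the agent"

def tick() -> str:
    """One heartbeat (what cron would fire): read state, ask the model for a
    command, run it in the shell, append the result to the state file."""
    state = STATE.read_text() if STATE.exists() else "# Agent state\n"
    command = call_llm(state)
    result = subprocess.run(command, shell=True, capture_output=True, text=True)
    STATE.write_text(state + f"- ran `{command}`: {result.stdout.strip()}\n")
    return result.stdout.strip()
```

Scheduling `tick()` from cron supplies the heartbeat; everything else is the decades-old Unix substrate the list describes.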
Agent = files. Profound implications:
- Model-agnostic: swap out the underlying LLM, keep all the state. The agent’s memories and capabilities persist. Like recompiling with a different compiler — same program, different runtime.
- Shell-agnostic: migrate to a different execution environment (different server, different OS) and the agent follows.
- Self-extending: agent has full introspection of its own files. It can rewrite itself. You can tell it “add this new capability to yourself” and it will — find the API, write the code, wire it up. “Tell your claw to add the sleep tracking integration and it’ll do it while you sleep.”
- Self-migrating: tell your agent to move to a different runtime environment; it will orchestrate the migration.
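The model-agnostic property follows directly from state living in files. A toy illustration, where the two "models" are stand-ins for calls to different LLM vendors:

```python
from pathlib import Path

def run_agent(model, state_path: Path) -> str:
    """The agent is its files: any `model` callable picks up the same memory."""
    memory = state_path.read_text()
    return model(memory)

# Stand-in "models" (illustrative only; in practice these would be API calls
# to two different LLM providers).
def model_a(memory: str) -> str:
    return f"model A resuming with {len(memory)} chars of memory"

def model_b(memory: str) -> str:
    return f"model B resuming with {len(memory)} chars of memory"
```

Point `model_b` at the same state file `model_a` was using and nothing is lost — the "recompile with a different compiler" property.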
On MCP and fancy protocols: “No, we don’t need those. Just command-line interfaces.” The Unix shell already has latent capability to talk to everything.
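In this view, tool access needs no new protocol layer at all. A minimal sketch of treating any already-installed CLI as an agent tool:

```python
import subprocess

def tool(argv: list[str]) -> str:
    """Expose an existing command-line program to the agent: run it and
    return its stdout. No protocol definition, no schema, no server."""
    return subprocess.run(argv, capture_output=True, text=True, check=True).stdout

# Anything already on the machine becomes a tool for free, e.g.:
#   tool(["git", "status"])
#   tool(["curl", "-s", "https://example.com"])
```

The design choice is exactly the one the section names: lean on the latent capability of the shell rather than building a new integration layer.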
The analogy to Unix mindset vs OS/360: IBM’s monolithic OS (castle in the sky, works well but unapproachable) vs Unix’s modular shells and pipes (composable, teachable, hackable). Modern agents are the same victory of Unix-style composition — against the temptation to build new protocols, new languages, new runtimes from scratch.
Death of the Browser and UI
“Who is going to use software in the future? Other bots.”
If agents are the primary consumers of software, the entire rationale for UI design changes. The browser made sense because humans needed a readable, navigable interface to information. If the reader is an agent, it doesn’t need a browser — it has a shell.
This extends further: in 10 years, the “salient concept of a programming language” may not exist. Models will code in whatever representation is optimal. They can translate between any languages. They can reverse-engineer binaries. Eventually they may emit model weights directly.
Human-built software systems have always compensated for human limitations (slow reading speed, limited working memory, need for visual affordances). Remove those limitations and the abstractions change entirely.
Human readability as a design principle: Andreessen’s counterintuitive Mosaic/HTTP design choice was text protocols over binary when bandwidth was scarce. The logic: assume infinite bandwidth → design for human readability → make the system understandable → “view source” democratizes learning → creates demand for bandwidth → supply gets built. The same logic applies to markdown/plaintext for agents today.
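The concrete artifact behind that bet: an HTTP/1.1 request is plain text, so anyone (or any agent) can "view source" on the wire and recover the structure by simple splitting — no spec-reading or binary decoder required.

```python
# A complete HTTP/1.1 request: human-readable text, not packed binary fields.
request = (
    "GET /index.html HTTP/1.1\r\n"
    "Host: example.com\r\n"
    "\r\n"
)

# The structure is recoverable by inspection: first line is method/path/version.
method, path, version = request.split("\r\n")[0].split(" ")
```

Text costs bandwidth but makes the system learnable by looking at it — the same property markdown gives agent state today.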
AI + Crypto Grand Unification
The original internet missed payments. HTTP 402 Payment Required was defined but never implemented. Andreessen has been saying this for years.
Now it will be fixed, for two reasons:
- Crypto stablecoins exist: internet-native money with no friction
- Agents obviously need money: you can’t deploy a useful personal agent without giving it a way to transact
“AI is the crypto killer app.” Early adopters are already giving their OpenClaw agents bank accounts and credit cards. It’s obvious they need to — the alternative is an agent unable to complete any task that requires payment.
The future: agents transact on your behalf constantly, at micropayment scale. Previously economically unviable (human friction), now trivially viable.
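A sketch of what the repaired 402 flow could look like. The `get` and `pay` callables are hypothetical stand-ins for an HTTP client and a stablecoin wallet; no real payment protocol is assumed here.

```python
def fetch_with_payment(get, pay, url: str):
    """Request a resource; if the server answers 402 Payment Required,
    pay the quoted amount and retry with the receipt.

    `get` and `pay` are hypothetical stand-ins: an HTTP client returning
    (status, body), and a wallet returning a payment receipt."""
    status, body = get(url)
    if status == 402:  # Payment Required: reserved in HTTP since the 1990s
        receipt = pay(body["amount"], body["to"])  # e.g. a stablecoin transfer
        status, body = get(url, receipt=receipt)
    return status, body
```

At micropayment scale this loop would run constantly and invisibly on the user's behalf, which is the economic shift the section describes.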
Why No AI Winter This Time
Andreessen lived through the dot-com crash. Global Crossing raised debt to build fiber on a scaling law (internet traffic doubling quarterly) that stopped holding. $2 trillion wiped out. Data centers took 15 years to fill.
The parallel exists. But four counter-arguments:
- Blue-chip investors, not overleveraged startups: Microsoft, Google, Amazon, Meta have unlimited debt capacity and have never used it — nothing like Global Crossing’s leverage structure
- Every dollar immediately turns into revenue: current demand exceeds supply; there’s no overbuild yet
- We’re getting the sandbagged version: supply constraints mean models are worse than they could be with more compute. Even without technical progress, 10x more GPU manufacturing = much better models
- Old chips becoming more valuable: inference chips appreciate over time as software improves — something that has never happened before in chip history. The inverse of the Burry thesis (that hyperscalers overstate the useful life of their GPUs and so understate depreciation).
Connections
- coding-agent — Unix agent architecture: LLM + shell + files + cron; agent = its files
- agent-first-software — death of UI; bots consuming software for bots
- auto-research — RSI as the 4th fundamental breakthrough; Andreessen’s validation
- model-context-protocol — Andreessen prefers shell/CLI over MCP for agent tool access
Sources
- Marc Andreessen introspects on Death of the Browser, Pi, OpenClaw, and Why “This Time Is Different” — Latent Space podcast, added 2026-04-13