Open Future - Initial Content

Main Theme

AI is changing the laws that once governed computing.

Timeline of OPEN

  • 1950: Mainframes, UNIVAC, IC chip
  • 1960: Minicomputers, 12-bit PDP-8, DRAM, IBM antitrust lawsuit
  • 1970: Intel 4004, Unix, Ethernet
  • 1980: PC, Minitel
  • 1990: Browser, WWW, Linux, "Open Source", Mozilla, Red Hat IPO
  • 2000: Mobile, iPhone, Android, Cloud, PCIe
  • 2010: AI, RISC-V, Red Hat sells to IBM
  • 2020: GPT-3/ChatGPT, DeepSeek

Key Ideas

The Critical Juncture

AI is valuable enough to warrant massive investment. As Andrej Karpathy said, it's "Software 2.0". AI creates knowledge that we didn't have before and can navigate inconceivable amounts of data and complexity.

Companies are making choices today that will determine the future: closed systems that concentrate leverage, or open systems that are affordable, transparent, and modifiable.

The Risk of Concentration

If Bell's Law breaks fully, AI will be the first computing revolution that doesn't increase access, but instead concentrates it. This threatens to tip consolidation into full enclosure.

Open as Existential

"It is clear to us that open is existential."

If AI eats everything, then open versus closed is a referendum on the future shape of society as a whole.

RISC-V as the Hardware Equivalent of Linux

RISC-V launched in 2010 at UC Berkeley as a free, open standard alternative to proprietary architectures. It is seeing rapid adoption from companies like Google and Tenstorrent.

The AI Stack Analysis

Current state shows parts open, parts closed:

  • Hardware: CLOSED (black box, reliant on companies)
  • Low Level software: CLOSED (proprietary parallelization software)
  • Models: MIXED (leading ones closed, open ones limited)
  • Applications: CLOSED (even with open models, built using cloud APIs)

The Vision

Opening up AI hardware with open standards like RISC-V would trigger a domino effect upstream, enabling "a world where mainstream technology can be influenced, even revolutionized, out of left field."

Part 1: How We Got Here

Bell's Law and Computing Evolution

Until recently, Bell's Law gave us an accurate frame for understanding computing revolutions: roughly every decade, a new class of computing emerges, resulting in a fundamental shift in access.

We went from mainframes in the 1950s, to minicomputers in the 1960s, to supercomputers in the 1970s, to personal computers in the 1980s, to the World Wide Web in the 1990s, and to mobile in the 2000s.

Each of these revolutions made computers far more accessible, simultaneously driving performance up 10x while driving cost down 10x. In 1981, a fully loaded IBM PC cost $4,500. Today an iPhone, which is many millions of times faster, retails for $1,129. Through this process we got very good at building very powerful computers with very small chips.
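
A rough back-of-the-envelope check of that claim, using the figures above; the speedup multiplier below is an assumed stand-in for "many millions of times faster", not a benchmark:

```python
# Back-of-the-envelope price-performance comparison, using the figures above.
# The speedup multiplier is an assumption standing in for "many millions of
# times faster"; it is not a benchmark result.
ibm_pc_price = 4500        # USD, fully loaded IBM PC, 1981
iphone_price = 1129        # USD, current iPhone retail
speedup = 5_000_000        # assumed speedup multiplier

price_ratio = ibm_pc_price / iphone_price
gain = price_ratio * speedup

print(f"Nominal price dropped ~{price_ratio:.1f}x")
print(f"Price-performance improved ~{gain:.0e}x")
```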

The AI Revolution Challenge

However, prices aren't dropping with the advent of Artificial Intelligence. While the cost per math operation is going down, the actual cost of inference per token is still climbing, because models are getting larger (e.g. GPT-4.5), doing more work (e.g. "reasoning" models), and doing work that is more intensive (e.g. new image generation). AI datacenters are orders of magnitude more powerful than previous generations, with spending rising by tens of billions of dollars year-over-year.
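
A minimal sketch of that dynamic: even when the cost per operation falls 10x between hardware generations, total inference cost still rises if the work per token grows faster. All numbers below are illustrative assumptions, not measurements of any real model or chip.

```python
# Illustrative only: every number here is a made-up assumption chosen to show
# the dynamic, not a measurement of any real model or hardware.
def inference_cost(tokens, ops_per_token, dollars_per_op):
    """Total cost = tokens generated * operations per token * cost per op."""
    return tokens * ops_per_token * dollars_per_op

# Generation N: smaller model on current-generation silicon.
gen_n = inference_cost(tokens=1_000, ops_per_token=1e9, dollars_per_op=1e-12)

# Generation N+1: cost per operation falls 10x, but a larger "reasoning"
# model spends 100x more operations per token.
gen_n1 = inference_cost(tokens=1_000, ops_per_token=1e11, dollars_per_op=1e-13)

print(f"gen N cost:   ${gen_n:.2f}")
print(f"gen N+1 cost: ${gen_n1:.2f} ({gen_n1 / gen_n:.0f}x more)")
```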

Why is this computer class more expensive? AI is extremely physically intensive, requiring more silicon, more energy, and more resources. From shifting the physics of compute at the transistor level to building out the global infrastructure of AI datacenters, this revolution is pushing against the physical limits of human industry.

Part 2: A Closed World

The Historical Pattern

This isn't the first time we've been presented with a choice between a closed and an open future. In fact, we're living in a closed world today because of choices made for us 40+ years ago. Early minicomputer and PC culture was dominated by a hacker ethos defined by "access to computers... and the Hands-On Imperative". By the late 90s and early 00s, PC development had become dominated by Windows and Intel, which limited innovation while hamstringing competitors and partners alike.

Market Concentration Examples

Just look at Wintel's OEM partners, like Compaq, which struggled to hit 5% operating margins in the late 90s, according to SEC filings. Dell, during the same period, revolutionized supply chains and typically enjoyed margins around 10%. Compare this to Microsoft and Intel, which often tripled or quadrupled those figures over the same stretch, with Microsoft hitting 50.2% margins in 1999. Some have jokingly called these "drug dealer margins". In 2001, Windows had >90% market share; almost 25 years later, it still has >70%.

How Closed Worlds Form

How do closed worlds form? One word: swamps. A swamp is a moat gone stagnant from incumbents who have forgotten how to innovate.

There are many ways incumbents produce a swamp. They can protect a product by overcomplicating it, adding unnecessary proprietary systems and layers of abstraction. They can charge rents in the form of license fees. They can pile on just enough features to justify an upgrade, while staying disconnected from what customers actually need. And if they want to get really clever, they can offer something "for free" as an inseparable part of a bundled service in order to lock out competition.

However it happens, what started as innovation becomes an extra tax on the product, erecting monopolies instead of creating real value. These companies become incentivized to preserve the status quo rather than to change.

Part 3: An Open World

The Power of Open Source

Open source has a way of infiltrating crucial computing applications. The internet runs on it. The entire AI research stack uses open source frameworks. Even proprietary tech relies on it: 90% of Fortune 500 companies use open source software. There would be no macOS without BSD Unix, no Azure without Linux, and no Netflix without FFmpeg.

Open Standards and Mass Adoption

Open source and its hardware equivalent, open standards, have repeatedly catalyzed mass adoption by reducing friction and enabling interoperability. Robert Metcalfe says the openness of Ethernet allowed it to beat rival standards. DRAM enabled the mass adoption of PCs with high-capacity, low-cost memory, while PCIe enabled high-speed interoperability of PC components. Similarly, Open Compute Project specs, used by Meta and Microsoft among others, standardized rack and server design so components could be modular and vendor-agnostic.

RISC-V: The Hardware Equivalent of Linux

RISC-V is the hardware equivalent of Linux. It launched in 2010 at UC Berkeley as a free, open standard alternative to proprietary architectures like Intel's x86 and ARM. It is royalty-free, and its open nature allows it to be deeply customized, making it especially desirable for AI and edge computing applications. The RISC-V ISA is seeing rapid adoption, with companies from Google to us at Tenstorrent adopting it for custom silicon.
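
One concrete way that customization shows up: the RISC-V spec permanently reserves "custom" major opcodes for vendor extensions, so a designer can add an AI-specific instruction without colliding with the standard ISA. Below is a minimal Python sketch packing a hypothetical instruction into the custom-0 slot; the "vdot" mnemonic and the funct values are invented for illustration.

```python
# Encode a hypothetical R-type instruction into RISC-V's custom-0 opcode space.
# custom-0 (0b0001011) is reserved by the RISC-V spec for vendor extensions;
# the funct3/funct7 values and the "vdot" mnemonic are invented here.
CUSTOM_0 = 0b0001011

def encode_rtype(funct7, rs2, rs1, funct3, rd, opcode):
    """Pack R-type fields into a 32-bit RISC-V instruction word."""
    return (funct7 << 25) | (rs2 << 20) | (rs1 << 15) \
         | (funct3 << 12) | (rd << 7) | opcode

# Hypothetical vendor instruction: vdot x10, x11, x12
word = encode_rtype(funct7=0b0000001, rs2=12, rs1=11,
                    funct3=0b000, rd=10, opcode=CUSTOM_0)
print(f"0x{word:08x}")  # a stock decoder simply sees an unused custom-0 slot
```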

Global Talent Pool

Open systems also attract a global talent pool. Linux itself is the shining example of this, constructed by thousands of engineers, with significant contributions coming both from independent outsiders and employees of major players like Intel and Google.

We believe open is the default state: what remains when artificial boundaries fall away. The only question is how long those boundaries hold, and how much progress is delayed in the meantime.

The AI Stack Analysis

Current State (Closed Today)

  • Hardware: CLOSED - Most hardware today is a black box, literally. You're reliant on a company to fix, optimize, and, at times, even implement your workloads.
  • Low Level software: CLOSED - Most parallelization software is proprietary, causing unnecessary lock-in and massive switching costs.
  • Models: MIXED - Models are mixed, but most of the leading ones are closed. The models that are open share limited data, come with little to no support, and make no promise of staying open in the future.
  • Applications: CLOSED - Even if an application uses an open source model, most are built on cloud platform APIs. This means your data is being pooled to train the next generation of models.

The Vision (Open Future)

Opening up AI hardware and its associated software, with open standards like RISC-V, would trigger a domino effect upstream. It would enable "a world where mainstream technology can be influenced, even revolutionized, out of left field." This means a richer future with more experimentation and more breakthroughs we can barely imagine today, like personalized cancer vaccines, natural disaster prediction, and abundant energy.

The Stakes

There's an old Silicon Valley adage: if you aren't paying, you are the product. In AI, we've been paying steeply for the product, yet we are still the product. We have collectively generated the information being used to train AI, and we feed it more every day.

In a closed world, AI owns everything, and that AI is owned by a few. Opening up hardware and software means a future where AI doesn't own you.