With the world generating more data every day, unlocking the full potential of AI demands ever faster and more resilient hardware.
In this episode – the second in our three-part series – we explore the challenges founders face when building AI companies. We dive into the gap between supply and demand, whether to own or rent hardware, where moats can be found, and where open source comes into play.
00:00 – Supply and demand
02:44 – Competition for AI hardware
04:32 – Who gets access to the supply available
06:16 – How to select which hardware to use
08:39 – Cloud versus bringing infrastructure in-house
12:43 – What role does open source play?
15:47 – Cheaper and decentralized compute
19:04 – Rebuilding the stack
20:29 – Upcoming episodes on cost of compute
The CFI Podcast discusses the most important ideas in technology with the people building it. Each episode aims to put listeners ahead of the curve, covering topics like AI, energy, genomics, space, and more.