Distributed computing frameworks like Hadoop and Spark have enabled the processing of “big data” sets, but that’s not enough for modeling rare “black swan” or complex events. Just think of scenarios in disaster planning (earthquakes, terrorist attacks, financial system collapse); biology (including disease); urban planning (cities, transportation, power grids); military defense; and other complex systems where unknown behaviors and properties can emerge.
By definition, such events can’t be modeled from limited data, and parallelizing these workloads is hard. But what if companies and governments *could* answer these seemingly impossible questions, through simulations? Especially ones that directly merge in knowledge and cues from the real world (sensors, sensors everywhere)? Herman Narula, CEO of Improbable, and Vijay Pande, Stanford University professor and professor-in-residence at CFI, discuss this and more with Chris Dixon in this episode of the CFI Podcast. And as Herman says, “the cool stuff only happens at scale”.
The CFI Podcast discusses the most important ideas in technology with the people building it. Each episode aims to put listeners ahead of the curve, covering topics like AI, energy, genomics, space, and more.