In César Hidalgo’s 2015 book, Why Information Grows, there’s a passage on how information is packaged into a product. To make the point, Hidalgo asks us to imagine a Bugatti Veyron. If you put the car on a scale, you’d find that it’s worth its weight in silver. Now imagine the NHTSA (if it were to crash-test $2 million cars) runs the Veyron into a wall at 70 miles per hour. What’s the value by weight now? Something far less. But the weight is the same, so where did the value go? It was in the arrangement of the atoms – as the value of all information is.
This occurred to me during a recent experiment with perplexity.ai, a “conversational search engine” powered in part by GPT-4. Ask it how fast human knowledge is growing, and it replies that “the volume of human knowledge is doubling every 12 months.” A follow-up query tells us that this “volume of human knowledge” includes the 4.4 billion minutes people spent on TikTok each day in 2023 and the 500 hours of content uploaded to YouTube every minute. One assumes this volume also includes, say, the Associated Press’s 2023 Pulitzer Prize-winning reporting on Mariupol, but the distinction seems lost. Either way, the point is that TikTok et al. are what information looks like when you ram it into a wall at 70 mph. Or put another way: there’s something on the order of 1.1 million terabyte/tons of mangled information being created every day, and that’s not good.
Part of the solution is increasing the volume of good information. In the next phase of Civil, we are working with emerging video-editing and generative AI tools; partnering with AI companies that are developing LLMs to understand and track mis- and disinformation; and identifying creators with whom we can share the internal processes and tools we’re building. The goal is to empower long-form creators (journalists, scientists, etc.) with the tools to easily create short-form videos as teasers for long-form work – to drive curiosity about our world and each other.
Recall the quip, often attributed to Mark Twain, that a lie can get halfway around the world before the truth gets its pants on. Our goal is to use AI to help give truth a shot.
The video above on misinformation and the Texas border is an example. It is based on a long Wired magazine piece that explored the role of Russian influence in the online discussion (if one can call it that) of the tension between Governor Abbott and President Biden over border security. By creating a short-form summary designed for social media, we can encourage curiosity about the subject and help drive viewers back to the long-form article for more information. Doing this at a scale of hundreds and eventually thousands of videos per day (from hundreds of high-information creators) can capture a meaningful audience.
We have not emailed these short-form videos (you can find them under our ‘Shorts’ tab) because we don’t want to overload your inboxes – and because they are designed to drive traffic from social media to Substack. But for those who may have missed them, we wanted to share an example. In the coming weeks, look for more updates on our editorial board and the longer-form content library we’re building. And, as always, our deepest thanks.