Nexus, Laura Marling, Process informing substance
- Matt Carona

- Apr 21
- 9 min read
When I started writing these entries, I was explicit about keeping the bar low. My first sentence: This is a low stakes beginning. It was telling, unintentionally, because “low-stakes” should have been hyphenated. But, importantly, I did not go back and correct it. On the surface, lowering the stakes can seem like settling, or worse, accepting slop — but I don’t actually think it’s antithetical to the virtues of improvement or growth. Rather, I often find removing the pressure to be necessary for embarking on any meaningful pursuit, particularly for those who, like me, are all too familiar with barriers of doubt.
The innovation team at Google (known as Google Labs) often says they’re trying to build products that “lower the floor and raise the ceiling”. This is a pithy, catchy way of explaining an intention to increase access to technological capabilities while also expanding the boundaries of what’s possible. And I’d like to (shamelessly) co-opt this dichotomy-busting tagline: it’s often the case, for me at least, that I need to lower the floor enough to just step into the damn room. Only then can I even begin to walk around, knock on some walls, open some windows, and then maybe think: hey, what else can I do in here?
Sometimes you gotta let yourself off the hook to just get moving. And then (and this is the best part) once you’ve got some momentum, be delightfully, humbly surprised about where it all takes you. Whatever your thing is, maybe try lowering the bar so that you can raise the stakes.
Nexus
Now, a somewhat forced segue. I recently finished Yuval Noah Harari’s (audio)book Nexus: A Brief History of Information Networks from the Stone Age to AI and have been wanting to reflect on a few things, but I’ve been somewhat paralyzed by where to even begin summarizing it. So this is not a recap, just a few things I found interesting.
In typical Yuval fashion, the book is a comprehensive, accessible, and insightful historical overview that provides much-needed context for the situation we find ourselves in. He covers the history of information systems and the direct implications these networks have for our societal structures and political capabilities. It’s a pertinent read given the ongoing, escalating AI fervor. The hype around AI is as frothy as can be — predictions of an “AGI” moment continue to fuel utopian and doomsday scenarios, with timelines that vary from 2027 to 30 years away to it apparently having arrived last Tuesday. But regardless of which forecast you believe, this technology is undoubtedly having, and going to continue to have, profound impacts.
This can all feel overwhelming, but Harari’s book serves as a critical reminder that technology (like the future) is not deterministic — and we must avoid any thinking that robs us of our agency or absolves us of our responsibility.
Yuval critiques the “naive view of information”, which often stems from Silicon Valley and holds that more information is always good, because more information leads to more truth.
Clearly, we’re all living witnesses that truth is not thriving in this always-on information age. The naive view of information ignorantly overlooks (or maliciously ignores) the fact that a lot of information is wrong, harmful, and deceitful — and that “information technology” has been leveraged throughout history for nefarious purposes.
That’s not to say more information is necessarily bad. Distributed information systems were foundational to the development of democracy at scale.
Prior to development of modern information technology, there are no examples of large scale democracy anywhere. Mass media makes mass democracy possible.
But what’s critical is that information networks contain self-correcting mechanisms that uphold democratic pursuits of truth. Without these, information systems can be taken over by authoritarian regimes that seize control and then claim to be infallible.
Whereas democracy has overlapping self-correcting mechanisms to keep each other in check, totalitarianism has overlapping surveillance mechanisms to keep each other in order.
The founding fathers’ greatest legacy is that they provided their descendants the self-correcting mechanisms to fix their mistakes.
To address the elephant in the room, we’re currently witnessing unprecedented threats to these very self-correcting mechanisms in the US. Not that we need another reminder to defend our democratic principles, but there’s sure a lot on the line right now — particularly as our information ecosystem will fundamentally evolve with the rise of AI-generated information.
Information used to have to pass through humans. But now computer-to-computer systems communicate within seconds without any human oversight. For the first time there’s a revolution in the membership of our society - computers are now members alongside sapiens… For the first time ever, democracy must deal with a cacophony of non-human voices.
Gulp.
One of my concerns is that dwindling self-correcting mechanisms will further erode public trust in institutions. As society gets more complex, we increasingly rely on systems that we will inevitably not be able to fully understand. What percentage of people could explain the inner workings of how money moves across bank accounts (not the routing numbers, but the actual code and security protocols that underpin a transfer)? I know I can’t. I just say I want $100 to go here, and in 2-3 days it’s available. Now if you extrapolate to all the other complex systems that underpin our lives (e.g. public health, electricity, global supply chains), you’d break your brain trying to understand it all. So we rely on select institutional experts to know the details and accept levels of abstraction for the rest of us. But if we don’t trust said experts, or the self-correcting mechanisms of the institutions that contain them, this opens up a slippery slope of interrogation known as “do your own research”. And we know where that gets us.
Yuval captures this concern.
The increasing unfathomability of our information network is one of the reasons for recent waves of populist parties and charismatic leaders. When people feel overwhelmed and don’t understand, they become easy prey to conspiracy theories and turn for salvation to something they do understand: a human.
To be clear, I’m still hopeful for democracy. I just think it’s going to take diligence, creativity, and political will to push forth democratic principles in a complex time of radical technological change.
I’d also like to believe in the power of the truth — it eventually comes back to haunt us all.
There’s a story in Nexus about Stalin’s death: apparently in his later days Stalin had been increasingly raging about conspiracies among doctors, who were predominantly Jewish, claiming they were killing babies in hospitals. This hysteria led him to avoid medical advice and, most ironically, resulted in his staff not calling a doctor to help when he collapsed from a stroke.
An appropriate endorsement of Schadenfreude.
Laura Marling
Well that last section was longer and more grim than intended. Let’s take a breath.
I was mindlessly scrolling Instagram one evening, doing an excellent job judging myself, and then, mid-bite of my mid-week lukewarm leftovers, I stumbled upon a post from Laura Marling about a songwriting course she’d be teaching in March. I immediately signed up. I felt like I’d struck gold snagging a spot in a class where I could interact with a songwriter I’ve admired for years. In reality, it was 500+ people on Zoom. But after laughing at the ridiculousness of my initial expectations, I quickly found the class to be a tremendous experience. It reinvigorated my songwriting, providing pragmatic creative advice and liberating the writing process by encouraging us to trust what arises. Laura has a way of making songwriting feel sacred — defending it as high art — while also keeping it approachable.
Many people expressed how the course helped validate their own creative process, normalizing things they may have initially felt odd or insecure about (as if they weren’t doing it “right”). For me, this moment came when hearing Laura talk about how she generally finds a song musically first (what she called the “spine of a song”) and then spends time experimenting in that sonic realm to see what lyrics arise. This is often how I write. It can feel more like working on a painting, or a collage, rather than starting with some definitive narrative or story from the outset. Creating anything requires a certain trust in your own instincts. And that trust only gets strengthened when you learn that someone you admire mirrors your own process.
A few takeaways from the class that I found particularly helpful and insightful:
- Make time to be an observer. Read, go to galleries, experience art. Protect this observing as part of your creative practice.
- There are periods of breathing in, periods of breathing out, and just as importantly, periods of doing nothing. You don’t need to be militaristic about writing every day. Be flexible in accepting the different rhythms of life: times of taking in inspiration, times of creating, times of turning off your brain. Laura had a great story about spending two entire weeks solely playing Zelda after finishing a recent album. At the end of a breath, you must trust that the next thing will come to you.
- Narrativize the mundane. Refer to real and specific moments, steal actual conversations for lyrics. “Real life is where the stranger-than-fiction things happen, so they say”.
- Accept dissonance and tension - this is where the good creative energy comes from. Trying to grasp an answer too tightly is limiting, restricting.
- Explore your “philosophical knots” - the things you continue to be confronted with, the sticky questions you keep observing and reflecting on. Follow this path of curiosity.
- (Laura’s pushback against the belief that making art must be all-consuming) Living is the priority, art is the consequence. Songwriting, to her, is the opportunity to “report from the field of the living”. Songwriting is only one of the pillars of her existence.
The course was taught through School of Song, which I can’t recommend enough — they’re cultivating a unique caliber of instructors (particularly reputable indie songwriters & producers). I’ve already signed up for an upcoming guitar fingerpicking class.
Process informing substance
Connecting the book Nexus to a songwriting course by Laura Marling feels nearly impossible. But one thread is the intersection of AI and creativity. There’s much debate about whether AI will benefit or worsen the creative potential of humans — and I think the answer, like so many things, is: it depends.
While I’m all for equipping people with better tools to express themselves, I am worried about what happens when we remove too much “friction” from the process of making something. For example: staring at a blank page is painful, but it forces you to think and work things out through writing. It’s of course much easier to ask AI to make an initial draft, but then you might never arrive at the special place your mind could have taken you if you’d given it a chance. (For this exact reason I never use AI in this writing practice — but ya, I could probably use an editor.)
On this note, it’s worth looking at a point of view recently expressed by Cristóbal Valenzuela, CEO of the AI video creation company Runway.
Art is a translation problem. … This requires two things: a) an interesting point of view to communicate — an insight, feeling, or understanding that adds something meaningful to our collective experience, and b) the technical ability (craft) to shape that insight into a form others can access. That's art: a + b. Craft focuses on technical execution. Art uses craft as a vehicle for translating a message. When the medium lacks a point of view, it matters less. Consider the distinctions: poetry versus sentences, film versus video, painting versus pictures, dance versus movement. The difference lies in the presence of a clear point of view — a perspective, something worth translating for others to understand. Generative models and their impact on art making is that they can radically help with this translation problem. A new way to bridge the gap between vision and execution. AI can't help with how you see and experience the world. But it can aid you on translating it.
There’s much I likely agree with here and I respect what Runway is building. But what this perspective misses is that often it’s through craft (b) that a point of view (a) gets discovered. It’s not always simply that you have a vision and go execute it, but rather that the creative process itself helps to develop and inform your point of view. And so if craft becomes too seamless, or just viewed as a technical variable in some creative production function, I fear we’d begin to lose something meaningful in the inevitable messiness of human creativity.
A closing quote
Uncertainty is wisdom in motion - Maggie Jackson
Parting songs
The songwriting class involved “song share” breakouts with a smaller group of 3-4 participants. I was always impressed by the quality of songs and the musicianship of fellow participants. Isabel Shaye stood out as a unique talent - and I learned she’ll be releasing an album soon. So this is a strong recommendation to follow along and check out her music. I’ve had her recent release I Found a Place on repeat.
