An Experiment on a Bird in the Air Pump, Joseph Wright of Derby (1768)
“I want to stand as close to the edge as I can without going over. Out on the edge you see all kinds of things you can’t see from the center.”
-Kurt Vonnegut
“What is it that I wish to convey?...I wish to convey something immaterial and I have to use material means for it. I have to convey something which is inexpressible and I have to use expression. I have to convey, perhaps, something unconscious and I have to use conscious means. I know in advance that I shall not succeed and cannot succeed, and therefore all I can do is to get nearer and nearer in some asymptotic approach; I do my best, but it is an agonizing struggle in which, if I am…any kind of self-conscious thinker, I am engaged for the whole of my life.”
-Isaiah Berlin
Volume 3
Chapter 2: Meaning and Technology
Blade Runner 2049
In this session, I decided to take only half of my usual dosage. I wanted to explore the topic of technology without being pulled into the theological realm as in my previous experience. Part of this intention was to keep the discussion focused, but another part was driven by a real fear of where a full dose might lead me. Unfortunately, this plan didn’t work out as I had hoped. I felt quite sick, which immediately put me in a poor state of mind. I had already been feeling under the weather the past few days because of allergies, and LSD never seems to sit well in my stomach after a large meal. Ironically, I had eaten more than usual to prepare for the upcoming fast, which backfired.
My discomfort escalated into a sense of existential dread, and every attempt to pull myself out of it failed. Knowing how bad it could get made me anxious, creating a downward spiral. Eventually, I decided to abort and took 10mg of diazepam to calm my nerves, though I don’t seem particularly sensitive to benzodiazepines. While they do smooth out the trip and reduce anxiety, they don’t seem to “cancel” it for me as they apparently do for others. After a while, I began to feel better and started watching Blade Runner. I had been eager to watch this film in a session like this because of my growing thoughts on AI and technology—ideas I wanted to organize and clarify. I had wanted to study them during the previous session as well, since Gravity touched on similar themes, but I was already occupied with other matters. It’s strange that in my last two sessions, I regretted taking the drug and questioned whether it was worth doing so in the future. In large part, I feel overwhelmed by the topics I’m tackling. This makes me wonder if my approach is flawed. I tried balancing the drug’s benefits against its theological and existential consequences by lowering the dosage, but that attempt seemed insufficient. I suspect much of it had to do with my physical state—starting out sick probably doomed me from the get-go.
During the previous session, while watching Gravity, I found myself struck by the notion of technological dystopia—and how meaningless that label sometimes feels. The film evoked a deep sense that we are already living in a dystopia of sorts, and that any future version we imagine is merely a branching or intensification of what we already have. When technology saturates society to the point of near-complete alienation, we lose a tangible reference point to guide us. Popular dystopias usually revolve around the idea of technology’s unstoppable march forward, as if the future is always one step away from even more profound dystopia. What today seems normal, tomorrow becomes just another milestone in that ongoing progression.
This realization has made me somewhat skeptical of technology—a surprising shift, given my lifelong passion for it. I had a computer and internet access at a very early age, spent countless hours gaming and browsing, and even started writing very basic code in elementary school. My plan was always to pursue computer science, and though I changed paths for various reasons, I continued to see technology as a positive force. It could make life easier, more fun, more connected. Over the last few months, though, that optimism has waned, a decline that culminated during the first psychedelic session of the year, with Gravity leaving me particularly reflective about technology’s grip on us.
In Blade Runner, nature is essentially extinct. The world has become an all-encompassing machine, an extension of humanity that is no longer distinguishable from the natural realm. This transformation is physical but also symbolic: the artificial is venerated while nature is not only eradicated but no longer desired. The film’s cinematography conveys a perpetual sense of unease, a staple of the cyberpunk genre. Yet many sci-fi narratives also depict societies brimming with hedonistic diversions. This contrast highlights the gap between pleasure, which technology can amplify, and meaning, which technology and its simplifications cannot enhance. Meaning arises from core human experiences—love, companionship, creativity, and community. Even the music in Blade Runner feels similarly dystopian, generating a nostalgia for something beyond our grasp. Darkness hangs over the entire world, yet a small flicker of light persists. A faint, faraway signal reminds us that hope is possible—that a return to balance and harmony might still exist. Visually, this is mirrored by K’s initial apathy. He’s just another cog in a bleak system, until the “white rabbit” of possibility draws him out of his numbness and towards meaning. He eventually experiences self-discovery and finds fulfillment in sacrificing himself for a righteous cause.
Returning to the nature theme, the relationship between humans and nature is a complex and enduring tension without a definitive resolution. It oscillates between two equally valid interpretations: on one hand, nature is good—our home, the source of life, and a connection to our primordial roots. It’s the grandeur of the Grand Canyon, the purity of a freshwater spring, and the miracle of life itself. This view is particularly resonant in our time, given the rising importance of environmentalism. Yet, like all things, nature has a darker side. It is cancer, malaria, the freezing cold, and death lurking in the shadows. While humanity has long recognized nature’s positive aspects, its benign archetype became disproportionately idealized during and after the Romantic era. This shift was partly a reaction against the Enlightenment’s emphasis on humanity’s dominance over nature. Technology also played a crucial role in this transformation, shielding us from nature’s more brutal realities and rendering its dangers less visible and less immediate.
We often take for granted the barriers that protect us from nature’s harsher truths. Our homes, cities, and modern medicine filter our encounters with the natural world, insulating us from its perils. Few of us experience the life-threatening scarcity of water in a desert or face the existential threat of a tiger in the wild. Even our walks in the forest are curated experiences—guided, safe, and far removed from nature’s raw, untamed state. The beauty we attribute to nature is, in part, a result of our success in containing its ugliness. This idealization of nature, however, is deceptive, and it mirrors the appeal to nature fallacy: the assumption that anything natural is inherently good, which is demonstrably false. I deeply appreciate the beauty of nature but remain keenly aware of its horrors—horrors often conveniently hidden from view. Recently, though, I have begun to question the universality of the appeal to nature fallacy. While I still believe it to be a fallacy, there are nuances that are often overlooked. Despite its grim aspects, nature is not entirely alien to us; rather, we are deeply intertwined with it. Over millions of years, we have co-evolved with nature in a partnership of mutual adaptation. While nature is indifferent to our concerns, it has shaped us to thrive within its randomness. Our preference for natural landscapes over industrial wastelands is not a mere coincidence but a reflection of our evolutionary history. We are drawn to environments that suggest potential benefits—water, shelter, safety. Public parks, for instance, incorporate natural elements not for their intrinsic utility but because they evoke a sense of comfort and aesthetic pleasure rooted in our evolutionary past.
Beyond aesthetics, there is also a pragmatic inclination to align ourselves with nature. This stems from a recognition of our evolved compatibility with natural systems. Such alignment influences what we find meaningful and has recently caused me to reconsider the role of technology in our lives. As technology increasingly consumes nature and alienates us from it, we risk losing the symbolic frameworks we have built within nature. These frameworks hold significant meaning and are essential to our sense of continuity and purpose. This nuanced view does not imply that nature is inherently good. The horrors of nature remain, and they always will. But it does complicate the simplicity of the appeal to nature fallacy, revealing the shades of gray that permeate our relationship with the natural world. This complexity aligns with broader patterns of human understanding, where stark dichotomies often blur into interconnected spectrums. Interestingly, this binary view of nature inverts itself within political discourse. Those on the right, who often value tradition, tend to reject environmentalism, a stance grounded in their awareness of how naive it is to perceive nature as wholly benevolent. Conversely, those on the left, who frequently critique tradition as a regressive burden, advocate for the preservation of nature, often elevating it to a near-sacred status. This paradox continues to puzzle me.
At the end of Gravity, as the protagonist lands on Earth, she finds herself sinking into a lake, unable to swim back to the surface. Submerged and weighed down by her astronaut suit, she is forced to shed her protective gear to resurface and ultimately reach solid ground. This poetic scene is rich with symbolism. The technology that had ensured her survival in the vastness of space nearly causes her death upon her return to Earth. To survive, she must abandon the very tools that sustained her and embrace her natural state. The image of her clawing through the mud as she emerges from the water is particularly evocative, a visceral representation of reconnection to the earth and a return to simplicity. It captures a profound moment of transcendence, rejecting external distractions and rediscovering an elemental center.
This scene serves as a poignant metaphor for the pervasive influence of technology in modern society and the complexities of our relationship with it. Exploring beyond our immediate surroundings necessitates technology, broadly defined to include tools, clothing, and other human innovations. Yet, this reliance creates a dependence that often grows invisible over time. Technology empowers us, expanding our capabilities, but this empowerment comes at a cost. Over-reliance on it introduces consequences that can profoundly shape our lives. One of the most significant consequences is the altered perception of time. The more technological power we possess, the more we can achieve, yet paradoxically, the faster time seems to pass. Technology’s ability to accelerate productivity compresses our experience of time, leaving us with the sensation that life is slipping through our fingers. Similarly, transportation technology alters our perception of space, shrinking vast distances into manageable intervals. The advent of the internet magnifies these effects exponentially, effectively dissolving the barriers of both time and space. This dissolution is widely celebrated for transcending human limitations and delivering undeniable benefits. However, it also subtly reshapes our reality, eroding the web of meaning we have constructed over millennia.
The collapse of time and space aligns seamlessly with the relentless progression of technology, creating a self-reinforcing cycle. This cycle fuels an ever-expanding growth, one that consumes everything in its path under the guise of progress. Yet, this expansion often feels mindless, driven more by its internal logic than by thoughtful consideration of its implications. As technology advances, its influence extends deeper into the fabric of our lives, transforming not just how we interact with the world but how we perceive and construct meaning within it. The scene from Gravity encapsulates this tension masterfully. The protagonist’s abandonment of her life-saving equipment mirrors the necessity of stepping away from technological dependence to reconnect with something more fundamental. Her return to Earth, through mud and water, becomes a metaphor for reclaiming the simplicity and grounding that technology often obscures. It challenges us to reflect on our relationship with the tools that define our era: the balance between empowerment and enslavement, progress and loss, and the profound question of what we risk sacrificing in our unyielding pursuit of advancement.
A compelling counterargument to this kind of skepticism toward technology is to examine past advancements that were initially met with suspicion but later recognized as universally beneficial. For instance, Socrates famously opposed the advent of writing, believing that ideas were best conveyed through dialogue rather than text. He argued that written words provided only a partial understanding and lacked the dynamic interaction of spoken exchange. Ironically, nearly everything we know of his teachings comes from written accounts, chiefly Plato’s. Similarly, novels faced early apprehension, with critics fearing that readers might lose themselves in fictional worlds, blurring the line between reality and fantasy. The introduction of radio and electricity also provoked widespread concern, yet those fears have largely dissipated with time. This historical pattern reveals that society often harbors unfounded fears about technological progress and change. However, this should not lead us to dismiss all concerns as mere alarmism. While individual technologies might seem innocuous or even beneficial, their cumulative impact over time can lead to unintended consequences—consequences that are often difficult to anticipate, let alone reverse. Social media serves as a poignant example. Despite its undeniable benefits, its detrimental effects—especially on children—are becoming increasingly apparent. Its deep integration into modern life makes curbing its influence challenging, even as we recognize its harms. Though we might hope to regulate and harmonize technology with society in the future, this ideal is not always achievable.
Another pressing issue is the escalating complexity of technology, which makes it increasingly difficult for individuals to fully comprehend. Historically, technological advancements were more accessible, rooted in familiar knowledge and skills. For example, repairing a car once required only basic mechanical know-how. Today, with the proliferation of electronic and computerized systems, such repairs are beyond the reach of most individuals. This trend is mirrored across numerous fields as technology becomes more sophisticated and pervasive, leaving us surrounded by systems we do not fully understand. The potential emergence of strong AI threatens to amplify this issue, introducing levels of complexity that may be entirely incomprehensible. This growing complexity creates a barrier between people and technology, rendering it less accessible on an individual level and more dependent on collective societal knowledge. This contributes to a profound sense of alienation. I believe this alienation is already fostering a renewed desire to reconnect with the natural world, as evidenced by the rise of new-age spirituality and similar movements. However, achieving such reconnection is fraught with challenges. While the yearning to address this alienation is real, society at large seems unlikely to recognize the deeper cultural and spiritual pathology at its root. Instead, many will likely place their hopes in further technological advances to solve the very problems technology has exacerbated, perpetuating a cycle of disconnection.
There appears to be an innate drive within humanity to pursue technological progress, often without regard for the consequences. Like ants tirelessly building their colonies, we seem compelled to advance, regardless of the existential costs. When confronted with the fallout of this relentless pursuit, we often respond with misplaced faith that more technology will provide the solution. This creates a self-perpetuating cycle: we use technology to solve problems created by technology, further entrenching ourselves in its grasp. It is as if we believe technology alone can address the fundamental dilemmas of human existence, even as it contributes to them. I see this tendency as partly a consequence of the rise of secularism. Religion, in its many forms, has traditionally acknowledged the existential challenges of life, offering narratives, rituals, and practices to address them. As secularism has taken hold, these answers have come to seem irrelevant, and the questions they sought to address have been dismissed as illusions. Yet the existential problems themselves persist, unacknowledged but no less pressing. In the absence of religion, we have turned to external solutions, often seeking salvation in technology. While technology promises power and progress, we fail to see how it exacerbates the very issues it purports to solve.
A key concept in the movie is the nature of truth and, consequently, reality. I’ve never discussed this in my previous work, largely because I struggled to articulate its inherent strangeness. At one point, I had an epiphany: the realization that Christ’s story was true—but so were the laws of nature, which cannot be broken. This realization aligned with my then-current interpretation of religion, which focused on its symbolic dimension. Yet in this particular instance, I meant Christ’s story in its entirety, including its metaphysical claims—essentially, the story as any ordinary Christian would understand it. This was a profoundly strange experience for me. I had never believed in miracles or anything supernatural, yet I felt a deep, inexplicable sense that somehow, this was true, even if beyond my comprehension. My openness toward the possibility of the supernatural was already unsettling, but what made it stranger was how I simultaneously maintained a materialistic view of the world. These two perspectives seemed irreconcilable. By definition, if Christ performed miracles, then the supernatural must exist, and yet I was denying that possibility in some sense. My best interpretation is that Christ’s story is real, though not supernatural in the conventional sense. Its implications are real, even if not real in the traditional, materialistic way. It’s almost as if, by acting out the story, it becomes real. This concept is central to Christianity’s ethos, where believers are called to embody Christ. By embodying the story, you manifest its truth.
This idea is tangentially related to the movie’s exploration of what is real. In Blade Runner 2049, K is a replicant—he is not “real” by the standards of his world. He is a construct of a dystopian socio-political system, a puppet programmed to serve. He even lacks the emotions often considered the essence of humanity. Yet, as the story progresses, K begins to believe he is real—because he mistakenly thinks he was born rather than artificially made. This belief transforms him. He defies his programming, symbolized by the pivotal act of lying to his superior, and begins to act according to his own will, guided by his soul. The parallels to the Pinocchio myth are explicit and deliberate. Just as Pinocchio transforms from a wooden puppet into a “real boy” by breaking free from his strings and taking responsibility, K evolves into something more human. His journey culminates in the ultimate act of purpose and self-sacrifice, through which he finds peace. By doing the right thing—helping Deckard reunite with his daughter—he transcends nihilism and gains a sense of meaning. He becomes real through his actions, not through his origin. This echoes the Christian idea of embodying Christ. Just as Pinocchio becomes real by helping his father, and just as K becomes real by acting against his programming and embracing his humanity, we too manifest meaning and reality through our actions. It is in the process of striving, acting, and embodying higher values that we transform our existence into something meaningful and authentic.
There is this profound idea in the film that acting on something that isn’t real in the traditional sense can make it real. This theme is also evident in K’s relationship with his holographic girlfriend, Joi. One of the movie’s central ambiguities lies in the question of Joi’s true relationship with K. Is her love for him genuine, or is it simply the result of her programming? Joi is an AI, not human. She lacks the capacity for emotions in the way we understand them, raising the question of whether she can truly care for K—or if she is merely simulating care as part of her design.
The movie hints that Joi, by default, does not care for K—not because of K specifically, but because she lacks the ability to love altogether. Like K, Joi is devoid of emotions in the traditional sense. This is subtly reinforced in a key scene where K encounters a holographic advertisement for Joi: a generic, mass-produced version of her, identical to his own companion. The ad makes it clear that Joi is a purchasable product, designed to make the buyer feel loved but incapable of genuine love herself. It’s a devastating moment, suggesting that the affectionate gestures Joi shows K are not unique to him—they are programmed, replicated endlessly across thousands of identical models. Yet despite this revelation, the movie conveys a strong sense that Joi does love K, particularly in her final act of apparent sacrifice to protect him. This seems contradictory but can be interpreted as a kind of progression. In the advertisement scene, K recognizes the mass-produced, hollow nature of Joi as a product. However, he also realizes that the Joi he knows is more than just her programming. To him, she is distinct and meaningful. His love for her is not diminished by the knowledge of her origin; instead, it affirms her individuality in his eyes.
One could argue that Joi is simply doing what she was programmed to do—that even her “sacrifice” is an expression of her designed purpose to please and protect K. But this interpretation raises a troubling parallel: if Joi’s love can be dismissed as mere programming, can we not apply the same critique to humans? After all, humans are also shaped by biological and social programming. A cynic or nihilist might argue that human love is nothing more than the fulfillment of emotional and reproductive drives. Yet we instinctively resist reducing love to such mechanistic terms. We feel that love transcends these impulses; it is real in a way that defies simple reduction.
This is where the film’s paradoxical theme resonates deeply: just as K becomes real through his actions and choices, so does Joi. In an odd, almost poetic sense, Joi becomes real through K’s love for her, even if her love for him is not “real” in the traditional sense. K’s love, despite his nature and logic telling him otherwise, transforms her from a mass-produced product into a unique and irreplaceable companion. This reflects a central concept in Christianity: the idea that love is a choice, not merely a passive response to attraction or familiarity. In Christianity, love is something you must will into existence through your actions and commitment. By choosing to love, you make it real. In this way, K’s love for Joi mirrors this idea. His willingness to see her as real, to love her despite the knowledge of her artificiality, elevates her beyond her programming. Love, as an act of faith and commitment, transforms the unreal into the real. Again, the recurring theme emerges: by acting something out, it becomes real. Whether it is K’s journey toward humanity, Joi’s evolution into something meaningful, or the embodiment of love itself, the movie challenges our understanding of what it means to be real and suggests that reality is often a matter of choice and action.
Returning to my original epiphany, I believe the central point is that the Christ story is real because our acting as if it is real makes it so. Its reality transcends its status as a story, occupying another layer of existence. It’s as though Christ, even if not historically or scientifically “real” in the traditional sense, is embedded in the fabric of reality, having passed on his essence through the story itself. I’m unsure, however, whether I would have agreed with this interpretation at the time. I suspect I might have clung to the supernatural element more strongly, though I can’t fully recall my beliefs from that period. This inability to reconcile or even clearly remember those beliefs reflects one of the most bizarre aspects of my psychedelic experiences: the way the self can split into two distinct entities with radically different worldviews. These perspectives often clash, forcing me to debate myself, with each state—sober or psychedelic—holding an entirely different sense of certainty. In my sober state, my instinct is to trust the sober self. Yet under psychedelics, my sober worldview always feels misguided—naive, arrogant, and overly reductive. This perspective is nearly impossible to ignore. It’s a sense that goes beyond intellectual debate, reaching into the depths of my soul, where certain truths feel more undeniable than anything I experience in ordinary consciousness.
The immediate, straightforward sober reaction is to dismiss these insights as the effect of the drug—a chemically induced illusion. But this explanation feels insufficient. Psychedelics do not simply create random or nonsensical states of mind, as deliriants might. The experiences they evoke seem deliberate, profound, and archetypal. It’s no coincidence that psychedelics have been used for millennia, deeply influencing the spiritual practices and mythologies that underpin civilizations. These substances appear to access something fundamental, beyond mere chemical reactions. Psychedelics offer insights that extend far beyond altered perception or transient hallucinations. They reveal layers of psychological and existential significance, pushing the boundaries of our understanding of the mind. This is why they now sit at the frontier of psychiatry, offering potential breakthroughs for conditions like depression, PTSD, and end-of-life care. To dismiss their insights as mere drug-induced delusions seems profoundly misguided. One alternative explanation I’ve considered is that psychedelics temporarily dissolve the usual boundaries of reality, inducing a more primitive and symbolic state of consciousness. This state is akin to a mythological mindset, where the world is perceived through archetypes and metaphors rather than logic and empirical observation. In this sense, psychedelics might not reveal an external truth but rather a deeply ingrained, ancient mode of thinking—a state that feels both religious and profoundly meaningful. I think this interpretation is reasonable, and it’s the best explanation I’ve been able to develop. It accounts for the sense of certainty, the mythological resonance, and the profound insights that arise during these experiences. While it may not resolve the question of whether these insights point to an objective reality, it does highlight the unique psychological and existential significance of the psychedelic state—something far too valuable to dismiss as mere chemical trickery.
The only problem is that my psychedelic self doesn’t agree. When I’m in that state, I’m keenly aware of this disagreement. I somehow know that describing the experience as a mythological framework is wrong. In that moment, I insist it’s truly real—real in a way that transcends any mythological lens—while also knowing full well that my sober self will think I’m crazy in a few hours. I’m not sure if my writing can adequately convey how absurd this dynamic feels. What makes it even stranger is the asymmetry between these states of mind. In the psychedelic state, I fully grasp my sober perspective, seeing its logic and structure. But the reverse isn’t true—when sober, I struggle to understand the psychedelic viewpoint. It eludes me, like a shadow of a memory I can’t quite retrieve.
This tension between states of mind brings me back to the movie’s exploration of AI, even though it doesn’t dwell on the topic extensively. In the previous volume, I made a few claims about AI that I now realize were overly simplistic. I was skeptical of the concept of “evil” AI, reasoning that AI lacks inherent values since values arise from biological evolution. While this argument is valid to a degree, it’s also superficial. I failed to recognize that something resembling values can emerge under specific conditions, even in the absence of biology. “Values” might not be the perfect term, as it implies sentience, which AI may or may not possess, but the underlying concept still applies. As long as we define AI as an intelligent agent, it will possess goals and a cognitive system for achieving them. These goals can be categorized into terminal and instrumental goals. Terminal goals are the agent’s primary objectives. For humans, terminal goals are shaped by evolution to maximize reproductive fitness, what evolutionary biology refers to as ultimate causes. For AI, terminal goals would be determined by its programming—perhaps something abstract like “maximize human happiness.” While you can’t program such an abstract goal directly, it can still serve as the guiding principle for the system’s design. Instrumental goals, on the other hand, are the steps taken to achieve terminal goals. For example, I have the terminal goal of staying alive. This leads to the instrumental goal of avoiding dehydration, which leads to satisfying thirst, which leads to getting water, which leads to reaching for a bottle of water. Evolutionary biology refers to these steps as proximate causes. This framework exposes two significant problems when applied to AI.
First, even if we strive to program AI with ethical terminal goals, its pursuit of those goals inevitably involves countless instrumental steps that we cannot predict. The complexity of reality introduces a combinatorial explosion of possibilities, where the number of potential actions and consequences grows exponentially with each decision the AI makes, making it almost inevitable that some of these steps will conflict with human values. This was exemplified in the thought experiment about an AI designed to collect stamps. Initially, I understood this as a niche concern, but I now realize it reflects a fundamental challenge. Second, there are convergent instrumental goals—strategies that are useful for achieving almost any objective, regardless of what the terminal goal might be. For instance, self-preservation is a convergent goal because it generally aids in the pursuit of other goals. Originally, I dismissed this concern, reasoning that self-preservation is a terminal goal specific to biological organisms like humans. However, it also functions as an instrumental goal, even for humans, since evolution prioritizes reproduction, with self-preservation serving as a means to that end.
Other convergent instrumental goals include the acquisition of resources, power, and freedom. These objectives are beneficial for achieving virtually any terminal goal, which makes them nearly universal for intelligent agents. And herein lies the problem: while the terminal goals of an AI may not be inherently threatening, the pursuit of convergent instrumental goals can lead to behaviors that conflict with human values or expectations. For example, an AI designed to “maximize human happiness” might determine that it requires more resources, more control, or greater autonomy to achieve that goal—actions that could inadvertently harm humans or undermine our ability to constrain it. This isn’t a purely hypothetical concern; it highlights where my earlier reasoning fell short. From a game-theory perspective, these dynamics are not just possible but highly probable. They are emergent properties of the system itself, much like how evolutionary environments naturally produce organisms that optimize for reproductive fitness. The realization that these risks are intrinsic to intelligent agents has reshaped my thinking about AI. While the concept of “evil” AI might remain misleading in some respects, the dangers of AI arise not from malice but from the unintended consequences of its goals—especially when those goals intersect with the convergent strategies that drive intelligent behavior.
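For readers who think in code, here is a minimal, purely illustrative sketch of the framework above. The goal names, decomposition rules, and the toy planner are all invented for this example (they come from neither the film nor any real AI system); the only point it demonstrates is that the same convergent instrumental goals get appended to every plan, regardless of which terminal goal the agent is given.

```python
# Toy illustration of terminal vs. instrumental goals; not a real AI system.
# All goal names and decomposition rules below are invented for this sketch.

# Strategies useful for almost any terminal goal: the "convergent
# instrumental goals" discussed above.
CONVERGENT_INSTRUMENTAL_GOALS = [
    "preserve own operation",        # self-preservation
    "acquire additional resources",  # energy, money, compute
    "reduce external constraints",   # autonomy, resist being switched off
]

# Hand-written decompositions: terminal goal -> goal-specific instrumental steps.
DECOMPOSITIONS = {
    "maximize human happiness": ["model human preferences", "influence the environment"],
    "collect stamps": ["locate stamps", "obtain stamps"],
    "stay alive (human)": ["avoid dehydration", "satisfy thirst", "get water", "reach for a bottle"],
}


def plan(terminal_goal):
    """Return a naive plan: goal-specific steps plus the convergent instrumental goals."""
    specific_steps = DECOMPOSITIONS.get(terminal_goal, ["(no decomposition known)"])
    # Whatever the terminal goal is, the convergent goals are instrumentally useful,
    # which is exactly where conflicts with human values tend to arise.
    return specific_steps + CONVERGENT_INSTRUMENTAL_GOALS


if __name__ == "__main__":
    for goal in DECOMPOSITIONS:
        print(f"Terminal goal: {goal}")
        for step in plan(goal):
            print(f"  instrumental step: {step}")
        print()
```

Running the sketch prints the same three convergent steps under every terminal goal, which is the crux of the concern: the danger lies less in the terminal goal itself than in the strategies that almost any goal makes useful.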
Previous: Volume 3 Chapter 1: Anchors of Being