Like any good sci-fi, The Cytocorp Saga takes elements of our present and projects them, perhaps cynically, into the future.
Recently, I ran across two separate TED talks that speak to themes in my books. But they’re also related to a question that’s been bothering me lately:
Why did the bell curve invert?
Until recently, most people fell somewhere in the middle of an issue. Now, it seems we are crowded to the sides — what used to be the fringe. To paraphrase Yeats, the center did not hold.
I don’t think politics created this inversion, but I do think it’s exploiting it. The us-versus-them dichotomy was already here — we just didn’t know it.
How did this happen?
Reason 1: Content Became Like a Drug
The problem with most drugs is tolerance: the more you take, the more you need for the same effect. Content works this way, too.
Clicks and likes feed complex algorithms with a singular purpose: to keep you on the platform (Facebook, YouTube, etc.) for as long as possible by drawing you deeper, not unlike what writers try to do with their stories. The algorithm makes no judgment about the content itself.
Over time, the algorithms discover and exploit correlations. Maybe people who like Steven Seagal also like Axe Body Spray (I’m guessing). Or maybe people who like TED talks about dystopias prefer Target to Wal-Mart.
Much of this is a good thing. For example, the field of prevention science uses mathematical models of behavior to identify at-risk youth, for problems like bullying, before it's too late for interventions to work.
Algorithms also save us time by helping us zero in on the most relevant content. But what you click shapes what you're shown next, and when we're shown something that causes joy or outrage, we get a dopamine hit either way, and down the rathole we go. Eventually, you are exposed only to your "side" of an issue as part of a big confirmation-bias machine. Essentially, that's what the internet has become.
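That feedback loop can be sketched in a few lines of code. This is a deliberately toy model, not any real platform's algorithm: the two "sides," the click probabilities, and the taste-drift rate are all invented for illustration. A greedy recommender that only chases clicks, paired with a user whose taste drifts slightly toward whatever they click, ends up funneling nearly everything from one side.

```python
import random

random.seed(42)

# Toy model: two "sides" of an issue; the user starts with only a mild lean.
# The recommender maximizes expected clicks and makes no judgment about
# the content itself.
user_pref = {"side_a": 0.55, "side_b": 0.45}   # chance the user clicks
clicks = {"side_a": 1, "side_b": 1}            # click counts (smoothed)
shown = {"side_a": 1, "side_b": 1}             # impression counts (smoothed)

for _ in range(1000):
    # Greedily recommend whichever side has the better observed
    # click-through rate so far.
    item = max(shown, key=lambda s: clicks[s] / shown[s])
    shown[item] += 1
    if random.random() < user_pref[item]:
        clicks[item] += 1
        # Each click nudges the user's taste a little further toward
        # that side: the dopamine hit makes similar content more appealing.
        user_pref[item] = min(1.0, user_pref[item] + 0.001)

print(shown)  # impressions end up heavily concentrated on one side
```

Nothing in the loop "wants" to polarize anyone; concentration on one side falls out of pure engagement-maximization plus a user whose preferences respond to what they're fed.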
In The Cytocorp Saga, Cytocorp has applied this model to social engineering, using an AI to pair a person's aptitude, intelligence, and personality with their role in the community. The way they see it, humankind's choices over time led to society's collapse, so an AI will make better decisions for us than we can make for ourselves.
Anyway, the first talk is titled "We're building a dystopia just to make people click on ads," and it's from a technosociologist named Zeynep Tufekci. In it, she describes how recommendation algorithms are designed to draw us deeper into content and ways of thinking that push us further toward the poles.
Reason 2: We Aren’t Directing Our Attention
I'm a big believer in, if not always a practitioner of, mindfulness. In a nutshell, it means being the master of your own attention.
These days, our attention is pulled like taffy in all different directions. Each time we give it to a piece of content, an algorithm learns from us and becomes smarter.
This idea of behavioral data feeding an ever-evolving AI is very germane to The Cytocorp Saga, particularly books 2 and 3. If companies are already this good at pushing our buttons, imagine how good they’ll be in 160 years. Is it so hard to imagine an algorithm knowing you better than you know yourself?
The second talk is titled "How a handful of tech companies control billions of minds every day," by a design thinker named Tristan Harris, a Stanford-educated man who worked in something called the Persuasive Technology Lab. In this video, he describes how tech companies' deep knowledge of human psychology is applied to earn and keep our attention.
The Path Forward
I don't know where all this is headed, but my guess is that it will get a lot worse before it gets a little better. Algorithms will get smarter, and it will become harder and harder to tell whether we are really directing our attention or whether it's subtly being directed for us.
For now, I think it’s important for us to be aware of these influences. The more outraged we become about the “other,” the less likely we are to question what we read and see. We used to be crowded in the middle of the bell curve because we shared a lot of the same values. We still actually do, but as long as it feels like we’re in a “battle,” we’ll have no outlet for our feeble rage but to keep clicking.