The Dopamine Era: How the Attention Economy Actually Works On You
Part 2 — The Machine
The algorithms aren’t trying to inform you. They’re not even trying to entertain you. They’re trying to activate you.
That difference is why you can spend two hours on your phone and come away feeling like you should have accomplished something when you accomplished nothing.
In Part 1, I told you about the girl who spent her childhood in the library, stacking books so high she could only carry them with her chin and the full length of her arms. That girl grew up to major in English Literature in college, which is basically a degree in reading, writing, and thinking.
It wasn’t uncommon to have 500 to 1,000 pages of reading each week, plus papers due, plus classroom discussions where you had to demonstrate that you’d actually done the reading and thought about it.
I did it because I loved it. I didn’t stop to consider what that degree might lead to. I followed my interest. And I went deep.
My favorite place to be is deep inside my own head, analyzing the words of others and their implications. Our tests were largely essay tests. Maybe you remember those blue books with the blank wide-ruled notebook paper stapled inside.
At the beginning of the exam, you got a blue book and about two hours to answer some sort of question. Maybe it was a compare and contrast. Maybe it was finding a rationale for a worldview. You never knew until you got there.
While a lot of people preferred multiple choice, I crushed essay tests. I loved them. Finally there was a space where I could play, where I could take random concepts and shape them into meaning.
If we had two hours for the exam, I would spend almost half that time using scratch paper to organize my thoughts. What emerged was a detailed outline, my ideas, and the possible supports for each.
While my classmates were frantically writing in the little blue books, I was still thinking and planning. It never occurred to me to be worried that I was the only one not moving my pencil.
But once I had my thoughts organized, the essay wrote itself. It was just execution.
When I look back over my life and multiple careers, I see this dynamic very much in play. I’ve always been able to think critically about a topic, find support or find something to dispute, and present my thoughts with clarity and reason.
That capacity didn’t disappear because I got older. It was engineered away, starting about 15 years ago.
The business model you’re funding with your eyeballs.
The platforms that dominate your attention don’t profit from your growth. They profit from your time. The longer you stay, the more ads they serve. The more emotionally activated you are, the longer you stay.
This isn’t a bug. It’s the entire business model.
Shoshana Zuboff, a Harvard professor who spent years studying this, calls it “surveillance capitalism.” Your behavior — what you click, how long you linger, what makes you react — is harvested as raw material. It’s refined into predictions about what will keep you engaged. Then it’s sold to advertisers who want access to your attention.
You are not the customer. You are the product.
Tristan Harris used to be a design ethicist at Google. He’s now one of the loudest voices warning us about what these systems do. And he’s clear that this was intentional.
Variable reward schedules, the same psychological mechanism that makes slot machines addictive, were deliberately built into these platforms.
If you got a reward every single time you performed an action, you’d eventually get bored. The predictability would wear off.
But if the reward comes sometimes, and you never know when, you keep pulling the lever.
That uncertainty is the hook.
This is what they built to hijack the dopamine lever I talked about in Part 1.
Slot machines don’t pay out on a predictable schedule, and that’s precisely why people sit in front of them for hours.
Your social media feed works the same way.
Sometimes you scroll and find something genuinely interesting or funny or useful. Sometimes you don’t.
But you never know which scroll will deliver, so you keep going. Add in autoplay and red notification badges and you have a real dopamine winner.
The possibility of reward becomes more compelling than the reward itself. And unlike a slot machine, you don’t have to drive to a casino. It’s right there in your pocket.

None of this was accidental.
These systems were designed by people who understood behavioral psychology and used it on purpose. Then they added the data we freely gave them in the form of likes, shares, comments, and dwell time, which gave them a clear picture of how to tune the reward machine.
Your nervous system running on cortisol isn’t a side effect. It’s the product working as designed.
In 2014, Facebook published a study that should have sparked outrage. I know it did for me when I heard about it in the car on my way to work. I remember my blood boiling when I was making a left turn.
For one week, they manipulated the news feeds of nearly 700,000 Facebook users without telling them.
Some users were shown more positive content. Others were shown more negative content.
The result, shockingly: people who saw more negative posts wrote more negative posts themselves. People who saw more positive content wrote more positively.
Emotional states transferred through the platform at scale. And Facebook proved they knew it.
The backlash wasn’t really about the finding. It was that they ran a psychological experiment on hundreds of thousands of people who never consented to being test subjects. That’s research ethics 101.
They understood exactly what they were doing to people’s emotional states. And they kept doing it anyway. And that was before the Cambridge Analytica scandal that revealed how they were selling the information we were giving them.
Their intent was clear early on.
The algorithm doesn’t care about truth.
The algorithm’s job isn’t to evaluate whether something is true. It evaluates whether you’ll react in a measurable way: a like, a comment, a share, or how long you linger.
It takes that reaction, predicts what other content might provoke the same response, and serves it up.
Accuracy is completely irrelevant to the business model.
This is what trips up serious online content creators. The facts don’t stand alone. Unless you create a reaction that makes someone look in the first place, sharing the truth isn’t enough.
The car wreck is irrelevant. It’s the looky-loos that matter.
Neil Postman warned about something adjacent to this 40 years ago in his book Amusing Ourselves to Death. He said the danger wasn’t that we’d be lied to. The danger was that the truth would be “drowned in a sea of irrelevance.” We wouldn’t be forbidden from knowing what matters.
We just wouldn’t be able to find it in the noise. That sound familiar?
“Now… this.”
Postman identified a phenomenon he called “Now… this.” It’s the moment when a news anchor reports something devastating like a war, a disaster, or a tragedy, and then cheerfully pivots: “Now… this,” segueing into a celebrity story or a commercial for beer.
That juxtaposition renders everything equally weightless, so nothing is allowed to land. It’s hard to know what to take seriously in this dynamic.
That sequencing has become the grammar of your entire feed. Scroll through it right now if you want proof.
- War footage.
- Then a meme.
- Then a recipe.
- Then outrage about something a politician said.
- Then an ad for skincare.
Each thing cancels the one before it.
You feel like you’re consuming information, but nothing lodges. Nothing demands a response you’re actually capable of giving.
Maybe you’ve said what I’ve said: I just can’t care about everything.
Postman wrote that the news gives us “information that gives us something to talk about but cannot lead to any meaningful action.” Opinions about which we can do nothing except, ironically, offer them as more news.
In this current climate, we’ve all become broadcasters of news, but rarely does it inform or move someone to true action. It just puts more “news” into the atmosphere for others to have to deal with.
Being on social media now feels like being in Times Square. You’ve got billboards with flashy ads, bright lights to make sure you keep walking quickly down the street, and people walking across your path to hand you a flyer you don’t need.
I came to New York to go to the museums, not to your comedy club, thank you.
But the bombardment isn’t just exhausting. It’s creating a sense of responsibility for things that may not even be yours.
Trained to care about things that don’t belong to you.
Here’s the part that might be hardest to see, because it’s so normalized.
The attention economy doesn’t just steal your time. It manufactures problems you didn’t have, then sells you solutions. And in the process, it trains you to feel anxious about things that aren’t yours to carry.
One of the things I hate about Instagram is how it markets fear of aging to me. I’m 55 years old. Every other ad is about the travails of menopause or how to stay young-looking. At my age, I suppose I’m a fair target demographic.
But recently I saw an ad for a product that was basically a little stick with a cushion at the end, filled with tiny needles. A home version of microneedling.
The ad told me my moisturizer isn’t doing any good. In order for it to work, I need to poke holes in my skin so the product goes deeper. Wat??
Why would I stick little needles in my face at home? What if — even at 55 — I’m actually fine with how my face looks?
But here’s what really got me. As ridiculous as that ad was, as immediately as I snarled when I saw it, a small part of me wondered: well, am I doing enough for my skin? Me, with a nine-step morning and evening skincare routine.
That’s the machine working.
It preys on fear of aging, something we’ll all do if we’re lucky. And it makes me feel like I’m behind. Lost in all of it is any actual conversation about what getting older means. There’s no discourse about living a life of legacy. Just what you look like while you’re doing it.
The Royal Society for Public Health in the UK found that Instagram consistently ranks as the worst social media platform for mental health, particularly among young women (sounds like older ones, too). The mechanism is comparison with idealized images that trigger feelings of inadequacy.
On top of feeling insecure about yourself, you can now have strong opinions about crises happening across the world that you can’t affect. You feel urgency about problems you will never solve.
You’ve been trained to care about whether you’re aging correctly, or whether your life measures up to curated highlights you were never supposed to see.
Add to that wondering if the world has gone irreparably mad with crime, corruption, and war.
The feeling of anxiety isn’t a sign that something is wrong with you. It’s a sign the system is working exactly as designed.
There’s a spiritual cost we’ll explore more in Part 3. All of this isn’t just bad for your mental health. It’s a distortion of how you were meant to live as someone created in the image of God.
We’ve largely abandoned the idea that the spiritual part of us even needs tending, that it requires the same kind of deliberate attention we’d give to our physical health or our careers.
And when we neglect that part of ourselves, we’re left with only our own cognitive resources to make sense of our lives and fight our way forward. That’s an exhausting way to live. It’s also not what we were built for.
You were never designed to carry the weight of the world’s problems while also being told your face isn’t good enough.

The overwhelm is the point.
The other day I grabbed my phone to check for an email I was waiting for. I found myself an hour later looking at four completely different things. It’s hard not to shame yourself for that.
I often ask myself how much farther along I’d be in creating my body of work if I didn’t have this element in my life. If I wasn’t constantly feeling the pull to look at the next thing.
Learning felt like a natural extension of my interest back in those blue-book days. Now it feels forced.
The internet went from being an exciting place to find research to an onslaught of information that may or may not be relevant.
We used to perform analog library work with a pocket full of change for photocopies.
The internet made information come to us more easily. But our filtering mechanism is now overwhelmed.
We didn’t get training on how to use this flood of information. We just got bombarded. It’s harder now to make decisions about what’s relevant because everything is presented with the same urgency at the same time.
Gloria Mark, a professor at UC Irvine who has studied attention for decades, found that the average time people spend on a single screen before switching dropped from two and a half minutes in 2004 to 47 seconds today.
And it takes an average of 25 minutes to return to your original task after a single interruption.
Your brain adapts to what you train it for. Every context-switch strengthens the neural pathways for more context-switching. The “too long, didn’t read” response is a trained incapacity.
But neuroplasticity works both ways. If your brain can be trained for fragmentation, it can be retrained for depth. That’s where we’re headed in this series.
But first, we need to understand what we’re up against.
Huxley’s warning.
Postman spent much of Amusing Ourselves to Death contrasting two visions of dystopia. George Orwell’s 1984 imagined a government that would ban truth, burn books, and control information through force.
Aldous Huxley’s Brave New World imagined something different. A society so saturated with amusement that no one would want truth anymore. The truth wouldn’t be forbidden. It would just be boring compared to everything else.
“In the Huxleyan prophecy,” Postman wrote, “Big Brother does not watch us, by his choice. We watch him, by ours.”
And here’s a haunting thought: “What afflicted the people in Brave New World was not that they were laughing instead of thinking, but that they did not know what they were laughing about and why they had stopped thinking.”
Here’s the grace note.
Lest you think you’re a lazy, undisciplined soul who will never get your act together, I’ll offer you this.
You didn’t design this system. You were subjected to it.
The algorithms were built by teams of engineers using behavioral psychology more sophisticated than anything the rest of us had access to. They were tested and refined on billions of people before they ever reached you.
You’re not weak or lazy for getting caught in it. You’re human.
You were just trying to learn more about the world and connect with others. That’s a good thing.
But now I hope you see some of the origins of your frustrations. I see it, too, and I feel it. I have to eat my own dog food in this same space.
And seeing it is the first step toward doing something different.
What I want you to do this week.
Spend one day tracking every time you pick up your phone and what triggered it. You’re not looking to shame yourself. But it’s important to see the pattern and your triggers.
Screen time alone isn’t the best indicator because it measures only one thing: total duration. You need more data points than that.
When you realize you’ve been in a rabbit hole, write down:
- The time
- What triggered the pickup (boredom, a notification, anxiety, habit, avoiding something?)
- What you actually did once you were on the platform
- How long you stayed
- How you felt after
The goal isn’t behavior change yet. The goal is awareness. You can’t fight what you can’t see. Get a good picture of your actual behaviors.
Now that you see the machine, the harder question comes next.
What has it already taken from you?
- The book you haven’t written.
- The business you haven’t started.
- The prayer life that used to sustain you.
- The workouts that could change your health.
Part 3 isn’t about the machine anymore. It’s about the cost.
And it may be higher than you think.
Read Part 1 – The Dopamine Era: How Your Brain, Body, and Spirit Got Hijacked
Sources
Mark, G. (2023). Attention Span: A Groundbreaking Way to Restore Balance, Happiness and Productivity. Hanover Square Press. Dr. Mark’s research at UC Irvine documents the decline in sustained attention and the cognitive costs of constant interruption.
Twenge, J. M. (2017). iGen: Why Today’s Super-Connected Kids Are Growing Up Less Rebellious, More Tolerant, Less Happy — and Completely Unprepared for Adulthood. Atria Books. Twenge’s research correlates the rise of smartphone ownership with increases in teen anxiety and depression beginning around 2012.
Zuboff, S. (2019). The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. PublicAffairs. Zuboff’s landmark work explains how tech companies harvest human behavior as raw material for prediction products sold to advertisers.
Harris, T. Center for Humane Technology. Former Google design ethicist whose work exposing manipulative design practices in social media was featured in the documentary The Social Dilemma (2020).
Royal Society for Public Health (2017). “#StatusOfMind: Social media and young people’s mental health and wellbeing.” This UK study surveyed 1,500 young people ages 14-24 and ranked Instagram as the worst social media platform for mental health impact.
Postman, N. (1985). Amusing Ourselves to Death: Public Discourse in the Age of Show Business. Penguin Books. Everyone should read this. Postman gives a prophetic analysis of how television — and by extension, all visual media — reshapes public discourse toward entertainment.
Kramer, A. D. I., Guillory, J. E., & Hancock, J. T. (2014). “Experimental evidence of massive-scale emotional contagion through social networks.” Proceedings of the National Academy of Sciences, 111(24), 8788-8793.