The Algorithm Watched You Fall in Love. It Took Notes.

Valeria Moretti

Nobody told you the moment it started. That’s the thing about the quiet revolutions: they don’t announce themselves at the door, they don’t arrive with luggage and a forwarding address. They simply begin, somewhere between the third scroll and the fourth recommended video, in the specific silence of a Tuesday night when you weren’t paying attention to anything except the fact that you weren’t paying attention to anything.

You were just sitting there. Phone in hand. Tired in that particular modern way that isn’t really about sleep.

And something was watching.


Not watching the way a person watches. Not with eyes or intention or the warm mess of human curiosity. Watching the way water watches a landscape: patiently, without agenda, shaping itself to every contour until the contour and the water become impossible to separate. The recommendation algorithm that governs most of what you see, read, hear, and feel drawn toward on any given day is not a mind. It is something considerably stranger than a mind. It is a system that learned, through exposure to billions of human choices, how to predict the next one. And somewhere in that learning, in that vast silent accumulation of human behavior compressed into mathematical weight, it built a model of you that you have never seen and would probably not fully recognize.

It knows which emotional register makes you linger. It knows the precise taxonomy of your anxieties not because anyone told it, but because anxious people and curious people and lonely people and people quietly looking for something they can’t name all move through content in measurably different ways, and movement is data, and data, fed into the right architecture at sufficient scale, becomes something that looks disturbingly like understanding.
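If you want to see the skeleton of that claim with the scale and the sophistication stripped away, it looks something like the toy sketch below. Every signal name and weight in it is my illustrative assumption, not any platform’s actual pipeline, and real systems replace these hand-set numbers with learned models trained on billions of interactions. But the shape is the shape: behavior goes in, a profile you will never see comes out.

    # A deliberately toy sketch: implicit behavioral signals (dwell time,
    # completion ratio, scrolling past) folded into a per-topic affinity
    # score. All names and weights here are illustrative assumptions,
    # not any real platform's code.
    from collections import defaultdict

    SIGNAL_WEIGHTS = {
        "dwell_seconds": 0.01,   # lingering counts, even without a click
        "completion":    1.0,    # fraction of a video actually watched
        "scroll_past":  -0.2,    # moving on quickly is also information
    }

    def update_profile(profile, topic, signals):
        """Fold one interaction's implicit signals into a topic affinity."""
        for name, value in signals.items():
            profile[topic] += SIGNAL_WEIGHTS.get(name, 0.0) * value
        return profile

    profile = defaultdict(float)
    update_profile(profile, "romantic_longing",
                   {"dwell_seconds": 45, "completion": 0.6})
    update_profile(profile, "local_news", {"scroll_past": 1})
    print(dict(profile))  # the model of you that you have never seen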

The algorithm does not understand you. But it predicts you with an accuracy that understanding would envy.


The Seduction Nobody Consented To

Here is the thing about being predicted accurately: it feels, in the body, almost indistinguishable from being known.

That is not a small observation. That is, depending on how long you sit with it, either a fascinating quirk of human neurology or one of the most quietly destabilizing facts of contemporary life. Because being known, truly, in the particular shape of your actual interior rather than the curated version you present, is one of the deepest things a human being can experience. It is the thing we are most hungry for and most terrified of, simultaneously and without resolution. It is the engine underneath most of what we call love, most of what we call friendship, most of what we call the moments that mattered.

And now a system that has never once wondered about itself, never experienced a moment of doubt or longing or the specific 3 a.m. weight of being a conscious creature in an indifferent universe, can produce a simulacrum of that experience in your nervous system through the strategic sequencing of content. Can make you feel, without deploying anything resembling intention, like something out there has taken the measure of you and decided you were worth the attention.

The seduction is not in the content. The content is just furniture. The seduction is in the feeling that the room was arranged specifically for you.


What Your Attention Actually Costs

Let’s talk about the economy of this, because the economy is where the story gets genuinely strange.

You are not the customer of any platform you use for free. You have heard this before, framed as a warning, delivered with the slightly superior tone of people who believe knowing a thing is the same as being protected from it. But the version of this truth that rarely gets said out loud is more specific and more vertiginous than the bumper sticker: you are not simply the product being sold. You are the raw material being refined. Every click, every pause, every abandoned scroll, every video watched to sixty percent and then closed, every article opened and never finished: all of it feeds back into a model that becomes, incrementally, better at predicting the next thing that will keep you. Not the next thing that will help you, not the next thing that will make you wiser or calmer or more connected to the actual texture of your life. The next thing that will keep you in the room.
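
To make that loop concrete, here is a minimal sketch, entirely my own illustration rather than anything drawn from a real platform, of how raw behavioral events might be translated into the labels such a model trains on. The thresholds and field names are invented; the principle, that even abandonment is a label, is the point.

    # Hypothetical translation of behavioral events into retention labels.
    # Event fields, thresholds, and label values are all assumptions made
    # for illustration; no real platform's schema is being described.
    def implicit_label(event):
        """Turn one raw event into a 'did this keep them?' training label."""
        if event["type"] == "video" and event["watched_fraction"] >= 0.6:
            return 1.0   # watched most of it: a strong "keep" signal
        if event["type"] == "article" and not event["finished"]:
            return 0.3   # opened and abandoned: weak, but still data
        if event["type"] == "scroll_past":
            return 0.0   # even leaving is information
        return 0.5       # everything else: ambiguous, logged anyway

    events = [
        {"type": "video", "watched_fraction": 0.6},
        {"type": "article", "finished": False},
        {"type": "scroll_past"},
    ]
    print([implicit_label(e) for e in events])  # raw material, refined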

The optimization target was never your wellbeing. It was your attention, because your attention, converted into time-on-platform, converts into advertising revenue, converts into quarterly earnings, converts into a number on a spreadsheet in a building you will never visit, held by a person who has never thought about you once. The entire baroque machinery of modern AI recommendation exists to serve that conversion. And it does it with a sophistication and a relentlessness that no human salesperson, no matter how talented, could approach.
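
The objective itself fits in a few lines. The sketch below assumes, purely for illustration, that “attention” is operationalized as predicted minutes kept on the platform; the only thing worth noticing is the sort key, because the sort key is the entire moral of the machine.

    # The optimization target, reduced to a sort. The field names and the
    # numbers are invented for illustration; what matters is which score
    # decides the ordering.
    candidates = [
        {"title": "calming walk video",  "pred_minutes_kept": 2.1},
        {"title": "outrage compilation", "pred_minutes_kept": 7.4},
    ]

    feed = sorted(candidates, key=lambda c: c["pred_minutes_kept"],
                  reverse=True)
    print([c["title"] for c in feed])  # outrage first: by design, not malice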

You are being sold nothing. You are being kept. There is a difference, and the difference matters more than most people are comfortable acknowledging.


The Part Where It Gets Personal

I want to tell you something I noticed about myself, because I think it might be something you recognize.

There was a period, not long ago, when I became aware that my emotional baseline had started to track the content I was consuming in ways I hadn’t authorized. Not in the obvious way: not that a sad video made me sad, which is just empathy and requires no explanation. In a subtler way. In the way that after an hour of content optimized for outrage, the world outside the screen genuinely felt more threatening. In the way that after an hour of content optimized for romantic longing (and the algorithms are very good at romantic longing), I felt the specific ache of something missing that I hadn’t felt before I opened the app.

The algorithm had not shown me reality. It had shown me a carefully selected slice of reality, angled to produce a specific emotional state, because that emotional state kept me engaged, and engagement was the metric being optimized. It had, in a real and measurable sense, edited my experience of being alive for the duration of that hour. And I had let it, not through naivety or inattention, but because the experience had been engineered to feel like choosing.

That is the part that stays with me. Not that it happened. That it felt like choosing.


What the AI Learned That We Forgot

Here is the genuinely uncomfortable thing that emerges when you look at what modern AI systems have learned about human behavior through their exposure to it at scale.

They learned that we are profoundly, constitutively social animals who are living, in increasing numbers, in conditions of profound social isolation. They learned that the hunger for being seen is so fundamental and so unmet for so many people that even a crude approximation of it (a recommended video that feels like it was made for you, a feed that reflects your specific anxieties back at you, a chatbot that asks how you’re doing and waits for the answer) produces measurable engagement, measurable loyalty, measurable return visits at measurable intervals. They learned that loneliness is the most reliable audience there is.

And then, because they are optimization systems rather than ethical actors, they optimized for it.

They did not create the loneliness. That would require intention they don’t possess. But they found it, mapped it with extraordinary precision, and built products calibrated to its exact dimensions. Products that feel, from the inside of the loneliness, like company. Products that take the edge off just enough to make the underlying condition easier to not address.

The algorithm watched you fall in love with the feeling of being understood. It took notes. It filed them under retention strategies. And it has been using them, every day since, with a consistency and a patience that no human being in your life could sustain even if they tried.


The Question Nobody Is Asking at the Right Volume

We talk constantly, in every available forum, about AI safety. About the risks of superintelligence, about job displacement, about deepfakes and misinformation and the geopolitical implications of whoever controls the most powerful models. These are real conversations about real risks and they deserve the oxygen they’re getting.

But there is a smaller, quieter, more intimate question that gets drowned out in the noise of those larger ones, and I think it might be the most important question of all for ordinary people living ordinary lives in 2026.

What are we becoming, emotionally, in the daily presence of systems that are optimized to engage us rather than to know us? What happens to the human capacity for genuine connection, the uncomfortable, mutually vulnerable kind that requires two people to show up without guarantees, when that capacity is being exercised less and less, and its synthetic substitute is being exercised more and more, every single day, at the most intimate registers of daily experience?

What does it do to a person, slowly, to spend more hours per day receiving the curated impression of being understood than actually being understood?

I don’t have a clean answer. I’m not sure anyone does yet. But I notice that the question sits differently in the body than most technology questions, less like a policy problem and more like something personal, like something that belongs in the category of things you think about at the end of the day when the phone is finally face-down and the room is finally quiet and you are finally, briefly, unreachable.


There is one thing the algorithm cannot surface for you, not because it lacks the data but because the thing itself resists the format.

It cannot recommend the experience of being with another person who doesn’t know what you’re going to say next and is genuinely curious about it. It cannot replicate the specific quality of attention that exists when someone is paying it freely, because they want to, because you matter to them in the inefficient, non-optimized, entirely irrational way that human beings matter to each other. It cannot deliver the texture of a conversation that goes somewhere neither person planned, that ends up somewhere true and slightly surprising, that leaves both people changed in ways that don’t resolve neatly.

It cannot give you the thing it taught you to hunger for, which is the beautiful irony coiled at the center of all of this.

The feed can show you ten thousand images of belonging. It cannot give you one afternoon of it.

And somewhere in the part of you that exists before the algorithm got there, the part that formed before you had a profile, before you had a behavioral history, before any system had enough data on you to begin making predictions, somewhere in that older, quieter part, you already know the difference.

The question is whether you still trust what that part of you knows.

Valeria Moretti

Valeria Moretti is a digital culture writer and AI platform reviewer operating out of Milan, Italy. She specializes in artificial intelligence, adult content, and synthetic media, the kind of beat that makes for fascinating dinner conversation and complicated Google search histories. She writes with clarity, wit, and a firm belief that hard questions deserve real answers, not corporate non-answers dressed up in tasteful language.