Evolution of the Online, part I

The sky above the port was the color of television, tuned to a dead channel. (William Gibson, Neuromancer)

Welcome back to the newsletter, which I try to write monthly. Since our last epistle I almost died, so I missed one. Or two. But let’s start simple: I’ve been reading.

I’ve been reading about paranoia, psychology, conspiracy theories, and social media. So of course I’ve become convinced that there is a pattern to all this, a pattern created by a powerful group behind the scenes that has changed the way we think, using inhuman minds.


You’ve heard this one before: We are pattern-seeking creatures; we can find patterns in random noise. We can see faces on the moon, ascribe momentous importance to the movements of a leaf on the breeze, or create stories out of moving shapes.


Human beings enjoy this. There is a satisfaction in solving a puzzle, in finding the snake in a picture of leaves, in making connections and realizations. 


Just as we have a desire to find patterns, we have an innate drive to build identities, and to ally ourselves with a group that has the same identity. “I am an American. I am a Christian. I am from Jackson. I am a neo-reactionary Marxist-Leninist podcaster.” You can see it with sports teams or elections, both places where fans say “WE won.”


One natural outgrowth of this is that your group is your friend. And it is psychologically compelling to agree with your friends. Or even better, to have them agree with you. It’s rewarding. It lets you know you have chosen correctly: befriended those who have your back and believe what you believe, even if you don’t notice that sometimes you get the causation a little bit backwards.


Combine these two predilections and you arrive at a point: We can imagine our own groups and identities from a mass of noise. Are Pepsi-drinkers different from Dr. Pepper-drinkers? On the whole, of course not, other than some regional differences.


But you can make an identity, or part of an identity, around which one you like. Just ask a Texan. 


Facebook knows that you like making stories and observations and connecting with your friends. No one at Facebook hired psychologists and propagandists to tell them this, though. They didn’t even read my newsletter. They arrived at these conclusions in a different way, a way both familiarly organic and thoroughly unnatural. 


We talk about evolution here a lot. It’s important stuff. And the technology Facebook and YouTube (and others, but these two do it the best) used to learn about human psychology is called “machine learning.” 


Truncated explanation: Give a computer program a massive set of data and ask it to answer a question about that data using rules of probability. You could give the computer a collection of thousands of books and ask it “what word most commonly follows ‘dark’?” Or you could give it a massive collection of data about what people click, what they scroll past, what makes people give a thumbs up or hit that heart button, what they reply to, what makes them go to the next post or video.
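
To make that concrete, here is a minimal sketch of the word-counting version in Python, over a tiny corpus I made up for illustration. Real systems chew through billions of examples, but the bookkeeping is the same kind of thing.

```python
from collections import Counter

# Count which word most often follows "dark" in a tiny, made-up corpus.
corpus = (
    "it was a dark and stormy night "
    "it was a dark and quiet road "
    "she bit into the dark chocolate cookie"
)

words = corpus.split()
followers = Counter(
    words[i + 1] for i, word in enumerate(words[:-1]) if word == "dark"
)

print(followers.most_common(1))  # [('and', 2)]
```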


The computer solves the problem through something more akin to artificial selection than blind evolution: it changes a few variables, then tests them against the dataset to see which changes get closer to the desired end state and which ones need to be discarded. Then from those changes it makes more changes, until the wolf has become the pug.
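
Here is that selection loop in miniature, as a toy sketch of my own (the data and numbers are invented): nudge a variable at random, keep the nudge only if it scores better against the data, and repeat until the answer has been bred into shape.

```python
import random

# Toy data: hypothetical (input, observed) pairs that roughly follow y = 2x.
data = [(1, 2.1), (2, 3.9), (3, 6.2)]

def fitness(weight):
    # Higher is better: negative squared error of weight * x against y.
    return -sum((weight * x - y) ** 2 for x, y in data)

weight = 0.0
for generation in range(1000):
    mutant = weight + random.gauss(0, 0.1)  # change a variable a little
    if fitness(mutant) > fitness(weight):
        weight = mutant  # keep the changes that get closer to the target...
    # ...and discard the ones that don't

print(round(weight, 2))  # settles near 2.0: the wolf has become the pug
```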


However, the problem social media asked the computer to solve was not “what makes our users happy,” though you might think it would be. The problem the computer solved was “what keeps them looking at the app?” 


You don’t need a sophisticated machine learning algorithm to do this. The most popular social media users know they have a brand, a style; they know what will get those likes. They relentlessly hone their posts and videos and podcasts to be more like what their fans want. The posts that are liked continue to contribute to the poster’s style; in our evolution metaphor, their material is passed down to the next post. The posts that are boring, or posted at the wrong time of day, or are unengaging, are mistakes that are not repeated.


This has been going on since chat rooms. But then the machine comes into play, with a goal to increase the time people spend on the product. Corporate earnings projections and growth goals have to be met, and the machine wants us to see the content that will make us stay longer. It recommends posts that it thinks will do this.


Well-honed posts are liked more, shared more. The machine takes note. It says “this is what keeps people on the app.” It sees a popular post and spreads it. So more people see it, more people interact with it. Hate it or love it, the machine does not care. All it knows is that people will stay online longer after they come in contact with it.
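
As a toy sketch of that loop (titles and numbers invented, standing in for signals like dwell time): the ranker promotes whatever people stayed longest after seeing, and every promotion generates more of the very signal that got it promoted.

```python
# Average seconds people stayed on the app after seeing each post (made up).
posts = {
    "dog_rides_roomba": 40.0,
    "earnest_essay": 25.0,
    "giants_are_real": 55.0,
}

def top_post():
    # The ranker knows nothing about content, only about these numbers.
    return max(posts, key=posts.get)

for cycle in range(3):
    winner = top_post()
    posts[winner] *= 1.1  # more exposure -> more interaction -> higher score
    print(cycle, winner)

# The same post wins every cycle. Hate it or love it, the loop doesn't care.
```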


Much like in machine learning or evolution, the power comes from the repetition of the cycle. The poster is rewarded with the dopamine hit of pleasing their friends; the reader, with the lesser thrill of telling a friend they liked a post. The machine is rewarded with eyes on screens for just a few seconds longer, and before you know it a few million people have seen your dog ride a Roomba. Or your video about how giants are real and the government is covering them up.


These algorithms, these artificial intelligences, quickly stumbled upon time-honed propaganda techniques. That’s not because they were told to. They have no context, they do not know what we are posting about, not really. They merely know what fires us up: what makes us watch and hit buttons.


What drives us to watch the product, the timeline, the playlist? That social instinct. Talking to people. Agreeing with them. Saying things you know your friends will like. Or things you know your enemies will hate. Because to the brain, the enemies of the group are your enemies, and we all know that Conan was right about what is best in life. 


Crushing your enemies and seeing them driven before you? That’s engagement, baby! The algorithm doesn’t care if you’re posting support because you love something or sending death threats because you hate it. Either way, you’re still here, still watching, still there to see the ads that pay for the yachts and billion-dollar headquarters.


Who does the machine want you to be? It wants you to be one of the people who watch social media the most. 


The computer notices who spends a lot of time online. It wants to recommend that everyone watch the sort of things that the always-online watch. But not everything the always-online watch is compelling. Some of it is dense, full of in-jokes and references, of fake stories nestled in real ones, rife with shibboleths and memes.


Some of it is atrocious: violent, racist, paranoid. YouTube knows better than to drop you off at “Bob’s Race War Primer” after you watched “Metal Gear Boss Fight.”


How does it know this? It doesn’t know which video is extreme, it doesn’t understand the content, it only knows that if it plays Bob’s video after a “normal” video, you will probably close the window.


So let’s go back to that learning machine. Say I use it to tell me how to finish this sentence: “It was a dark…”


This is easier to solve than “It,” “It was,” or “It was a.” Our machine sees the beginning and knows that a lot of the time the next word is “chocolate.” So it tries that. It may not work. Eventually, the machine will stumble on “It was a dark and stormy night.” But only because “It was a dark and chocolate cookie” doesn’t get likes.
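
Here is that guessing game as a runnable sketch, with a made-up corpus tilted so that “dark chocolate” is the more common phrase. Raw frequency finishes the sentence with a cookie; a feedback weight, standing in for the likes, is what steers the machine to “and stormy night.”

```python
from collections import Counter, defaultdict

corpus = (
    "it was a dark chocolate cookie . "
    "it was a dark chocolate cookie . "
    "it was a dark and stormy night ."
).split()

# For each two-word context, count which word follows it.
table = defaultdict(Counter)
for a, b, c in zip(corpus, corpus[1:], corpus[2:]):
    table[(a, b)][c] += 1

def finish(start, feedback=lambda word: 1.0):
    words = list(start)
    while words[-1] != ".":
        followers = table[tuple(words[-2:])]
        words.append(max(followers, key=lambda w: followers[w] * feedback(w)))
    return " ".join(words)

print(finish(["it", "was", "a", "dark"]))
# -> it was a dark chocolate cookie .   (frequency alone picks "chocolate")

likes = {"and": 5.0}  # readers rewarded "and stormy night"; the cookie flopped
print(finish(["it", "was", "a", "dark"], feedback=lambda w: likes.get(w, 1.0)))
# -> it was a dark and stormy night .
```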


So it builds a bridge from where you are to the place that will keep you coming back: The extremely online shit.


So you go from a Metal Gear boss fight walkthrough, to a video review of Metal Gear, to a video about how game reviews are contemptuous, to a video about the awful practices of the video game industry.


So far so bad. But that’s not where the algorithm wants you. That’ll keep you posting some, sure. But it knows how to keep you really hooked, hooked on the hard stuff. From the video game walkthrough to the complaints, into some 40-minute diatribe about feminism ruining video games, to an hour-long video about feminism ruining EVERYTHING.
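
Here is that path as a toy sketch, with invented titles and watch times: the recommender never jumps straight to the extreme stuff, because that makes you close the window. It only offers videos adjacent to the one you’re watching, and among those it picks whichever holds attention longest.

```python
# Which videos are "safe" to recommend after which (they won't make you leave).
adjacent = {
    "boss_fight_walkthrough": ["metal_gear_review", "speedrun_compilation"],
    "metal_gear_review": ["game_reviews_are_contemptuous", "speedrun_compilation"],
    "speedrun_compilation": [],
    "game_reviews_are_contemptuous": ["games_industry_is_awful"],
    "games_industry_is_awful": ["feminism_ruined_games"],
    "feminism_ruined_games": ["feminism_ruined_everything"],
    "feminism_ruined_everything": [],
}

watch_minutes = {  # how long viewers stick around, on average (made up)
    "metal_gear_review": 8,
    "speedrun_compilation": 6,
    "game_reviews_are_contemptuous": 12,
    "games_industry_is_awful": 20,
    "feminism_ruined_games": 40,
    "feminism_ruined_everything": 60,
}

video = "boss_fight_walkthrough"
while adjacent[video]:
    # Of the videos you'll tolerate next, pick the one that holds you longest.
    video = max(adjacent[video], key=watch_minutes.get)
    print("up next:", video)
```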


Gross. You don’t want to watch that. Not yet, at least. You just came here to find out how to beat Psycho Mantis. So YouTube slowly walks you down a path to more and more outrageous things.


It doesn’t know why you go forward or why you don’t turn back. You don’t go back because you’re desensitized to the more mundane videos now. And you keep going forward because one of the most compelling reasons to do anything is a sense of moral outrage. If you’re outraged at the local town council, you might go to a meeting you’d never otherwise sit through, or leave some angry comments on the local Facebook group.


The computer knows: That would be that. 


But if you’re outraged that feminism ruined EVERYTHING, you can post about that forever. And that “controversial” too-online content is valuable to the algorithm. For the not-always-online set, dunking on the trolls and arguing in the comments is a potent engine for keeping people on the app. This boost in engagement from people piling on pushes the inflammatory content to the top of the recommendations, ensuring that more people see it, and more people wade into the comments.
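
One last sketch, with an invented scoring formula and numbers, of why the pile-on wins: the engagement score that decides what gets pushed to the top counts a furious comment exactly like a supportive one.

```python
def engagement(post):
    # A like, a share, and an angry comment all count toward the same score.
    return post["likes"] + 2 * post["shares"] + 3 * post["comments"]

posts = [
    {"title": "nice sunset photo", "likes": 200, "shares": 10, "comments": 15},
    {"title": "feminism ruined EVERYTHING", "likes": 80, "shares": 60, "comments": 400},
]

for post in sorted(posts, key=engagement, reverse=True):
    print(engagement(post), post["title"])

# The dunk-fest outranks the sunset, so more people see it, and more
# people wade into the comments, which raises its score again.
```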


This wretched engagement engine, which hoards your attention and will do anything for more of it, was created by a system that learned from watching humans without knowing what it was doing, and by humans who changed their behavior to fit the machine. Iterations. The power comes from the repetition of the cycle, amplifying itself.


Next time we’ll talk (ok, I’ll write, and you’ll read) about what happened when the machines began to understand what we were saying. Because eventually, they learned.

This information was used in a very profitable way. I'm sure there were no unforeseen consequences.