Christian Huitema's blog

Cloudy sky, waves on the sea, the sun is
shining

He said we run ads, but he didn’t say that’s why we have fake news

27 Feb 2019

"Senator, we run ads." That was Mark Zuckerberg's response to Orrin Hatch, senator from Utah, who had asked how Facebook could possibly sustain a business without charging its customers. The senator was then mocked as an old guy who obviously did not have a clue about the Internet economy. Well, not so fast. Orrin Hatch is definitely not young, but you should not laugh at his question. Zuckerberg's response actually points at the root cause of the "fake news" problem. You get fake news because the algorithms that manage your news feed realize that you like it, and that brings more ad revenues.

Back in the late 90's, the budding Internet economy had a problem. Nobody was paying for content. People were not unwilling to pay in principle, and in fact at the time they routinely paid to get a copy of their daily newspaper. But that did not work on the Internet: if you interrupted readers and asked them for a payment, they would typically close the page and go to another site. This is well explained in a paper by Andrew Odlyzko, "The case against micropayments". Web sites and the Silicon Valley venture capitalists behind them quickly realized that to make money, they had to rely on advertisements.

I did not find advertising particularly shocking back then. Lots of newspapers were supported by a mix of subscriptions, sales and advertisements. Having an ad banner on a web page felt very much like having a printed ad in a newspaper. But that was before Google. In the old print world, advertisers would try to place their ads near content that attracted the desired type of readers. Fishing gear would be advertised in a fishing magazine, and fancy dresses in a fashion magazine. The Internet started like that, and even Google started the same way. They would display ads for fancy dresses if you searched for fashion, and ads for fishing gear if you searched for fishing stories. The ad was still tied to the content. In retrospect, these were the good old days.

Soon after that, Google realized that they could make much more money if, instead of targeting the content, they targeted the user. If they knew you were spending your weekends fishing, it did not matter whether you were currently researching references for next term's paper. They could still feed you ads about fishing gear, and there was a pretty good chance that you would click on them. Google then theorized that the more they knew about you, the more "relevant" the ads they could feed you. They built an awesome infrastructure to learn as much as they could about you, measuring your web activity, your cell phone activity, and even installing gadgets to monitor your home. I imagine that the next step will be some kind of body implant. That is bringing them a lot of money.

Remember the line of the Sean Parker character in the "Social Network" movie, telling the young Zuckerberg that he should aim bigger than "a chain of very successful yogurt shops"? That was all about ads and revenues and competing with Google. It turns out that this game of "relevant advertisements" can only be played if you are very big, if you have lots of users, and if you can observe these users frequently enough to learn a lot about them. Facebook embraced that vision and, as they say in business talks, executed rather well. After the company went public, the investors and Zuckerberg got very rich. But lofty stock prices can only be sustained if the company keeps growing.

When you have captured the vast number of users that Facebook has, further growth cannot be accomplished merely by getting more users. At some point, that number will reach a plateau. To keep growing revenues, you need users to spend more time on Facebook, what the business folks call "increased engagement". One way they do that is by manipulating the content that they present, so you will be ever more likely to read the next page, and the next, and in the process get more and more occasions to see advertisements, click on them, and bring revenues to Facebook. Google, YouTube and Twitter follow the same model. You could say that they are just feeding users the content they like, much like a successful novelist writes books that his readership likes, but this is very different.

In the big platforms like Facebook, the choice of "what to show next" is driven by automated algorithms that constantly learn about you, but also about people like you. They don't just show you the next content that you will like; they aim further, like a chess master planning several moves ahead. Out of the many pieces of content that they know you will like to read, they know which ones are more likely to make you read the next one after that, and so on. These algorithms appear to love propaganda, because propaganda plays with your mind. The more propaganda you read, the more convinced you become, and the more likely you are to read the next batch of similar stories. If you are fed a constant diet of such propaganda, you will become a true believer. For the Facebook algorithm, it does not matter whether the propaganda is about racism, or vaccines, or veganism. It also does not matter if the news is true or fake, as long as it keeps you engaged and clicking.

At their best, services like Facebook unite people and let them keep in touch with distant friends and family. But that alone would not bring enough revenues. Advertising needs data about you, which is what has come to be known as Surveillance Capitalism. Advertisement revenues only grow if you look at the platform often. Revenue grows best if you become addicted. News from grandma will not suffice for that, but propaganda and fake news will. Like all addictions, it will sicken your mind, but Facebook's algorithms do not care. The senator's question about business models uncovers the root of the problem: yes, Facebook runs ads. But depending on ads leads to surveillance, addiction and fake news. And of course Mark Zuckerberg would rather not say that.